Over at QandO, Dale Franks says: "The bottom line is that, if you are a foreigner, and if you intend to reside in this country, then those of us who are already here have a perfect right to boot you out the moment you displease us."

That doesn't seem quite right.

Franks was discussing a bunch of proposed British laws that would authorize the deportation of any "non-citizens" found guilty of "fostering hatred" or "glorifying terrorism." The law wouldn't apply to British citizens who, of course, have the freedom of speech and association. So is this distinction between citizen and non-citizen tenable, especially here in the United States? Franks says yes; I would say no. For starters, at least here in the U.S., we're treading on iffy constitutional grounds: Basic rights such as due process, equal protection, and the freedoms of speech and association should, in theory, apply to all "persons" within the United States, not just citizens. I'd go further: the fact that non-citizens cannot vote often makes it more essential that they be afforded protections under the law. This was James Madison's murmur-provoking view—"Aliens are not more parties to the laws, than they are parties to the Constitution"—and most federal courts have by and large agreed with Madison, although the current Supreme Court has obviously wavered.

But set aside the constitutional issues, or let the lawyers duke it out. Pragmatically, does it make sense to restrict the rights of non-citizens? David Cole, who would no doubt be vigorously shaking his head right now, has warned: "Our own historical experience with restricting fundamental rights on the basis of citizenship"—an experience that includes Dred Scott—"should give us pause." He quotes Yale law professor Alexander Bickel, who makes a philosophical point: "A relationship between government and the governed that turns on citizenship can always be dissolved or denied [since] citizenship is a legal construct, an abstraction, a theory." I don't think we're in immediate danger of anything dissolving, mind you, but it's worth thinking about. We make our government a government "of the citizens, by the citizens, and for the citizens" at our peril.

A more practical concern is that alienating non-citizens—by, say, restricting their free speech rights, on pain of deportation—could, in theory, make those communities less likely to cooperate with law enforcement in rooting out terrorist cells. Especially when, as was the case after 9/11, most of the non-citizens rounded up and detained were never even charged with a crime and were, for all we know, not guilty of inciting anything. Another worry, and this ought to concern even those who don't think "foreigners" deserve rights, is that limiting the rights of non-citizens almost always sets a disturbing precedent for limiting the rights of citizens. Exhibit A: Yaser Hamdi; Exhibit B: Jose Padilla. Two U.S. citizens held indefinitely, under "wartime powers" that, as the president assured us in the heady Arab-hunting days after 9/11, were intended to apply only to "foreigners." Then we have the historical precedents; again, citing David Cole:

The line between alien and citizen has often been crossed before. In fact, two of the most shameful episodes of our nation's history also had their provenance in measures initially targeted at non-citizens. The McCarthy era of the 1940s and 1950s, in which thousands of Americans were tarred with guilt by association, was simply an extension to citizens of a similar campaign using similar techniques against alien radicals in the first Red Scare thirty years earlier. The earlier Red Scare, which culminated in the arrests of thousands of aliens for their political associations during the Palmer Raids, was coordinated by a young J. Edgar Hoover, then in the Alien Radical division of the Justice Department. Hoover applied what he had learned in the first Red Scare to U.S. citizens during the second Red Scare, which targeted thousands of them.

The same pattern underlies the internment of U.S. citizens of Japanese descent during World War II. Since 1798, the Enemy Aliens Act has authorized the president during wartime to arrest, detain, deport, or otherwise restrict the freedom of anyone over fourteen years old who is a citizen of the country with which we are at war, without regard to any actual evidence of disloyalty, sabotage, or danger. The justification for that law, which the Supreme Court has upheld as constitutional, is that during wartime one can presume that citizens of the enemy country are loyal to their own country, not ours, and that there is insufficient time to identify those who are actually disloyal.

In World War II we simply extended that argument to U.S. citizens through the prism of race. The Army argued that persons of Japanese descent, even if they were technically American citizens because they were born here, remained for all practical purposes "enemy aliens," presumptively likely to be loyal to Japan.

Now granted, some might think that locking up communists is just peachy, but it ought to give civil libertarians a moment's pause. A government that can restrict the rights of non-citizens has all the tools it needs to do the same to citizens. Honestly, I won't go so far as to say that no liberties can ever be restricted in the name of security, but I will say that the sort of double standard Franks proposes, resting as it does on a distinction between the basic rights of citizens and those of non-citizens, seems wholly untenable to me.

Just about everyone and his economist mother has researched the ways in which socioeconomic status correlates with health. In the United States, the fraction of people in 'excellent' or 'very good' health in the top income quartile hovers at around 40 percentage points above that in the bottom quartile. It doesn't take long to come up with all sorts of theories for this: varying access to health care, poor behavioral habits on the part of the poor, differing environmental factors, differing exposure to stressful events. But no one quite knows for sure.

Anyway, a new RAND paper by James P. Smith looks at this problem in greater detail, trying to figure out which aspects of socioeconomic status actually matter for health. For instance, he looks at the stock market gains during the 1990s—when many people became unexpectedly wealthy—and suggests that income and wealth gains, by themselves, do not decrease the chance of disease onset. This may not be an ideal sample set, though, since those who gained in the stock market were already fairly well-off.

On the other hand, education correlates remarkably well with better health; perhaps in part, Smith suggests, because people with higher levels of education can better manage complicated treatment regimens. An experiment involving a diabetes treatment seems to suggest just that. (Programs that forced patients to stick to the regimen had large health effects on the less-educated.) But as always, proving causation is another matter—why are educated people, apparently, better at self-management? Maybe they're more likely to have jobs with more free time. Whatever the reason, the "education effect" is significant, and persists even into old age.

Two other findings. First, Smith points out that the link between socioeconomic status and health may exist, in part, because the latter causes the former. The onset of a serious chronic disease, after all, really does take a pair of fists to a person's job and salary. So perhaps the "health gap" causes socioeconomic inequality, rather than the other way around. Second, and more importantly, a growing body of research suggests that economic circumstances during childhood have a serious bearing on health later in life. Still, no one knows exactly why, although theories abound. The importance of nutrition in the womb is one. Interesting fact: in the past, and even among adults alive today, life expectancy varied significantly with the season of birth. In the northern hemisphere, for instance, 50-year-olds who were born in October and November—and hence whose mothers had access to cheap and plentiful fruits, vegetables, and eggs during pregnancy—could expect to live about three-quarters of a year longer than those born in the spring. I don't know if that's still true for people growing up today, but it might be.

This guy just can't believe how much Hollywood hates America, freedom, and our brave soldiers - or something like that. From his hysterical TownHall.com column (sorry for the redundancy, but I want to be clear):

Here's the pitch: with box-office numbers trending down, studio executives are suddenly greenlighting movies they can describe to shareholders as 'controversial' or 'timely.' Whether the films are anti-American or otherwise demoralizing to the war effort is apparently immaterial. Its appetite whetted by "Fahrenheit 9/11"'s $222 million worldwide gross, Hollywood thinks it's found a formula for both financial security and critical plaudits: noxious anti-American storylines, bathed in the warm glow of star power.

Shocking! Hollywood to make films with an eye toward market demand! Just what sort of trend lines might these heartless profit-mongers be looking at?

Der Spiegel takes an inside look at the negotiations between the EU and Iran over the latter's nuclear program. At this point, the obstacles to progress are the same as they've always been: Europe's willing to hand out whole baskets full of carrots—and it has to, since it can't really wield any sort of stick—while Iran seems more interested in guarantees that Israel and the United States won't attack it. But the Bush administration has no interest in handing out any such guarantees. Why? In the New Republic this week, Michael Mazarr points out that a U.S. attack on Iran would likely provoke the latter to strike back: against oil fields, against U.S. military interests, against American civilians. Basically, no one would benefit from a conflict, and everyone would benefit from engagement. Now maybe Iran would, in fact, reject any sort of grand bargain with the United States, or any sort of engagement along the lines proposed by Kenneth Pollack. But that's no reason not to try. This whole Iran impasse is really one of the more inexplicable Bush administration actions over the past five years—and that's saying a lot.

Punching in numbers on the calculator—that's what the Center for Public Integrity's been up to lately (in case you were wondering), and they've recently discovered that lobbyists and other special interest groups spent nearly $1 billion in 2004 in statehouses around the country. Now that figure may be hard to picture in the abstract, but it comes out to five lobbyists and $130,000 per legislator, influence that's hard to resist. Certainly, then, legislatures ought to take CPI's recommendations for "revolving door" and disclosure law changes seriously.
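CPI's per-legislator figure is easy to sanity-check. The legislator count below is my own rough assumption (on the order of 7,400 state legislators nationwide), not a number from CPI's report, so treat this as a back-of-the-envelope sketch:

```python
# Rough check of CPI's "per legislator" figure.
# Assumption (not from the report): roughly 7,400 state legislators nationwide.
total_spent = 1_000_000_000   # ~$1 billion in statehouse lobbying, 2004
legislators = 7_400

per_legislator = total_spent / legislators
print(round(per_legislator))  # ~ $135,000, in the ballpark of the ~$130,000 cited
```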

But all that money—can it ever be curbed? Probably not. Special interests will always be among us. On campaign finance, at least, I agree with the Heritage Foundation—there's no way to limit the flow of money; it always finds a way. The 2004 election proved that, and recently passed federal legislation, from the energy bill to the bankruptcy bill, proved that McCain-Feingold didn't make Congress any less willing to jump in bed with big business. Meanwhile, publicly financed campaigns, higher congressional salaries, and other ideas for limiting the demand for money may make "clean" elections a reality someday, but it seems very likely that no one will ever eradicate the horde of lobbyists hanging around state capitols and D.C., where the real action takes place. CPI's proposed reforms, however nice, amount to one finger in a very leaky dike.

One thing to note, however, is that not all "special interests" should be painted with the same broad brush, as CPI tends to do. Corporations will try to buy influence—tax breaks, subsidies, loosened workplace restrictions—and labor unions will push right back and try to stop them. Both are "special interests," yes, but it's pretty clear that they're not the same. Without hordes of lobbyists from groups like the AFL-CIO or the NAACP over the years, progressive change and liberal social reform in this country would have been much diminished. So as useful as new restrictions on lobbying may be—at least to get much of this influence-peddling into the sunlight—I'm not sure that a government free of "special interests" would necessarily be a good thing.

Nadezhda suggests an oft-neglected point in this post, which I'm taking slightly out of context. Right now, in Iraq at least, the operating assumption is that the Sunnis are our bitter foes, and Salafist groups like al-Qaeda pose an existential threat to the United States, while the Shiites are our friends and natural allies. That may turn out to be true: the Shiites could indeed spearhead a "reformist" element within Islam. On the other hand, no one can predict the future, and ten years down the line, it could be Shiites—perhaps backed by, oh I don't know, Iran and our erstwhile allies in Iraq—that are attacking U.S. interests, the Sunnis our natural allies, and the CIA could be kicking itself for arming and training Shiite militias in Iraq. Who knows? It's one of the reasons why arming one group to fight another always seems like a short-sighted and potentially disastrous idea. And that's exactly why the neoconservatives opposed that sort of strategy. But then engaged in it anyway. Oh, whatever.

On a slightly different note, the Times is carrying this story about a Shiite coup in Baghdad today—SCIRI, one of the Shiite parties running the new government, ousted the mayor and installed one of its own militiamen. It's a situation in which it's hard to claim that the ousted mayor, installed as he was by coalition forces, somehow has more legitimacy than the thugs who ousted him; but, on the other hand, no one wants this sort of thing to become a regular feature of life in Iraq. At the same time, it's pretty clear that U.S. intervention in the matter would prove none too popular.

False Concern

On Monday Rep. Curt Weldon (R-PA) brought a New York Times reporter into his Congressional office to meet with an unidentified intelligence official who said: 1) that there is a top-secret military intelligence group called "Able Danger"; 2) that this group knew in the summer of 2000 that Mohammed Atta and several other participants in the September 11, 2001, attacks were not only members of al-Qaeda but present in the United States as well; and 3) that the group hesitated to pass on the information to law enforcement agencies because of prohibitions against foreign intel agencies spying on citizens and green card holders.

But the same article very clearly says that none of the four terrorists whom 'Able Danger' fingered in 2000 had green cards. That's confirmed by the 9/11 Commission: they all had some form of tourist visa. Yet Weldon seems confused on the point. In June a local paper quoted him as saying, "Because the men had green cards, they couldn't touch them." And Government Security News, a biweekly newsletter that reported the story on Monday, seems to have bought what was apparently Weldon's line until very, very recently: the men were untouchable because they had green cards.

So when that statement became inoperative, the new line was that the damn lawyers wouldn't let the intel folks tip off the FBI or other domestic law enforcement because they had a "sense of discomfort" about breaking some spirit of the law. This is, as Robert Novak would say, bullshit. First, the law is clear: citizens and permanent residents (the formal term for green card holders) get this protection, while people on holiday or business trips don't. Second, as the Times op-ed page deliciously points out today (how's that for timing?), the law is violated all the time. Third, according to Human Rights First, getting exemptions to the law for people suspected of being foreign agents isn't that tough.

I have little trouble believing that there is something called 'Able Danger,' or that it knew about these four men well before the attacks. And it certainly would be consistent with most views of pre-attack intelligence operations that the information wasn't shared. But the rationale for why the names weren't passed on just doesn't have legs. So why float this balloon now? Perhaps Weldon and others are interested in further watering down protections against spying on citizens and permanent residents. Or maybe it's just nice to blame the lawyers and civil-libertarians rather than the intelligence community or Bush's (and, yes, Clinton's) lax approach to al-Qaeda. Or maybe he's just full of it.

Max Sawicky is so very right about this:

There are no tax cuts. Banish that phrase from your mind. You haven't seen any. Republican control of the White House and Congress has yielded trillions in tax increases since January of 2001. How can this be? Simple. When you spend more, and when you pass laws that commit the government to spending more in the future, you increase taxes, sooner or later. Spending not financed by current taxes will be financed by future taxes. A debt increase is the present value of future increased taxes. If taxpayers merely pay interest on the debt incurred, forever, the present value of the interest payments is the initial increase in debt.
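Sawicky's present-value point can be checked with a quick sketch. If the government borrows D and pays interest r·D forever, discounting those payments at the same rate r gives a present value of exactly D; the figures below are illustrative, not from his post:

```python
# Present value of a perpetual stream of interest payments on new debt.
# If debt D is never repaid and interest r*D is paid each year forever,
# discounting at the same rate r gives PV = sum over t of r*D/(1+r)^t = D.
D = 1_000_000_000  # illustrative increase in debt
r = 0.05           # illustrative interest/discount rate

# A long finite horizon approximates the infinite series to within rounding.
pv = sum(r * D / (1 + r) ** t for t in range(1, 2000))
print(round(pv))  # approaches D itself: deferred taxes equal the debt incurred
```

In other words, the "tax cut" plus the borrowing nets out to zero; only the timing, and, as the next paragraph notes, the distribution, changes.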

On the other hand, that's not all, strictly speaking, true. Some people will get genuine tax cuts—and guess who they are? The Center on Budget and Policy Priorities ran the numbers on this a while back, and found that if the government cuts taxes and then increases spending, the ultimate burden of these virtual "tax increases" would fall on middle- and lower-class families. That is: a tax cut, followed by the sort of deficit spending Bush and the Republicans are so fond of, followed by progressive financing of the deficit, would ultimately make households in the bottom and middle quintiles worse off, in total, by a couple hundred bucks, while those making over $1 million would come out ahead in the end, by some $60,000 apiece, on average. Not a bad deal, when all is said and done. So yes, a few Americans are getting tax cuts, even under Sawicky's definition.

Businessweek has an interesting story about corporations that are trying to engage in "social responsibility" in order to stay competitive:

Take Sewell Avant. The 25-year-old senior procurement analyst graduated from the Georgia Institute of Technology in 2002. During college, he cleaned churches and did regular social projects with fraternity brothers. Now he's carrying on that tradition at Home Depot. He took a day off, without pay, to help mix concrete at the playground project in Marietta. His entire department will do more kiddie-park construction on a weekend in August. For Avant, volunteering adds meaning to his day-to-day job. "Employees are trying to marry their work and nonwork lives. If the company gives them a chance to do that, then they're happier," says C.B. Bhattacharya, associate professor of marketing at Boston University's School of Management.

That's why younger companies are baking the social responsibility concept into their culture -- and demanding investors accept the cost. Costco Wholesale Corp. has long offered generous compensation to its workers, to the scorn of Wall Street and the detriment of its stock price. In the 1980s, networking giant Cisco Systems Inc. opened its first office in East Palo Alto, Calif., a run-down neighborhood amid the prosperity of Silicon Valley. Cisco Chairman John Morgridge worked as "principal for the day" at a school next door. "We're in business to get results. This is just a different currency," says Tae Yoo, Cisco's vice-president for corporate affairs.

Interesting, though it seems unlikely that the "social responsibility" trend will spread too far. Not so long as Wall Street continues to punish any sort of behavior that deviates from profit-maximization. A few rogue CEOs here and there, like Costco's James Sinegal, will have the nerve—and ability—to buck the stock market, but they seem the exception rather than the rule. From a political point of view, meanwhile, some corporations may be getting antsy at the fact that most voters—including many conservative voters—increasingly distrust the power of large corporations, and that fear could spark an outburst of "social responsibility." But in truth, business interests have very little to fear from a political backlash—not so long as they have Congress by the thumbs.

And at most, consumer activists can only train their attention on a small subset of corporations at any given time; so the relatively few companies under fire, like Nike and, perhaps someday, Wal-Mart, may clean up their practices in order to sidle out of the spotlight, but I'm not sure that all adds up to a growing trend. That's not to disparage the companies that are becoming kinder and gentler; it's just to say that corporations seem very unlikely to act against their bottom line except in scattered, haphazard cases.

The ICG's John Prendergast—who was interviewed a while back here at Mother Jones—had an important op-ed on Darfur yesterday that knocks down the idea that the genocide will somehow just go away on its own: "The crisis in Darfur is deepening, not abating. New numbers from the United Nations reveal that 3.5 million Darfurians are in need of emergency aid, a sharp increase over what the misguided optimists expected. Mass rapes continue; lifesaving humanitarian aid is frequently blocked; and impunity for those responsible remains intact."

(Via Coalition for Darfur, which is also essential reading for the ongoing crises in Congo and Niger.)