Coming to America

I have a confession to make. Most of my ancestors came to America undocumented. They were without passports or even a green card, and they didn’t go through immigration processing.

That’s because most of ’em got here in the 18th century.

The ones we can document, anyway. Tracing the family tree backward through the generations, one sometimes hits a dead end. We’ll learn that William married Amanda in some Appalachian holler during the Andy Jackson administration but find no clue where Amanda came from or how long she and her folks had been here. My family were not exactly, um, aristocrats, so record-keeping was haphazard. But I do know that one branch can be traced back to some of the original Pennsylvania Dutch, arriving ca. 1710. Two of my great-times-four grandfathers fought in the Revolution. Generations of my foremothers bounced west on buckboards, gave birth in log cabins, and dug gardens in the virgin wilderness.

The latecomers included my mother’s mother’s grandparents, who arrived from Ireland shortly after the Civil War. And the absolute last guy off the boat was my father’s father’s father, William Thomas of Dwygyfylchi, Wales, who arrived ca. 1885. He married Minnie King, whose father Fielding King had marched through Georgia with General Sherman in a Missouri infantry volunteer regiment. We haven’t traced the generations of Kings back very far, however, so we have no idea when they came to America.

Anyway, this means none of them went through Ellis Island, which didn’t open for business until 1892. A few years ago, when Ellis Island became a national monument, the feds ran print ads with historic photos of Ellis Island immigrants. The captions claimed that the Ellis Island people “built America,” which pissed me off because that wasn’t true. By 1892 all of our major cities were already established; the transcontinental railroad was completed and running; the Midwestern fields cleared from the wilderness by my ancestors were well-tilled and filled with rows of corn. The Ellis Island people just filled the place in some, as far as I was concerned.

Well, OK, they filled it in a lot. Fifteen million immigrants arrived in America between 1890 and the outbreak of World War I in 1914. Earlier waves of immigrants had mostly come for the virtually free farmland, and they fanned out across the prairies and plains. But a large part of the fifteen million remained in cities and took factory jobs. They brought with them talent and industriousness but also crime and poverty and other problems that overwhelmed the cities. This in turn brought about a growth in government and a shifting of government programs from local to state to federal. For example, beginning in the 1910s the states, and eventually the feds, established “welfare” programs to relieve the destitution of immigrants; in earlier times, destitution had been dealt with by local “poor laws.”

Eventually they and their descendants assimilated to America, but it’s equally true that America assimilated to them. This is a very different country, physically and culturally, than it would have been had immigration been cut off in, say, 1886. The newcomers had not shared the experience of carving a nation out of the wilderness and fighting the Civil War. For a people often discriminated against, the Ellis Island-era immigrants were remarkably intolerant of African Americans and shut them out of the labor unions, making black poverty worse. And early state and federal welfare programs provided services only to whites. Immigrants literally took bread out of the mouths of the freedmen and their descendants, exacerbating racial economic disparities that we’re still struggling with today.

Much of American culture as it existed in the 1880s — the music, the folk tales, the way foods were cooked — was washed away in the flood of immigration and survived only in isolated places like rural Kentucky, where the descendants of colonial indentured servants still pretty much had the place to themselves. Here in the greater New York City area I am often dismayed at how much people don’t know about their own country. There are second- and third-generation Americans here who don’t know what a fruit cobbler is, for example. And as for knowing the words to “My Darling Clementine” or “Old Dan Tucker” — fuhgeddaboudit.

On the other hand, there are bagels. It’s a trade-off, I suppose.

I bring this up by way of explaining why I am bemused by some of the negative reactions to yesterday’s immigrant demonstrations. Yes, I realize there’s a distinction between legal and illegal immigrants nowadays. There is reason to be concerned about large numbers of unskilled workers flooding the job market and driving down wages — we learned a century ago that can be a problem. But the knee-jerk antipathy to all things Latino — often coming from newbies (to me, if you’re less than three generations into America, you’re a newbie) who aren’t fully assimilated themselves — is too pathetic. They’re worried about big waves of immigrants changing American culture? As we’d say back home in the Ozarks, ain’t no use closin’ the barn door now. Them cows is gone.

(I can’t tell you how much I’d love to confront Little Lulu and say, “Lordy, child, when did they let you in?”)

Near where my daughter lives in Manhattan there’s a church that was built by Irish immigrants. It is topped by a lovely Celtic cross. Now the parishioners are mostly Dominican. In forty years, if it’s still standing, maybe the priests will be saying masses in Cilubà, or Mandarin, or Quechua. Stuff changes. That’s how the world is. That’s how America is, and how it always has been. Somehow, we all think that the “real” America is the one that existed when our ancestors got off the boat. That means your “real” America may be way different from mine. Fact is, if we could reconstitute Daniel Boone and show him around, he wouldn’t recognize this country at all. I think they had apple pie in his day, but much of traditional American culture — baseball, jazz, barbecue, John Philip Sousa’s “Stars and Stripes Forever” — didn’t exist in Daniel Boone’s “real” America.

Latinos, of course, already are American, and in large parts of the U.S. Latino culture had taken root before the Anglos showed up. This makes anti-Latino hysteria particularly absurd, because Latino culture is not new; it’s already part of our national cultural tapestry. And who the bleep cares if someone sings the national anthem in Spanish? As Thomas Jefferson said in a different context, it neither picks my pocket nor breaks my leg. I’m sure the anthem has been sung in many languages over the years, because the U.S. has always been a multilingual nation. Along with the several native languages, a big chunk of the 19th-century European farmers who fanned out across the prairies and plains lived in communities of people from the same country-of-origin so they didn’t have to bother to learn English. And many of them never did. It’s a fact that in the 19th century, in many parts of the U.S., German was more commonly spoken than English.

Yes, maybe someday America will be an officially bilingual nation, and maybe someday flan will replace apple pie. Flan is good, and there are many multilingual nations that somehow manage to make it work — India, China, Belgium, and Switzerland come to mind. Even much of my great-grandpa’s native Wales stubbornly persists in speaking Welsh. Multilingualism doesn’t have to be divisive unless bigotry makes it so.

What’s essential to the real America — our love of liberty — is the only constant. And, frankly, it’s not illegal immigrants who are a threat to liberty.

Identifying Evil

Sometimes the worst evil is done by good people who do not know that they are not good. — Reinhold Niebuhr

Via Avedon — David Gerrold has written a post reflecting on the nature of evil. One of his points is that the way evil is usually portrayed on television and in the movies is phony.

People like to pretend — they like to pretend to be vampires and monsters and princesses and Vulcans and whatnots.

And that’s what most Hollywood evocations of evil are — people pretending, because they have no sense of the reality. That’s what was wrong with this particular recreation of the Manson Family; they played it like a bunch of teenagers giddily enjoying their own awfulness. …

… in this show, evil wasn’t portrayed as evil, but as a bunch of Hollywood actors pretending to be evil, chewing the scenery, baring their teeth, flashing their eyes, and practicing their wicked laughs — bwahahahaha. It was pretense.

Real evil looks very different from Hollywood evil.

Hannah Arendt, in Eichmann in Jerusalem: A Report on the Banality of Evil, her book about the trial of Adolf Eichmann, the architect of the Holocaust, writes of how she sat there day after day, trying to understand how a mild-looking human being could have authored such monstrousness. Ultimately, she coined the phrase “the banality of evil” to describe the essential thoughtlessness — i.e., without thought, without feeling, without compassion — that results in evil deeds. The monsters of the Holocaust weren’t monsters; they were acting without regard, without conscious awareness, without empathy, without connection to the larger spiritual realm of humanity.

For a long time I’ve noticed that when racists are portrayed in films they are nearly always depicted as people who are scowling (or smirking) and disagreeable all the time; think Rod Steiger in In the Heat of the Night. Yet in my experience — I grew up in an all-white redneck zone — racists can seem to be lovely people in any other context; they can be soft-spoken, considerate, and reasonableness itself except on the matter of race. It’s as if some part of their conscience were missing. It can be hard to grasp that nice Mr. Smith who voluntarily cuts the grass on the church lawn, or sweet Mrs. Johnson who bakes pies for the old folks’ home, would be capable of evil. Yet history tells us that a whole lot of “ordinary” people have taken part in evil acts in the past.

Gerrold writes, “I think evil occurs as a complex cocktail of forces.” I suspect most people are capable of evil if they get caught up in these forces. This is not an excuse for evil, but a warning to take care to recognize those forces and avoid them. People fall into evil because they don’t recognize evil as evil. They mistake it for justice, or righteousness, or even God’s Will.

“Evil does not see itself as evil,” writes Gerrold. “Those who commit evil acts do not see those acts as evil or even malicious. They see themselves as justified.” This is exactly right.

Osama bin Laden and his 9/11 flunkies believed their terrorist attack was righteous and justified, as did Tim McVeigh when he blew up the federal building. Even the all-time great evildoers like Hitler and Stalin and Mao no doubt rationalized their actions as serving a greater good.

A couple of years ago I argued that most of us think of evil as an intrinsic quality that some people have and others don’t, or at least have very little of. If you see evil that way, the next step is to assume that “evil” people are so dangerous and corrupted that “good” people are justified in whatever they do to get rid of them. Thus, “evil” and “good” people are different not because of what they do, but because of who they are. But when you start thinking that way, you’re opening the door to evil and inviting it in.

There’s no question that what took place in that prison was horrible, but the Arab world has to realize that the U.S. shouldn’t be judged on the actions of a…well, we shouldn’t be judged on our actions. It’s our principles that matter, our inspiring, abstract notions. Remember: just because torturing prisoners is something we did, doesn’t mean it’s something we would do. — Rob Corddry, The Daily Show

How many times have you heard a rightie say something like this:

The difference between you and me is that, deep down inside, you cannot accept the fact that there are truly evil people in the world. The difference between the liberal and conservative viewpoints boils down to this: you think that, deep down inside, the Islamic nutjobs really only want to have a nice house and a yard, and raise their children in a loving and safe environment, just like all the people you know. Whereas I think that they are truly evil people, like the Nazis, that want more than anything else to destroy all that we hold dear. And they are more than willing to sacrifice their lives, their families, everything in their hatred of all that is good and beautiful.

What most righties don’t understand about evil is how seductive it is. The seduction begins with the notion that “his hatred of me is evil, but my hatred of him is justified.” The fellow who wrote that paragraph may not yet be completely besotted with evil, but he is sure as hell flirting with it.

I say evil is as evil does. It’s not who you are; it’s what you do that is evil. Or not.

Again, I’m not saying that evil acts should be forgiven, or that people shouldn’t defend themselves from evil or seek to apprehend or even destroy dangerous people before they can harm others. I’m just saying that as we do these things, we must take care not to be seduced by evil ourselves. And that’s hard. It takes a lot of self-honesty and self-discipline.

And it takes recognizing evil as evil. Evil doesn’t wear a big E on its T shirt. Evil can seem to be virtuous. It flatters your ego. And it can feel really good.

See also: Jill at Feministe, “God and Abortion Rights.”

Crabs in a Barrel

At the Washington Post, Eugene Robinson writes about “The Meltdown We Can’t Even Enjoy.”

It’s frustrating. The three overlapping forces that have sent this country in so many wrong directions — the conservative movement, the neoconservative movement and the Republican Party — are warring among themselves, doing their best impression of crabs in a barrel, and sensible people can’t even enjoy the spectacle. That’s because it’s hard to take pleasure in the havoc they’ve caused and the disarray they will someday leave behind.

“Crabs in a barrel” — what perfect imagery! Can’t you just imagine all the righties, all the Bush culties and fundies and neocons and Big Gubmint-hating quasi-libertarians confined together within their shared lies and resentments? And as the reality of their failed ideologies closes in, see how they pull in their eyestalks and scramble for whatever crumbs of self-validation they can find?

Today the Right Blogosphere is swarming over the critical news that Borders Books refuses to stock a magazine that published the Danish Mohammed cartoons. Other recent blogswarms involved displays of the Mexican flag. For the past couple of days righties have labored mightily to assure themselves that the opinions offered by some retired FISA judges were the opposite of what the judges actually said. They’re still picking through the intelligence garbage dumped by John Negroponte. John Podhoretz of the National Review criticizes the just-released Jill Carroll for not being anti-Muslim enough. And for the past several days a number of them, led by John Fund, have been obsessing over a former Taliban member enrolled at Harvard.

Crumbs, I say. The same people who spent the past several years congratulating each other for their grand “ideas” are running (sideways) from big issues as fast as their scaly little legs can scramble. Robinson continues,

It would all be entertaining if the stakes weren’t so high. Iraqis and Americans are dying; the treasury is bleeding; real people, not statistics, are at the center of the immigration debate. Iran is intent on joining the nuclear club. Hallowed American traditions of privacy, fairness and due process are being flouted, and thus diminished.

Jay Bookman of the Atlanta Journal-Constitution takes a gloomier look at the Big Picture:

It’s not merely that the Bush administration has run aground on its own illusions. The real problem runs deeper, much deeper, and at its core, I think, lies the fact that out of fear and laziness we insist on trying to address new problems with old ideologies, rhetoric and mind-sets.

To put it bluntly, we don’t know what to do, and so we do nothing.

Run through the list: We have no real idea how to address global warming, the draining of jobs overseas, the influx of illegal immigrants, our growing indebtedness to foreign lenders, our addiction to petroleum, the rise of Islamic terror . . .

Those are very big problems, and if you listen to the debate in Congress and on the airwaves, you can’t help but be struck by the smallness of the ideas proposed to address them. We have become timid and overly protective of a status quo that cannot be preserved and in fact must be altered significantly.

The Republicans, for example, continue to mouth a cure-all ideology of tax cuts, deregulation and a worship of all things corporate, an approach too archaic and romanticized to have any relevance in the modern world, as their five years in power have proved.

The GOP’s sole claim to bold action — the decision to invade Iraq in the wake of Sept. 11, 2001 — instead epitomizes the problem. The issue of Islamic terrorism is complex and difficult, and by reverting immediately to the brute force of another era, we made the problem worse.

Yet in recent years the Dems in Washington have offered little else but tweaks to the Republican agenda.

It’s not as if the big, bold ideas needed to address our real problems don’t exist. Sure, they exist — among people with no power to implement them. And thanks to the VRWC echo chamber, those people are painted as dangerous, radical, impractical loonies by just about everyone in both parties and in major news media. Eugene Robinson calls on the Dems to “put together an alternative program that will begin to undo some of the damage the conservative-neocon-GOP nexus has wrought.” But the party as it exists now hardly seems capable of such a challenge. It’s too compromised, too tired, too inbred.

What’s a progressive to do?

As a practical matter, the way Americans conduct elections makes third parties irrelevant. If we had run-off elections or a parliamentary system, I’d say abandon the Dems and form something new. But our system marginalizes third parties; there’s no way around that. Our only hope is to reform the Dems.

Meanwhile, conservatives are being challenged to choose between loyalty and principle. On the Blogosphere, loyalty seems to be winning out. And the righties scurry to hide inside fantasies that George W. Bush is a great leader, and the majority of the American people are still behind him. Snap snap snap.

War Powers

I’ve been thinking about comments to the last post regarding war powers and presidents. Seems to me that if we ever take our country back from the wingnuts we’ve got to revisit the issue of war powers.

First, we need to rethink war itself. How do we distinguish a “state of war” from a “military action”? Is the U.S. in a state of war every time any American soldier somewhere in the world is under fire? Is the U.S. in a state of war when, for example, American military personnel take part in a NATO action such as the one in Kosovo?

You might remember this CBS interview of Condi Rice by Wyatt Andrews (November 11, 2005):

QUESTION: Madame Secretary, thanks for joining us. I want to start with the Congressional investigation into “exaggerated intelligence.” Why the counter-offensive? Mr. Hadley was out yesterday. The President seems to be out today. Why the counter-offensive?

SECRETARY RICE: Well, this is simply a matter of reminding people of what the intelligence said about Iraqi weapons of mass destruction, about the fact that for 12 years the United Nations passed Security Council resolution after Security Council resolution calling on Saddam Hussein to cooperate about his weapons of mass destruction, about report after report, after report that talked about the absence of any data on what he had done with these weapons of mass destruction and calling on him to make a full account, the fact that we went to war in 1998 because of concerns about his weapons of mass destruction.

So Saddam Hussein and weapons of mass destruction were linked. The Oil-for-Food program —

QUESTION: Madame Secretary, I don’t mean to interrupt, but you said ’98. Did you mean ’91?

SECRETARY RICE: No, I mean in ’98 when there was —

QUESTION: You mean the cruise missile —

SECRETARY RICE: That’s right, when there were cruise missile attacks to try to deal with this weapons of mass destruction, and the fact that he was not cooperating with the weapons inspectors. And of course, you can go back to ’91 when we found that his weapons programs had been severely underestimated by the IAEA and others. So I think that is what people are reminding us, what the intelligence said prior to the war.

Did you know we were at war in 1998? It slipped right by me. But lo, here’s an article on the Weekly Standard web site about the glorious “Four Day War” of 1998.

Of course, the Weekly Standard probably didn’t call it a Four Day War in 1998. I’m not a Weekly Standard subscriber and don’t have access to their archives, so I don’t know for sure. But considering that this glorious little war began on December 16, 1998, and that the House voted to impeach President Clinton on December 19, 1998, it seems some people weren’t exactly standing behind their President in time of war. Or maybe they were behind him, but they had knives out at the time.

In fact, Condi and the Weekly Standard were both taking part in a “Clinton did it too” propaganda effort designed to blame President Clinton for President Bush’s “mistake” about the WMDs. The Four Day War, not recognized as such at the time, was declared retroactively for political expedience. My point is that wars are getting awfully subjective these days. Right now most of us on the Left think of the War on Terror as a metaphor, but righties see it as a real shootin’ war, by damn, just like WWII. If John Wayne were alive he’d be makin’ movies about it already.

When is a war, a war? We know there’s a war when Congress declares war, but what about undeclared wars?

I wrote about this last December, and our own alyosha added an excellent analysis in the comments that deserves another read. In a nutshell, it appears we’re moving into a new phase of history in which wars between nations will be rare. Instead, “wars” will be waged by decentralized organizations with no fixed national boundaries or territories. Such wars won’t have recognizable ends, because there won’t be a surrender or a peace treaty. It’s likely we’re going to be involved in some level of military actions against such organizations pretty much perpetually for the rest of our lives. What seemed to be a state of emergency after 9/11 is now the new normal.

I believe we need to re-think constitutional war powers in light of this new reality.

The Constitution [Article I, Section 8] says Congress shall have power

To declare War, grant Letters of Marque and Reprisal, and make Rules concerning Captures on Land and Water;

To raise and support Armies, but no Appropriation of Money to that Use shall be for a longer Term than two Years;

To provide and maintain a Navy;

To make Rules for the Government and Regulation of the land and naval Forces;

To provide for calling forth the Militia to execute the Laws of the Union, suppress Insurrections and repel Invasions;

To provide for organizing, arming, and disciplining the Militia, and for governing such Part of them as may be employed in the Service of the United States, reserving to the States respectively, the Appointment of the Officers, and the Authority of training the Militia according to the discipline prescribed by Congress …

On the other hand, here is what the Constitution says about the President (Article II, Section 2):

The President shall be Commander in Chief of the Army and Navy of the United States, and of the Militia of the several States, when called into the actual Service of the United States.

Now, I interpret that to mean that the President’s military role is subordinate to Congress’s military role. In any event, Congress is supposed to be the part of government that decides whether we’re at war or not. But then there’s the pesky War Powers Act, which says,

The constitutional powers of the President as Commander-in-Chief to introduce United States Armed Forces into hostilities, or into situations where imminent involvement in hostilities is clearly indicated by the circumstances, are exercised only pursuant to (1) a declaration of war, (2) specific statutory authorization, or (3) a national emergency created by attack upon the United States, its territories or possessions, or its armed forces.

Dahlia Lithwick provides background on the War Powers Act here. See also John Dean. Both of these articles were written immediately after 9/11, before we were publicly talking about invading Iraq.

For the moment I’m putting aside consideration of how closely Bush is adhering to the War Powers Act provisions. Instead, I just want to suggest that Congress revisit the statutory authorization thing so that future presidents can’t fear-monger the nation into a war that drags on for years after the original causes of the war were found to be smoke and mirrors.

It’s one thing to give a President some room to maneuver in case of emergency, when he has to act to protect the United States and its territories before Congress can get itself together to declare anything. But when there is no emergency, especially no emergency to the territory of the United States, I see no reason for Congress to hand off its war-declaring powers to the President. No more undeclared wars. If someone can think of a reason this would be a bad idea, I’d like to hear it.

If the President wants to use some limited military action — say, a four-day bombing campaign — Congress can give permission — a “use of force” resolution — but Congress should stipulate limits (in time or resources, or both), and it must be made clear that this resolution is not equivalent to a war declaration and the President is not to assume any special “war powers.” That is, he’s not to assume any powers the Constitution doesn’t give him in peacetime.

What if a President declares an emergency and starts a war per clause 3 in the paragraph above, but Congress looks on and says, WTF? There’s no emergency! There should be some way for Congress to rein in the President in this circumstance — Sorry, no emergency! You’ve got so many days to bring the troops back, or it’s mandatory impeachment! I’m not sure how that would be done, but it’s clear we need to provide for it to keep future Bushes in check.

Regarding presidential war powers — the Constitution makes no provision for presidents to take on extra powers during war. In the past, some presidents have taken extraconstitutional actions when they believed it was necessary to save the nation from an enemy or insurrection. Lincoln’s suspension of habeas corpus is the standard example. People still argue whether he was justified in doing so, but the circumstances were extreme — citizens, not just armies, were shooting and killing each other and were also shooting at militia called to Washington to protect the capital. In some places civil authority had completely broken down. And Lincoln acted openly, not secretly, and he made it clear he was only taking this action without prior consent of Congress because Congress was not in session and the emergency was dire. When Congress came back into session Lincoln requested approval for his actions. The power he had used rightfully belonged to Congress, and he didn’t claim otherwise.

Bush, on the other hand, acts in secret, and usurps powers of Congress when Congress is in session. There’s no excuse for that unless the threat to the nation is immediate — a mighty enemy navy is about to land in Oregon, for example. Otherwise, he is obliged to work with Congress and abide by laws written by Congress, as I argued in the last post. Otherwise, he’s setting himself up to be a military dictator.

So on the first day of the Post-Bush era, when we have a new birth of freedom and can begin to function as a real democracy again, we should come up with some laws — maybe even a constitutional amendment to be sure it sticks — that will be binding on future presidents and congresses. Vietnam might have been a fluke, but Vietnam and Iraq in one lifetime reveal a flaw in the system that needs correcting.

Don’t Be a Tool

I want to thank alert reader Jim Murphy for sending this photo. What a hoot. Does the Weenie think he’s really fooling anybody?

This photo also made me think about something Sidney Blumenthal wrote in the Guardian article I discussed in the last post. “In a recently published hagiography on the theme of Bush-as-Prince-Hal, Rebel-in-Chief, written by the rightwing pundit Fred Barnes, Bush explained to him that his job is to ‘stay out of minutiae, keep the big picture in mind,'” Blumenthal writes.

So why all the photo ops that accomplish nothing but PR for Bush? What “big picture” does Bush have in mind with the toolbelt?

Years ago I heard some Republican pundit say that the difference between presidents Carter and Reagan was that Carter got bogged down in the details of operating the ship, but Reagan stayed at the wheel and steered. Well, our Dubya acts as if he’s just a passenger on a luxury cruise, and he spends most of his time rolling dice in the ship casino.

Recently Margaret Carlson wrote about the United Arab Emirates port deal [update: which may no longer be an issue, as the UAE is divesting itself of US holdings]:

George W. Bush believes in delegating, and delegate he did, to the Committee on Foreign Investment in the United States, a multiagency body created in 1975 to assess the security risks when foreigners want to invest in this country. The commission has turned down one deal out of 1,500. …

… The commission is supposed to buck decisions of this magnitude up to the president for final review. That didn’t happen here, but the president doesn’t mind. He wouldn’t have done things differently if he had been consulted.

But the decision alarmed just about everyone else, including Bush’s most loyal lieutenants. Bush has always boasted that his steely judgment is what we want in a crisis, even though it failed him when confronted with an intelligence report headed, “Osama bin Laden Promises to Strike Inside U.S.” …

…Republican Senator Lindsey Graham came out against the sale, and Tom DeLay, the former majority leader, warned the president that he had made a “huge mistake” that Congress would overturn.

Bush sees this as just another one of those details that a big-picture CEO, who prides himself on an empty in-box, isn’t supposed to trouble himself with.

It’s like the detail of whether he was cutting back on funding for alternative energies at the moment he was announcing in his State of the Union speech that he was doing the opposite. The wind- and solar-power lab in Colorado where the president spoke two days ago had to hastily rehire 32 researchers fired because of Bush budget cuts, so as not to embarrass the president who’d come to speak about getting over our “addiction to oil.”

Can He Mean Chertoff?

He calls the lack of money to back up his proposal a “mixed message.” Others might see it as hypocrisy or worse. Alternative energy is going to get as much traction in Bush land as the mission to Mars he announced in his 2004 State of the Union address.

To Bush, this is an instance where the big picture is concern for an ally and global trade trumps other things. Besides, he says, the Department of Homeland Security will be riding herd on the Dubai crowd.

Can he mean Homeland Security Secretary Michael Chertoff, the so-called smart one raked over the coals for his disgraceful handling of Katrina by Republican Senator Susan Collins last week, the one who couldn’t do his job because of Brownie — or was it the other way around?

There is no professional who knows what Chertoff is doing in charge of homeland security. The department Bush built from scratch is a disgrace, largely because to Bush all civil servants are bureaucrats and the government a pinata to be hit until all the goodies are disgorged.

There’s a distinction between seeing the big picture and being totally clueless, but this distinction seems to elude the President. He acts less like a CEO than a dim-witted aristocrat who needs a body servant to tie his shoes.

Bush has no idea what a President does, but with the right props and lighting he can look as if he’s doing a heck of a job.

Must Read

Today’s Bob Herbert column, courtesy of True Blue Liberal —

Eisenhower delivered his farewell address to a national television and radio audience in January 1961. “This conjunction of an immense military establishment and a large arms industry is new in the American experience,” he said. He recognized that this development was essential to the defense of the nation. But he warned that “we must not fail to comprehend its grave implications.”

“The potential for the disastrous rise of misplaced power exists and will persist,” he said. “We must never let the weight of this combination endanger our liberties or democratic processes.” It was as if this president, who understood war as well or better than any American who ever lived, were somehow able to peer into the future and see the tail of the military-industrial complex wagging the dog of American life, with inevitably disastrous consequences. …

… The way you keep the wars coming is to keep the populace in a state of perpetual fear. That allows you to continue the insane feeding of the military-industrial complex at the expense of the rest of the nation’s needs. “Before long,” said Mr. Jarecki in an interview, “the military ends up so overempowered that the rest of your national life has been allowed to atrophy.”

Be sure to read the whole thing.

Touching Innocence

This is a sorta kinda followup to the last post, which discussed matters of life and death, space and time, religion, law, morality, and what it is to be human. Which was a tad ambitious now that I think about it. But I request that people not add comments disagreeing with this post until they’ve read that one. This will save us both a lot of time.

Anyway, I see that some righties are upset about a British court ruling that will allow physicians to impose a “do not resuscitate” order for Baby Charlotte, a desperately ill two-year-old, against the wishes of her parents.

The rightie blogger of Stop the ACLU asks,

Is this the direction America is headed? Is this where the ACLU, and the “right to die” folks will take us?

Kim Priestap of Wizbang blames socialized medicine:

Baby Charlotte’s health is fragile normally, so she will go through health scares like this again. This will cost Britain a lot of money. Since Britain has a nationalized healthcare system, funded by taxpayer money, it’s in the state’s best interest to let her die.

What’s going on here? As a mother myself I’m very uncomfortable when government interferes with family decisions like this. I tend to think that when the family is agreed that the patient should be resuscitated, the doctors should respect the decision and not involve courts. I don’t know enough about Baby Charlotte to be able to judge whether there is a compelling reason to make an exception in her case. I infer from news stories that the doctors consider her case to be hopeless and that keeping her alive is just making her suffer. And her parents see things very differently.

I argued in the last post that humans need to struggle with hard choices. When governments or other institutions swoop into our lives and make our choices for us, it makes us less human. And this is true even when we make “bad” choices (within the law, of course). Our decisions may be less important than the process we go through to make them. So in that respect I’m sympathetic to the rightie point of view.

However … the title of this post doesn’t refer to Baby Charlotte. It refers to the righties who are oh, so innocent of the facts of life and death these days.

Nearly a year ago us “culture of death” liberals took up the cause of Sun Hudson, a Texas baby whose life support was terminated against family wishes. Although their diagnoses may differ, the legal situations of Baby Sun and Baby Charlotte seem to me to be nearly identical. If anything, Sun’s case was more extreme than Charlotte’s. His mother (father unknown) wanted aggressive medical care to continue, but the law sided with physicians who decided enough had been enough. Baby Sun’s breathing tube was removed on March 15, 2005, and he died of asphyxiation within minutes.

My understanding is that Sun Hudson’s prognosis really was hopeless. But then, so was Terri Schiavo’s.

Sun Hudson died three days before Terri Schiavo’s feeding tube was removed for the last time. Some of you might recall that righties got a tad excited about the Schiavo case. However, they were mostly silent about Sun Hudson — slipped their attention, I guess. Were it not for liberal blogs I wouldn’t have heard about Sun Hudson either.

Why were righties so oblivious to the Sun Hudson case? One explanation is that the law that allowed his life to be terminated had been signed by then-Governor George W. Bush.

The federal law that President Bush signed early yesterday in an effort to prolong Terri Schiavo’s life appears to contradict a right-to-die law that he signed as Texas governor, prompting cries of hypocrisy from congressional Democrats and some bioethicists.

In 1999, then-Gov. Bush signed the Advance Directives Act, which lets a patient’s surrogate make life-ending decisions on his or her behalf. The measure also allows Texas hospitals to disconnect patients from life-sustaining systems if a physician, in consultation with a hospital bioethics committee, concludes that the patient’s condition is hopeless.

Bioethicists familiar with the Texas law said yesterday that if the Schiavo case had occurred in Texas, her husband would be the legal decision-maker and, because he and her doctors agreed that she had no hope of recovery, her feeding tube would be disconnected. [Knight Ridder]

The Sun Hudson story came out just as the VRWC media echo chamber was working overtime to promote George W. Bush as a champion of life. Faux News’s Bill O’Reilly first commented on the Sun Hudson story before he discovered the Bush angle, forcing him to flip-flop harder than a trout on a hot pier. While Terri Schiavo’s parents were depicted as noble and pure of heart, Sun Hudson’s mother became a deranged black woman who couldn’t face reality. Never fear; O’Reilly had flip-flopped back by April when he attacked the ACLU for (perhaps) being behind “infanticide for impaired babies.”

Let’s go back to the Texas Advance Directives Act of 1999, which is the law under which Sun Hudson’s life was terminated. Put very simply, the law allows a health care facility to discontinue life support against the wishes of the patient’s family. The law requires the facility to jump through a number of hoops before it can do this, which ensures there is an overwhelming medical consensus that the patient’s condition is hopeless before the plug is pulled. The family has the option of finding another medical facility willing to continue life support. But other medical facilities are unlikely to take such a patient, especially if the patient will be a drain on the budget.

In other words, if the family is wealthy enough to pay the costs of Grandma’s care and make a generous contribution to the hospital building fund, Grandma lives. If the family’s insurance is capped and they’ve already spent the second mortgage to pay her medical bills, she dies. To paraphrase (well, OK, mock) Kim Priestap of Wizbang (see above), Grandma’s care will cost hospitals a lot of money, so it’s in their best interest to let her die.

There are two issues to be addressed here, both involving rightie inability to face reality. The first is regarding health care and how it is paid for. Just about every nation on earth affluent enough for most citizens to own a microwave has some kind of national health care system. The exception is the United States. In a recent Newsweek column, Jane Bryant Quinn (hardly a socialist) said that America’s health-care system is turning into a lottery.

The winners: the healthy and well insured, with good corporate coverage or Medicare. When they’re ill, they get—as the cliche goes—”the best health care in the world.” The losers: those who rely on shrinking public insurance, such as Medicaid (nearly 45 million of us), or go uninsured (46 million and rising).

To slip from the winners’ circle into the losers’ ranks is a cultural, emotional and financial shock. You discover a world of patchy, minimal health care that feels almost Third World. The uninsured get less primary or preventive care, find it hard to see cardiologists, surgeons and other specialists (waiting times can run up to a year), receive treatment in emergencies, but are more apt to die from chronic or other illnesses than people who pay. That’s your lot if you lose your corporate job and can’t afford a health policy of your own.

Years ago there was a joke in circulation that said a conservative is a liberal who got mugged. The new joke is that a liberal is a conservative who’s lost his health insurance.

The point is that all the evil, inhumane things going on in Other Countries That Have Socialized Medicine are happening here, too. Righties just refuse to acknowledge them. Among those Other Countries, Britain is a good “bad example” because they’ve underfunded their system for years. Meanwhile, we in the U.S. spend far more per capita than other nations (see this report in PDF format; note especially Figure 1 on page 3) but we’re getting worse results (see Table 1, page 4). By some measures we’re getting even worse results than those cheapskate Brits.

And the moral is, people whose health care system is a broken-down mess shouldn’t be pointing fingers at other people’s health care systems.

The other issue I see here is the touching innocence of righties regarding hopelessly terminal patients. Physicians have made decisions not to aggressively treat hopeless patients, especially suffering hopeless patients, since Hippocrates. Generally they’ve done it quietly and without drawing attention to themselves, but they’ve done it. For example, since the 19th century physicians have prescribed larger and larger doses of opiates to ease the pain of dying patients, knowing that eventually the dosage will be fatal. And as far as the family ever knew, it was the cancer that killed Grandpa, not that last dose of morphine.

Just about any health care professional will confirm this. I’m sure such decisions are being made all over America even as you read this.

The reason we’re hearing about such cases these days is, IMO, multifold. First, in the past medicine wasn’t all that effective. It was easy for doctors to make a show of “doing all we can” because in truth there wasn’t a whole hell of a lot they could do. But now we can do so much more. We have medical technology that can keep a body alive even when the person that body once sustained has long since dissipated, as in Terri Schiavo’s case. The line between life and death itself has blurred.

Second, because of the technology, more and more families refuse to accept a hopeless prognosis. I understand even anencephalic babies are sometimes put on life support these days, even though those babies have no hope of survival. In earlier times, the only choice offered parents would have been whether they wanted to hold the baby while it died, or not.

And third, mass media and our “reality TV” culture make sure the more controversial decisions get global publicity. In earlier times, these matters wouldn’t have been discussed outside the family. Today, people with less than a half-assed idea of the facts can plaster their uninformed opinions all over the Web.

As individuals, as a nation, as a society, as a species, we’ve got hard choices to make. These choices involve ourselves and our loved ones. We need to make some mature, non-politicized judgments about how to pay for health care. We must think rationally about how much of our health-care resources should be spent on futile care. We need non-hysterical discussion about if, or when, governments should intervene in family decisions. These are all complex issues. Reasonable people will disagree on many points. But we’re going to get nowhere until we’re able to face some hard realities.

Which means we’re going to get nowhere as long as righties dominate the discussion.

When Life Begins, or Not (Formerly Chao-chou’s Dog Has Puppies)

I see that Lance Mannion has taken up the question of when “life” begins. I see that Shakespeare’s Sister mostly agrees with Lance; Jedmunds of Pandagon mostly doesn’t.

Now I want to confuse everyone by arguing that “when life begins” is the wrong question. It’s the wrong question because life doesn’t begin. Or, at least, it hasn’t begun on this planet in a very long time. However life got to Earth — between 3 and 4 billion years ago, I believe — once established, it hasn’t been observed to “begin” again. It just continues, expressing itself in countless forms. The forms come and go — in a sense — but not life itself.

It will be argued that fertilization marks the beginning of a unique individual and is, therefore, a significant moment in the life process — the point when a life begins. But let’s say a couple of weeks later the egg divides into twins or triplets. Did those individuals’ lives begin with the conception? Or, since they didn’t exist as individuals at conception, is the cell division something like an existential reboot?

Further, in the grand scheme of things, is any one moment really separable from all the other moments, the couplings, the countless episodes of cell mitosis going back to the first stromatolites and microbes and macromolecules to the beginning, which is beginningless as far as I know, considering that a stray enzyme at any point over billions of years would have resulted in you being a lungfish?

I don’t have an answer to that. I’m just sayin’ “beginnings” are way overrated.

The real question, seems to me, is when does an individual begin? Is there a clear, bright moment at which we can all agree, “yep, that’s Fred,” and be done with it?

Some argue that the product of pregnancy is a unique individual from conception because its DNA is different from its mother’s. But if unique DNA combinations are what make a unique individual, you’d have to conclude that the twins from the third paragraph are the same person, divided. And if we give you a transplanted heart, lung, and kidney, each with unique DNA combinations from their respective donors, does that make you four different people?

I don’t think science can help us with this one, people. Indeed, if you step back and look at human civilization throughout space and time, you might notice that “person” is a social construct that has been constructed in very different ways by different societies. At various times only men, or only people of a certain skin color, or only people from our tribe, or only people of a particular caste or class, were considered “persons.” We may think we have reached maximum enlightenment by considering all human beings “persons” (assuming we all do, which I question), but it’s possible our distant descendants will expand “person” to include, say, other primates, whales, dolphins, and border collies. You never know.

The argument made by many opponents to legal abortion is that the product of pregnancy is human life, and human life is sacred; therefore, it must be protected. There’s no question that a living human embryo is both alive and human, but when you call it “sacred” you’re throwing a religious concept into the mix. And the great religions of the world do not at all agree on the question of when (or even whether) “human life” becomes “sacred.” Some say at conception, some say at “quickening,” some say at viability, some say at birth. And some will tell you that everything and nothing are equally sacred, so stop asking stupid questions.

The reason we’re even having this discussion is to settle the question of abortion as a matter of law. But as a legal matter, the question of when humans are allowed to take the lives of other humans rarely has absolutist answers. Some kind of regulation about who can kill whom is necessary for civilization, since we can’t very comfortably live together in communities without some assurance our neighbors won’t throttle us in our sleep. But there are always loopholes. Through history, in many societies (even Christian ones), a noble could kill a peasant or slave without penalty. Today governments can order wars or impose a death penalty, and legally that’s not murder.

I tend to get impatient with people who argue that laws are based on morality, and abortion is immoral, therefore it ought to be illegal. As I said in the last paragraph, there are some laws essential to human civilization. These laws regulate who can kill whom and who can own what. They make commerce possible by imposing penalties for fraud. They make complex human enterprises possible by enforcing contracts. Exactly how law has regulated these matters has changed considerably over time; the important point is that, within a given society, there are basic rules everyone is supposed to agree to so that society can function.

The realm of morality, however, is separate from the realm of legality. There are all manner of things that we might consider immoral that are not, in fact, illegal; adultery is a good example. Such acts may have harmful personal consequences, but regulating them isn’t necessary to civilization. And I don’t see what’s immoral about, say, misjudging how many coins you should put in the parking meter. That’s why I tend to see the legal versus moral question on a Venn diagram. The diagram here isn’t entirely accurate since the blue area should be bigger — law and morality intersect more often than they don’t. I’m just saying that answering the moral question of abortion (assuming we ever will) does not tell us whether an act should be legal or not. In fact, since abortion is legal (with varying restrictions) in most democratic nations today with no discernible damage to civilization itself, I’d say the abortion question falls outside the blue area of the diagram.

On the question of morality I disagree a lot with Ezra Klein when he says “confused polling on abortion is evidence that Americans have confused views on abortion.” I think people are not so much confused as limited. Our conceptions of life or humanity or individuality or the self are to a large extent conditioned into us by our culture. It’s very hard to step outside of our conditioning and take a broader view. We’re all blind men feeling an elephant — our ideas about what an elephant is depend on what particular part we happen to be feeling (an elephant is like a tree trunk? a wall? a fan?). Following this metaphor, there are all manner of people in America today who do not feel confused at all about that elephant. They’ve got hold of its trunk, and they are certain it’s just like a snake. End of argument.

If anything, most people aren’t confused enough.

Our notions of where a fetus fits on the morality scale depend very much on the angle from which we view the question. A fetus is human. But humans are sentient, and a fetus is (so science tells us) insentient. A fetus is like a parasite, or a lower life form. A fetus is God. A fetus is a baby. A fetus is not a baby. A fetus is a potential baby. A fetus is sacred. Nothing is sacred. Everything is sacred.

How about, All of the above?

In case you’re wondering, from a Buddhist perspective it might be argued that since a “person” is an aggregate of the five skandhas (form, sensation, perception, discrimination, consciousness) and an embryo or fetus has only form, it’s not a person. On the other hand, Buddhism teaches that each of us is all of us, throughout space and time. The cells of whatever is conceived contain all life forms, from the beginningless beginning to the endless end, perfect and complete. Interfering with life’s attempts to express itself is a serious matter.

So where does that leave us? It leaves us with individuals who have to make hard choices. Struggling with hard choices is a distinctively human activity. I think it’s something we need to do to be fully human. It helps us wake up. The decisions we make may be less important than the fact that we can make decisions.

I have written in the past (such as here) why I think abortion should be legal, at least until the fetus is viable. My opinion is based mostly on the effects of abortion law in the lives of women. You might notice I don’t spin my wheels much over the question of morality, since I’ve come to see that morality depends on the state of mind in which one acts as much as the act itself. People do “good” things for selfish reasons, and “bad” things for altruistic reasons. Judge not, lest ye be judged.

So, I say, ambiguity is good for you; don’t be afraid of it. Go forth and be human and work it out for yourselves.

[Note: The title of the post refers to the first koan of The Mumonkan. If it doesn’t make any sense to you, that’s OK.]

It’s Us, Too

Roger Ailes’s objections notwithstanding — I’ll come back to them in a minute — David Ignatius’s column in today’s Washington Post comes close to saying the same thing I said in the “Patriotism v. Paranoia” post below. I wrote,

In the past century or so our species, worldwide, has undergone some seismic social shifts. People no longer remain neatly sorted by skin color, language, and cultural history. All over the globe people of diverse ethnic and social backgrounds are having to learn to live together. Once upon a time “foreign” places were far, far away. But air travel has brought them closer in terms of travel time; now every foreign place on the globe is just over the horizon. Soon foreigners will be sitting in our laps.

I think nationalism arose and became dominant in the 20th century largely because of these seismic social shifts. People who can’t handle the shifts retreat into nationalism as a defense.

Ignatius describes what he calls the “connectedness to conflict” paradox, which says that the more “connected” people become, the more conflicts seem to arise.

… as elites around the world become more connected with the global economy, they become more disconnected from their own cultures and political systems. The local elites “lose touch with what’s going on around them,” opening up a vacuum that is filled by religious parties and sectarian groups, Sidawi contends. The modernizers think they are plugging their nations into the global economy, but what’s also happening is that they are unplugging themselves politically at home.

In his column Ignatius quotes Francis Fukuyama and a couple of other over-educated ivory-tower types as they try to figure out why it is that the Middle East is in such turmoil because of its contacts with the West. And that’s the problem; these guys are all westerners trying to figure out what’s wrong with Middle Easterners and not noticing that a variation of the same thing is going on right here in the good ol’ U.S. of A., not to mention Europe and other western-type spots.

If by “elites” you substitute “people who aren’t afraid of other cultures and of social and cultural change” I think you get a clearer picture. I don’t think the not-afraid people are necessarily “elites.” Some of the most retrenched nationalists are wealthy, well-educated and well-connected. What they’re not, is modern.

And in a kind of double-paradox, many people who are working hard at “plugging their nations into the global economy” are some of the same people who exploit local nationalistic and xenophobic feelings to stay in political power. Think Republican Party.

Ignatius, Fukuyama, et al. scratch their heads over democracy and alienation, and over “elites” becoming “disconnected” from their own cultures, and write up a lot of verbose papers expressing highfalutin’ theories. Look, guys, this isn’t difficult. People are afraid of change. They are especially afraid of change that seems to threaten their autonomy and self-identity. And if they think this change is being imported by odd-colored people with exotic accents, don’t expect ’em to roll out the welcome wagon.

This rebellion against change, this retreat into nationalism, is happening all over the globe. It’s happening in Europe, big-time. It’s in the Middle East. And it’s happening here, too, although we’re a bit more subdued about it. So far. But as I noted here, the ongoing Muslim cartoon crisis, for example, amounted to Middle Eastern anti-modernists and western right-wingers whipping each other into a mutual hate frenzy. Granted the western wingnuts haven’t resorted to riots and destruction; they’ve been content with escalating hate speech. But the distinction is merely one of degree, not of kind.

Here’s where Roger Ailes comes in — in the remainder of his column, Ignatius postulates that all these people around the world are going berserk because they have the internets. Ignatius writes,

McLean argues that the Internet is a “rage enabler.” By providing instant, persistent, real-time stimuli, the new technology takes anger to a higher level. “Rage needs to be fed or stimulated continually to build or maintain it,” he explains. The Internet provides that instantaneous, persistent poke in the eye. What’s more, it provides an environment in which enraged people can gather at cause-centered Web sites and make themselves even angrier. The technology, McLean notes, “eliminates the opportunity for filtering or rage-dissipating communications to intrude.” I think McLean is right. And you don’t have to travel to Cairo to see how the Internet fuels rage and poisons reasoned debate. Just take a tour of the American blogosphere.

The connected world is inescapable, like the global economy itself. But if we can begin to understand how it undermines political stability — how it can separate elites from masses, and how it can enhance rage rather than reason — then perhaps we will have a better chance of restabilizing a very disorderly world.

Oh, good. Just cut ’em off from the Web and the natives won’t be so restless. Roger Ailes writes,

Oh, for the good old days — pre-1990s — a time when our sectarian wars and riots and lynchings and genocides were civilized affairs, based on pure, sweet reason. Oh, paradise lost!

I’d like to apologize personally to David Ignatius and Tom Friedman and Francis Fukuyama and Thomas P.M. Barnett and, most of all, to Charles M. McLean, who runs a trend-analysis company called Denver Research Group Inc., for coarsening the discourse. It was wrong of me to think that my opinions might be worth consideration even though I knew I didn’t have a book contract. Clearly, it was my rage that blinded me to the fact that I was poisoning reasoned debate and undermining political stability and separating elites from masses.

And I was such a nice fellow before October 2002; really, I was.

Let the healing begin.

Snark.