People With One Watch, Part III

    I celebrate myself;
    And what I assume you shall assume;
    For every atom belonging to me, as good belongs to you. (Walt Whitman, Song of Myself)

(Note: This is a continuation of the previous two posts; if you haven’t read those, much of this won’t make sense.)

Embryonic stem cell research, from two perspectives. Elizabeth C. writes,

Regarding stem cell research, destroying human life at any time prior to its ability to sustain itself is murder. To the thinking mind, the term “harvesting” is descriptive enough to prevent legalization. We are messing with life itself, believing ourselves so scientifically advanced that we can get away with it. It’s just a matter of time before the legalized slaughter of the lambs via abortion finds us unprepared for the ultimate results: A world deprived of what would have been, had life been allowed. …

… My beloved grandchildren are proof enough for me that lives lost via abortion and stem cell research would have been lives loved, had their biological parents not made the easiest choice in today’s McDonald’s society, here today, gone tomorrow, whatever the reason.

The other perspective: Laurie Strongin writes in the Washington Post about the death of her son, Henry, who was born with Fanconi’s anemia. “Our only hope lay on the frontiers of science, in human embryo and stem cell research,” she writes. She found a doctor named Mark Hughes, then chief of reproductive and prenatal genetics at the National Institutes of Health, who had pioneered preimplantation genetic diagnosis (PGD), a procedure that, paired with a cord-blood stem cell transplant, he thought could save Henry.

But on Jan. 9, 1997, an article in The Washington Post reported that Hughes was violating a two-year-old federal ban on human embryo research with his work on PGD.

Under the ban, Hughes was barred from performing that work as part of his position at NIH. Refusing to abandon his research or the families who were depending on it, he set up a lab as part of an in vitro fertility program at a private hospital across the street in Bethesda. But he was considered in violation of the federal law because his work at the hospital employed NIH research fellows and used NIH equipment — a refrigerator.

Over the following weeks, the daily headlines all read the same to me: Henry is going to die. As our doctor was forced to resign from his job and faced congressional hearings, Henry’s blood counts declined. We searched for alternatives to PGD, but none existed. The politically triggered delay had stolen precious time in our race to save Henry’s life. On Dec. 11, 2002, he died in my arms.

The procedure that Henry was denied because of a refrigerator was the same one used to save the life of Molly Nash, who also was born with Fanconi’s anemia. Today Molly is eleven years old and free from disease.

The odds that any particular blastocyst, once frozen, will ever become a baby are, well, long. It’s likely most will never be thawed. Many of those that are thawed will not survive the process. And of the select few that survive thawing and are implanted in a uterus, only some will result in a pregnancy. Yet by some twisted moral algebra, these blastocysts are considered more precious (to some people, like Elizabeth C.) than a child like Henry.

Recent news stories say that about 400,000 surplus frozen embryos are in storage in America. But according to this article in the current issue of Mother Jones, the number 400,000 represents the embryos stored in 2002. Four years later, there is every reason to believe the actual number is higher — close to half a million — and growing rapidly.

In other words, during the time it took for about 110 “snowflake children” to be born, another 200,000 blastocysts went into storage.

The Fetus People have persuaded themselves that since only a small percentage of stored embryos have been designated by their “parents” to be made available for research, the remainder are just sitting around waiting for Mommy and Daddy to thaw them out and pop them in the oven. This is, of course, nonsense. As the Mother Jones article linked above makes clear, the in vitro process requires creating a surplus of blastocysts to achieve one pregnancy. But once treatment is over many parents struggle with the choice of storing, donating, or destroying the leftovers. Many couples choose to store the blastocysts even though they have no intention of using them —

[A] woman described her embryos as a psychic insurance policy, providing “intangible solace” against the fundamental parental terror that an existing child might die. “What if [my daughter] got leukemia?” said yet another, who considered her frozen embryos a potential source of treatment. A patient put the same notion more bluntly: “You have the idea that in a warehouse somewhere there’s a replacement part should yours get lost, or there is something wrong with them.”

For others, embryos carried a price tag that made them seem like a consumer good; a few parents considered destroying them to be a “waste” of all the money spent on treatment.

Michael Kinsley, who supports stem cell research, writes that “if embryos are human beings with full human rights, fertility clinics are death camps.”

In any particular case, fertility clinics try to produce more embryos than they intend to implant. Then — like the Yale admissions office (only more accurately) — they pick and choose among the candidates, looking for qualities that make for a better human being. If you don’t get into Yale, you have the choice of attending a different college. If the fertility clinic rejects you, you get flushed away — or maybe frozen until the day you can be discarded without controversy.

And fate isn’t much kinder to the embryos that make this first cut. Usually several of them are implanted in the hope that one will survive. Or, to put it another way, in the hope that all but one will not survive. And fertility doctors do their ruthless best to make these hopes come true.

Kinsley argues that if one genuinely believes that destroying a blastocyst to extract stem cells is murder, then logically one must also be opposed to in vitro fertilization. The routine practices of fertility clinics destroy far more blastocysts than would ever likely be destroyed for stem cell research. “And yet, no one objects, or objects very loudly,” Kinsley says. “President Bush actually praised the work of fertility clinics in his first speech announcing restrictions on stem cells.”

The fact is, opponents of stem cell research routinely lie — to themselves, to each other, to anyone who will listen — in order to defend their belief that embryonic stem cell research is immoral. This suggests to me that the real reasons people object to stem cell research have less to do with moral principle than with some deeply submerged but potent fear. And this takes us back to elective ignorance. Something about flushing all those blastocysts makes the Fetus People uncomfortable in a way that condemning Henry Strongin to death does not. The arguments they make against stem cell research, which are mostly a pile of lies and distortions, are not the reasons they are opposed to stem cell research. They are the rationalizations created to justify their opposition.

Exactly what it is that frightens the Fetus People so is beyond the scope of a blog post. I hope the social psychologists will get out their chi squares and p values and get to work on finding the answer. But I hypothesize that many of them have years of ego investment in anti-abortion propaganda, to the point that they’re chanting “life begins at conception” in their sleep. If they give so much as a millimeter of ground on the “conception” issue, their entire worldview, which includes their self-identity, will crumble apart. Hence, they are less concerned with saving Henry Strongin than with saving blastocysts. Hence, elective ignorance.

I’ve explained my views on “when life begins” before. Many on the Right are absolutely certain that “conception” is the only possible answer, but in fact there are a multitude of different answers that can be arrived at both scientifically and philosophically. As this essay explains nicely, across time and cultures there have been many different opinions as to when life “begins.” Even the Catholic Church has changed its papal mind several times in its history.

The Fetus People argue that since a human blastocyst is human, and alive, it must be human life and therefore entitled to all the rights and privileges and protections the law allows. Others of us think claiming a blastocyst is equal in value to, say, Nelson Mandela is self-evidently absurd; DNA does not equal personhood. And “human life” doesn’t explain why blastocysts are protected with more ferocity than Henry Strongin. We might giggle at Senator Brownback’s Amazing Talking Embryos, but in truth we’re allowing medical and scientific policies to be set by people with simplistic, childish, even primitive ideas about medicine and science. Not funny.

As I explained in Part I of this little trilogy, we’re all conditioned from birth to understand ourselves and the world around us in a certain way. Ultimately our understanding of blastocysts and Henrys and their relative value is based on how we understand some pretty basic stuff, like selfness and beingness, life and death, us and other. Those who insist that life “begins” at conception have a very rigid and narrow understanding of these matters.

I’m going to attempt to explain my understanding as best I can, just as an example. I don’t expect anyone to agree with me, which is not a problem as far as I’m concerned.

As I’ve explained elsewhere, it seems to me that life doesn’t “begin” at all. However it got to this planet four billion years ago, it hasn’t been observed to “begin” since. Instead, life expresses itself in myriad forms. And whatever it is you are is a result of a process stretching back those four billion years. Calling any point a “beginning” seems arbitrary to me.

What is the self? If you’ve ever done time in a Zen monastery, that’s the question the Roshi brings up, over and over again. Kensho might be defined as a paradigm shift of self-ness; a realization that you are not what you thought you were. The realized self is not something that can be explained, but a basic (if crude) analogy is that an individual “self” is a phenomenon of life, as a wave is a phenomenon of ocean. When a wave begins nothing is added to the ocean, and when a wave ceases nothing is taken away from the ocean. Although a wave is a distinct phenomenon, it is also ocean. A person is a distinct being, yet at the same time a person is the great ocean of Being. At birth, nothing is gained; at death, nothing is lost.

So, while I am I and you are you, at the same time I am you and you are me, whether we like it or not.

As I said, this is very crude, and if you ever get interested in Buddhism don’t attach to it. Concepts are always short of reality. But if you understand yourself this way, then you understand all individuals and organisms throughout space and time as a great interconnected process. And you and me and all the blastocysts in the IVF clinics and all the suffering people waiting for the cures that stem cell research promises are all One. In a sense, every atom belonging to one of us as good belongs to everybody.

The “life begins at conception” model, on the other hand, assumes that at conception the individual is broken off from all the rest of Creation and hence is alone in the universe. Seems cold, I say. The Buddha taught that understanding yourself this way leads to grasping and greed, which is the source of all suffering. (See The Four Noble Truths.) Thus the Fetus People are making us all miserable with their campaigns to Save Every Blastocyst while keeping people who have already dissipated back into the Ocean of Being hooked up to life support. At the base of this is (I postulate) their own existential fear.

That’s my take, which you are free to dismiss; I don’t insist everyone share my worldview. But I argue that there is nothing moral about saving surplus blastocysts from being used in medical research, just as there is nothing principled about lying to yourself and others to justify your opinions. Indeed, from a Buddhist perspective it is deeply immoral to keep hundreds of thousands of blastocysts in cold storage — where they are not expressing life — when they could be used to alleviate suffering and express life through other individuals.

If you respect life, you don’t waste it.

    What do you think has become of the young and old men?
    And what do you think has become of the women and children?
    They are alive and well somewhere;
    The smallest sprout shows there is really no death;
    And if ever there was, it led forward life, and does not wait at the end to arrest it,
    And ceas’d the moment life appear’d. (Walt Whitman)

People With One Watch, Part II

This is a continuation of the previous post. I want to look at elective ignorance and the stem cell controversy. However, there are some basic points I want to clarify in this post before I go on to the main point.

“Embryonic” stem cells are derived from the cells that make up the inner cell mass of a blastocyst. Although sometimes a blastocyst is described as an embryo in a very early stage, in fact it is a conceptus in a pre-embryonic state. In humans, the fertilized egg (zygote) divides as it travels down the fallopian tube, becoming a blastocyst at about five days, usually around the time it reaches the uterus, where it implants itself. A pregnancy begins with the implantation of the blastocyst, which then develops into an embryo.

Embryonic stem cells are controversial because acquiring new cells requires destroying a blastocyst. Research scientists want to use the excess blastocysts stored at in vitro fertilization clinics that are going to be destroyed anyway, so there is no need to fertilize an egg for the purpose of obtaining stem cells. Those who object say the blastocyst is a human life, so destroying it is murder. Eventually embryonic stem cells for research may be obtained by cloning, which of course is controversial also.

For the most part, blogosphere opinion on embryonic stem-cell research splits across right-left lines, with the occasional exception. The Right is certain that conducting embryonic stem cell research is immoral. The Left is certain that not conducting embryonic stem cell research is wasteful, and I would call it immoral. To the Left, the Right’s arguments are silly. To the Right, the Left’s arguments are sinister.

As many on the Right point out, the President’s recent veto of the stem cell bill does not result in a ban on embryonic stem cell research, but maintains a ban on federal funding for research. Research with other funding can still be conducted. There is even an exception — embryonic stem cell lines that already existed before August 2001 can be used in federally funded research. And federal funding is available for research on adult stem cells. The Right believes federal policy is an acceptable compromise.

The Left points out that the stem cell lines available for federally funded research have been contaminated with mouse cells, which limits their use. The Left argues also that the ban on federal funds is close to a de facto ban. As this PBS Nova report points out,

Most basic biomedical science in this country—the early, exploratory research—is funded by federal dollars, with the National Institutes of Health taking the lead (to the tune of $20 billion in research-related funding a year). Scientists say that no field of research can flourish without access to this kind of government support. Yet the Harvard scientists you’ll meet in our NOVA scienceNOW segment are barred from using federal funds for the research we describe. If they already head government-funded labs, none of the equipment they’ve purchased can be used to create brand new human embryonic stem cells, to work with any such cells created after 2001, or to create cloned human embryos for stem cell research. That means not a microscope, not a petri dish, not one glass beaker. Scientist Doug Melton, who receives private funds from the Howard Hughes Medical Institute, has gone so far as to equip an entirely separate lab, at an undisclosed location, for this work.

From the American Society of Hematology:

With fewer opportunities for federal funding in human embryonic stem cell research, private sector and state efforts are gaining prominence, outside of the federal government’s oversight, control, and peer review mechanisms. Furthermore, several foreign countries are encouraging and/or actively investing in stem cell research, thereby posing the potential threat of loss of American scientific prominence in this emerging field, possible emigration of the best and brightest American scientists, and definite diminution in the number of talented foreign graduate students, postdoctoral fellows, and senior scientists who otherwise would come to the US for their training and to conduct research in this important area of scientific inquiry.

Adult stem cells can be taken from many parts of a human body, but most come from bone marrow. Contrary to claims from the Right, adult stem cells are not a substitute for embryonic stem cells. Both types of stem cells hold therapeutic promise, but not the same promise. Adult and embryonic stem cells have different properties and different potentials. According to the International Society for Stem Cell Research, embryonic stem cells are pluripotent, meaning they can develop into many types of cells. Adult stem cells have so far not been found to have this property. On the other hand, adult stem cells have been used successfully to treat blood disorders, and there is ongoing research into their use in treating breast cancer, coronary artery disease, and other conditions. Adult stem cell research is also important, but claims that adult stem cells have the same potential as embryonic stem cells are simply not true, based on the research so far.

Another difference is that, once established in culture, large numbers of stem cells from embryos can be grown for a long time — indefinitely, under the right circumstances — and these cells will retain their unique properties. This is not true of adult stem cells. Stem cells can also be obtained from umbilical cord blood and from the pulp of baby teeth, and these cells may survive culturing longer than adult cells, but they have not yet been found to have the same pluripotent quality as embryonic cells.

Some on the Right claim there is no evidence whatsoever that embryonic stem cells hold any therapeutic promise. This claim is based on cherry-picked “facts” — see, for example, this web page featuring some quotes pulled from newspaper articles — e.g., “Not a single embryonic stem cell has ever been tested in a human being, for any disease”; “No one in human embryonic-stem cells will tell you that therapies are around the corner”; etc. In fact, human embryonic stem cells have been successfully turned into insulin-producing cells, blood cells and nerve cells. As reported by Maggie Fox of Reuters (July 16):

“They hold promise in different areas today,” said David Meyer, co-director of the Cedars-Sinai International Stem Cell Research Institute, which is set to formally open in Los Angeles on Monday.

“Adult stem cells will lead to cures much sooner than embryonic. However, the potential for embryonic, once we understand the biology, will be the greater,” Meyer said in a telephone interview.

Groups such as the Juvenile Diabetes Research Foundation and the American Association for Cancer Research say work with embryonic stem cells is vital to understanding how to regenerate diseased or damaged cells, tissues and organs.

For example:

— On July 3, a team at the University of California at Los Angeles reported they had transformed human embryonic stem cells into immune cells known as T-cells — offering a way to restore immune systems ravaged by AIDS and other diseases.

— In June, a team at Johns Hopkins University in Baltimore transplanted stem cells from mouse embryos into paralyzed rats and helped them walk again. Researchers at the University of California at Irvine have done similar work using human embryonic stem cells in rats.

Weirdly, people opposed to embryonic stem cell research on moral grounds often seem compelled to lie about the research:

David Prentice of the Family Research Council, which opposes embryonic stem-cell research, issued a statement saying adult stem-cell research was actively helping, or close to helping, people with at least 65 diseases.

But in Friday’s issue of the journal Science, three stem-cell experts — Steven Teitelbaum of Washington University in St. Louis, Shane Smith of the Children’s Neurobiological Solutions Foundation in Santa Barbara, California and William Neaves of the Stowers Institute for Medical Research in Kansas City — wrote a detailed rebuttal of these claims and said at best Prentice accurately portrayed only nine of the studies.

What does it say when people lie to defend an opinion on morality?

Right now preclinical work with embryonic stem cells is moving slowly through animal testing, and there are some obstacles to overcome before human trials begin. It is true that one human trial, in which fetal cells were transplanted into Parkinson’s disease patients, was stopped in 2001 when about 15 percent of the patients developed side effects that were worse than the Parkinson’s. This illustrates the need for caution. However, preclinical research on embryonic stem cells and Parkinson’s disease continues and is showing some promise.

Some argue that because research on embryonic stem cells has yet to result in treatment for human disease, the research is worthless. They ignore the fact that embryonic stem cells were first isolated in 1998. Research on embryonic stem cells is still at an early stage. Studies on adult stem cells, on the other hand, began in the 1960s.

Most medical breakthroughs take a long time to develop. Researchers began trying to develop a polio vaccine in the early 1900s. Jonas Salk began his research in 1947. Human clinical trials began in 1954. The safer Sabin vaccine became available in 1963. It took this much time just to develop a safe and effective vaccine, which is something that had been done successfully before. Developing any new therapy takes time and is terribly expensive — “discovering, testing, and manufacturing one new drug can take between 10 and 15 years and cost nearly a billion dollars.” Stem cell therapy is a far more complex project than developing a new drug.

The naysayers are, essentially, arguing that because the research hasn’t yet developed therapies for human use it never will, even though the enormous majority of scientists believe otherwise.

Here’s one interesting story, however:

The story of Molly Nash illustrates how stem cell tools and therapies can work together to save lives. The Colorado child was born with Fanconi’s anemia, a genetic blood disease with an especially poor prognosis. Patients rarely reach adulthood; most die of leukemia. A bone marrow transplant from a healthy sibling with a matched HLA or immune profile can cure the disease, but Molly was an only child and her parents — both carriers of the deadly gene — were fearful of having another child with the disease. They used in vitro fertilization, preimplantation genetic diagnosis (PGD), and a cord blood transplant in an attempt to save their child. PGD was used to screen 24 embryos made in the laboratory. One embryo was disease-free and matched Molly’s immune profile. The blastocyst was implanted and nine months later her sibling, named Adam, was born. The stem cells from Adam’s umbilical cord were given to Molly and today she is eleven years old and free from disease.

You can argue about the ethics of having a baby to obtain an umbilical cord to save a child if you like, but this case illustrates that the potential for stem cell therapies is very real.

There is a lot of confusion over stem cell cloning. When stem cell researchers talk about cloning, they mean therapeutic cloning, not reproductive cloning. In therapeutic cloning the cloned cells do not develop into an embryo but instead are used only to develop stem cells.

One other point — a South Korean scientist recently admitted to faking research on stem cell cloning. After this became news, a few on the Right came to believe that all research on embryonic stem cells throughout space and time was, therefore, faked. This is nonsense.

Of course, if obtaining the stem cells in question didn’t involve destroying blastocysts, there’d be no controversy. Which takes us back to the leftover blastocyst question.

Opponents of embryonic stem cell research argue that most embryos could be implanted in a uterus someday, which is absurd when you consider the cost and time and the fact that the number of blastocysts in storage is growing rapidly. Liza Mundy writes in the current issue of Mother Jones:

In 2002, the Society for Assisted Reproductive Technology—the research arm for U.S. fertility doctors—decided to find out how many unused embryos had accumulated in the nation’s 430 fertility clinics. The RAND consulting group, hired to do a head count, concluded that 400,000 frozen embryos existed—a staggering number, twice as large as previous estimates. Given that hundreds of thousands of IVF treatment rounds have since been performed, it seems fair to estimate that by now the number of embryos in limbo in the United States alone is closer to half a million.

This embryo glut is forcing many people to reconsider whatever they thought they thought about issues such as life and death and choice and reproductive freedom. It’s a dilemma that has been quietly building: The first American IVF baby was born in 1981, less than a decade after Roe v. Wade was decided. Thanks in part to Roe, fertility medicine in this country developed in an atmosphere of considerable reproductive freedom (read: very little government oversight), meaning, among other things, that responsibility for embryo disposition rests squarely with patients. The number of IVF rounds, or “cycles,” has grown to the point that in 2003 about 123,000 cycles were performed, to help some of the estimated 1 in 7 American couples who have difficulty conceiving naturally. Early on, it proved relatively easy to freeze a lab-created human embryo—which unlike, say, hamburger meat, can be frozen, and thawed, and refrozen, and thawed, and then used. (To be precise, the technical term is “pre-embryo,” or “conceptus”; a fertilized egg is not considered an embryo until about two weeks of development, and IVF embryos are frozen well before this point.) Over time—as fertility drugs have gotten more powerful and lab procedures more efficient—it has become possible to coax more and more embryos into being during the average cycle. Moreover, as doctors transfer fewer embryos back into patients, in an effort to reduce multiple births, more of the embryos made are subsequently frozen.

And so, far from going away, the accumulation of human embryos is likely to grow, and grow, and grow.

The cold truth is that blastocysts generated in IVF clinics and not implanted into a uterus are often discarded immediately. Most of the blastocysts that are frozen will either degrade or be discarded eventually. A small number of available blastocysts have been implanted into adopting mothers, creating the “snowflake babies” — 110 have been born so far. A large portion of the blastocysts that are thawed and implanted will fail to result in a baby, however.

It should be obvious to anyone thinking clearly that “embryo adoptions” are not going to be the solution to the growing glut of frozen blastocysts. And if destroying a blastocyst is immoral, why is it more immoral to use it for potentially life-saving medical research than it is to send it straight to an incinerator? This makes no sense to me.

Now I can finally write about what I wanted to write about to begin with, which is looking at the “moral” issue of embryonic stem cells from many perspectives. I’m going to argue tomorrow that there is nothing at all immoral about embryonic stem cell research, but it is deeply immoral to deny medical researchers the use of surplus blastocysts.

Also: Alternet, “Stem Cell Research Could Make Miracles Happen”; Bob Geiger, “Right wing should adopt 400,000 frozen embryos.”

People With One Watch, Part I

One of my favorite sayings is “A man with one watch knows what time it is; a man with two watches is never sure.”

The point — other than that no two wristwatches in your possession ever tell exactly the same time — is that the more knowledge you have of an issue, the more likely you are to see more than one side of it. But over the years I’ve run into an astonishing (to me, anyway) number of people who interpret the saying to mean that it’s better to have just one watch.

When people have limited perspectives because of limited knowledge, you might assume that giving people more knowledge would give them broader perspectives. But then there’s the phenomenon of elective ignorance. People practicing elective ignorance start with a point of view and then admit into evidence only those facts that support their point of view. Those with a really bad case of elective ignorance become incapable of acknowledging facts that contradict their opinions. You can present data to them all day long, and it won’t make a dent; “bad” facts are shoved off the edge of consciousness before they get a chance to complicate the E.I. sufferer’s worldview.

Please note that elective ignorance is not necessarily connected to an individual’s intelligence potential. A person can possess sufficient cerebral material to store and comprehend considerable knowledge but elect not to use it. High-I.Q. people with E.I. Syndrome will sometimes concoct elaborate and fantastical rationalizations to explain why some facts are “bad” and others are “good.” These rationalizations will make sense only to those who have elected the same worldview, of course, which leads us to the Dittohead Corollary — People whose opinions are shaped by E.I. pathologies cannot grasp why other people don’t understand issues as “clearly” as they do. Therefore, they assume something sinister stands between those other people and the elected reality; e.g., “liberals hate America.”

Ideologies can be understood as a form of codified elective ignorance, or a strategy to make the world easier to understand by limiting one’s cognitive choices. This is not necessarily a bad thing. Since we all have finite cognitive resources, adopting an ideology is one way to obtain a workable understanding of issues without devoting the time and brain work required to become an expert. As long as a person appreciates that his understanding and knowledge are incomplete, and he remains capable of changing his view as he learns more, this doesn’t qualify as Elective Ignorance Syndrome. Further, it can be useful for people within a society to adopt similar worldviews. That way they can reach consensus on social issues without perpetually re-inventing the perspective wheel, so to speak.

We’re all conditioned from birth to understand ourselves and the world around us in a certain way. By the time we’re adults, we all live in a conceptual box — a complex of paradigms — made up of who we think we are and how we think our lives and the world are supposed to be. The way we understand most things may seem “self-evident” but is nearly always a matter of conditioning. Social psychologists say that what most of us call “reality” is a social construct, meaning that people who grow up in the same culture tend to live in very similar conceptual boxes. Put another way, living in the same culture predisposes people to develop similar paradigms.

People who grow up in different cultures live in different conceptual boxes, however, which is why “foreign” people and cultures often don’t make sense to us, and why we don’t make sense to them. “Open minded” people are those who have at least a vague notion that diverse social constructs of reality are possible and are not necessarily bad. “Closed minded” people, on the other hand, cannot fathom that any other social construct of reality than the one they possess is possible. These people find foreign cultures sinister and frightening; see the Dittohead Corollary, above.

People with extreme E.I. Syndrome feel threatened by anything “different,” however, even when that “different” is the next-door neighbor with opposing political views. It’s vital to understand that E.I. people perceive threats to their worldview as threats to themselves, because their self-identities are integrated into their worldview. In other words, the conceptual box they live in is who they are. Any challenge to the integrity of the box must be fought by any means necessary.

That’s why you can’t win a pissing contest with a wingnut, for example. Oh, you can absolutely crush their every argument with facts and logic, but that won’t matter; they won’t back down. If you continue to try to “win” they’ll fall back on all manner of logical fallacies, rote talking points, circular reasoning, and sheer nastiness, until you finally decide the argument is eating too much of your time and energy and walk away. Then they declare victory — not because they’ve proven themselves to be correct, but because they’ve turned away a challenge to the box. Put another way, while you’re presenting data and explaining concepts, they’re guarding their cave. That’s why I don’t even bother to argue with wingnuts any more. It’s as futile as explaining rocket science to hyenas, and possibly as dangerous.

I should add that E.I. Syndrome can be found on the extreme leftie fringe as well — International A.N.S.W.E.R. comes to mind. And E.I. Syndrome explains why extremist political ideologies, either Left or Right, lead to totalitarianism. But at the moment the leftie fringe in America is so marginalized and powerless it’s easy to ignore. The Right, on the other hand, has to be dealt with, like it or not.

I bring this up because, IMO, most of our political conflicts — both international and intra-national — are being stirred up by people with one watch. From here I could launch into a discussion of just about anything in the news — the Middle East is an obvious choice — but what got me going today was the stem cell research ban. President Bush’s “boys and girls” comment from yesterday was an expression of paradigm. And (I’m sure you realize) Fetus People are flaming One Watch types. I want to elaborate on this, but as I’ve gone on for a while already I’ll bump the elaboration to another post.

Identity Crises

I spent part of a day in London last summer, about six weeks after the 7/7 subway bombings. I was fresh back from an adventure safari into deepest Wales and was too tired to do much more in London than ride around in one of those double-decker tourist buses. But at least I looked at London, which provided an interesting contrast to New York City six weeks after the 9/11 attacks.

By late October 2001 New York City had begun to dismantle the thousands of shrines that sprang up after 9/11 and spread like kudzu over the sidewalks, lampposts, and scaffolding. Six weeks on, some shrines were entirely gone; others merely trimmed back. But in October 2001 the city was blooming with American flags. Rockefeller Center was an ocean of flags, and if you looked up and down Fifth Avenue you could see more flags than you could count, flapping away into infinite distance. It was quite a spectacle.

In London, the only visible remembrance of 7/7 that I saw was the lonely little sign in the photo at the top of this post. The only other clue that London had recently endured anything out of the ordinary was the tour guide’s cheerful announcement that the bus would not be stopping at Buckingham Palace for security reasons. I didn’t go to the subway stations associated with the 7/7 bombings (I considered it, but thinking of how tourists gawking at Ground Zero made me feel queasy, I decided — out of respect to London — to stay away). I assume there were flowers and signs and visible expressions of grief around those stations. But if so the shrines were confined to those stations and not drizzled liberally all over the bleeping city, as they were in New York.

Neither had London turned into a flag festival. In fact, I barely saw a Union Jack the entire time I was in Britain. Of course, the English flag is not the Union Jack, because that is the British flag. The English flag is the Cross of St George, which I understand is waved enthusiastically by soccer fans wherever an English team is playing. Perhaps the English are less given to flying national flags because they’re ambivalent about which national flag to fly.

By contrast, the Welsh fly their dragon flag from anything that will hold still for it. When you enter Wales by car you are greeted by a big, proud, dragon-festooned sign that says Croeso i Gymru — Welcome to Wales. They want you to know you ain’t in England any more, boyo. But when you drive into England from Wales you get no clue at all that you’ve crossed a border, except that suddenly all the road signs are entirely in English.

The English also seem a bit ambivalent about national anthems. “God Save the Queen” is the anthem of the entire United Kingdom, and since other parts of the UK have their own anthems, there is some controversy about whether “GStQ” is the proper anthem for English soccer matches. And if it isn’t, what is? I understand some English rugby teams have adopted “Land of Hope and Glory” as the English anthem, while others prefer “Swing Low, Sweet Chariot,” for some unfathomable reason. But soccer teams haven’t made up their minds. There is no official Scottish anthem, but the Scots unofficially have adopted “Flower of Scotland,” or sometimes “Scotland the Brave.” On the island of Britain only the Welsh are not at all confused about anthems; theirs is “Hen Wlad Fy Nhadau,” diolch yn fawr iawn (thank you very much).

The difference in reaction to terrorist attack, New York v. London, might be explained by the larger magnitude of the New York attacks. Further, attacks on London from foreign enemies have not passed from living memory. I suspect Londoners born since the Blitz have absorbed the bombing of London into their national identity, and they are guided by that brave example. We Americans have no such collective memory to guide us. Even those of us who have heard of the War of 1812 may not be aware that the British captured and burned Washington, DC, in 1814. For us, that was too long ago to count. For most Americans, our invincibility from foreign attack is part of our national identity. The 9/11 attacks were not just atrocities committed by foreigners against our fellow citizens; they were a violation of our collective ego.

The London subway bombers, however, were British citizens. There’s another difference. But they were British citizens who did not define themselves as British, apparently.

The English may be going through a different kind of identity crisis. I suspect they are in the midst of a deep, if subtle, re-evaluation of what it is to be English, especially as something distinct from being British. Or maybe not. It’s subtle, as I said.

Americans tend to use England and Britain as synonyms, as Shakespeare himself did in Richard II:

    This royal throne of kings, this sceptred isle,
    This earth of majesty, this seat of Mars,
    This other Eden, demi-paradise,
    This fortress built by Nature for herself
    Against infection and the hand of war,
    This happy breed of men, this little world,
    This precious stone set in the silver sea,
    Which serves it in the office of a wall
    Or as a moat defensive to a house,
    Against the envy of less happier lands,—
    This blessed plot, this earth, this realm, this England.

But the Celts, dug into Britain’s edges and highlands since the time of Roman and Saxon invaders, stubbornly refused to surrender their own unique identities. In recent years Wales and Scotland have won some degree of home rule, and the English gave up their centuries-long effort to eradicate the Welsh language, allowing Wales to be officially bilingual. In effect, Scotland and Wales have demanded recognition as British, but not English, and England has agreed.

Now there is an English Question. The devolution of Britain’s old centralized government now allows the Scots and Welsh some say-so over matters specific to Scotland and Wales. But what about England? As Tony Wright observed, matters now decided by the Welsh Assembly for Wales and the Scottish Parliament for Scotland are decided by the British Parliament for England.

Those who warned that devolution to Scotland and Wales would trigger the break-up of Britain have turned out to be emphatically wrong. Those who argued for devolution as a means of keeping the British project up and running have been no less emphatically vindicated. Yet it has, ineluctably, also created the English Question, and it is to this that attention now has to turn. The future of Britain, and of Britishness, may well depend on whether we can find a convincing answer to it.

It is reported that Scottish Labour MPs decided not to sign up to the parliamentary rebellion against the Government’s education white paper because it would draw attention to the anomaly of Scottish MPs deciding on education in England when English MPs have no say on education in Scotland. In fact of course it did precisely the opposite, especially as abstention of view was not intended to be translated into abstention of vote. Similarly, the smoking ban in England was voted on by Scottish and Welsh MPs despite the fact that in Scotland and Wales the issue is a matter for devolved decision.

I’m not aware that the English are pushing for home rule for themselves. The oddness of this may be apparent to everyone but the English. Maybe centuries of seeing themselves as the lords and rulers of all of Britain have left them thinking of Scotland and Wales as relics of history, or vestigial organs — sort of the way non-native Americans think of Indian reservations. As exemplified by the signs (or lack thereof) along the English-Welsh border, the Welsh are far more interested in the integrity of national boundaries than the English.

But now the English are asking, “Hey — where’s our national anthem?” Did they not notice this void before? Is noticing the void now a signal that the English are re-defining themselves vis-à-vis the Welsh and Scots? And if so, will “Welcome to England” signs someday be posted along the roads leaving Wales?

The matter of racial minorities in Britain complicates the identity thing even further. I haven’t spent nearly enough time in Britain to fully understand where the Brits are with this. My impression is that, while most Brits are determined to put on a tolerant face, there’s some racism bubbling under the surface. For example, during the Welsh safari I was told that some English people are buying homes in Wales because Wales is still mostly white.

Yes, but it’s still mostly white for the same reason the Ozarks are still mostly white — a shortage of good jobs and other economic opportunities. Centuries of low status have left Wales still beautiful, but poor. Like the Ozarks, it’s not a place large numbers of people move to; if you live there, chances are you were born there. That racial identity could so override national identity as to cause an English person to move in with the poor cousins, among whom “you’re acting English” is an insult — how ironic is that? And isn’t it interesting how we seem to have layers of identities, and that we push one forward and pull back another, depending on circumstances?

And yes, I realize I’m leaving Northern Ireland out of this discussion. That’s a whole ‘nother level of complication that could add several feet to the length of this post.

I started musing about the English because of this post by Michelle Malkin, who in her artless way managed to turn a remembrance of 7/7 into a smear of the British. Brits are not nearly hateful or intolerant enough to suit Malkin. The Unhinged One links to this op ed in the Washington Times by Diana West, who is disturbed because an entire 13 percent of Britain’s Muslim population believe the 7/7 suicide bombers should be considered martyrs.

And, apparently, some among this 13 percent are not shy about expressing their opinions. West quotes one of these, Anjem Choudary:

“Who says you own Britain, anyway?” Mr. Choudary replied. “Britain belongs to Allah. The whole world belongs to Allah … If I go to the jungle, I’m not going to live like the animals, I’m going to propagate a superior way of life. Islam is a superior way of life.”

Offensive, yes, but I don’t see a big distinction between Mr. Choudary’s attitude and that of many on the Christian Right who live among us here in America. They all disgust me, yet if I make faces at the Christian Whackjobs I’m a Bad Person, according to the righties.

Malkin and West are angry that Brits permitted Choudary and others to demonstrate outside the Danish Embassy during the recent cartoon war, and that British police protected the demonstrators from violent reprisal. The demonstrators had to be protected because they carried signs praising both the 7/7 and 9/11 terrorists. West writes,

Hundreds of demonstrators marched through London, praising the 7/7 killers or calling for the murder of journalists who publish Mohammed cartoons. And the police stood by.

More accurately, they made sure the protest went off smoothly, as the Times Online reported. “People who tried to snatch away [the placards] were held back by police,” the newspaper reported. “Several members of the public tackled senior police officers guarding the protesters, demanding to know why they allowed banners that praised the ‘Magnificent 19’ — the terrorists who hijacked the aircrafts used on September 11, 2001 — and others threatening further attacks on London.” …

… The “Newsnight” show on which Mr. Choudary subsequently appeared included news footage of an English bobby vigorously silencing such a citizen, described as a van driver, who, according to the televised report, had angrily criticized the Muslim protesters. It is tragically enlightening.

“Listen to me, listen to me,” said the policeman, shaking his finger at the van driver. “They have a right to protest. You let them do it. You say things like that you’ll get them riled and I end up in [trouble]. You say one more thing like that, mate, and you’ll get yourself nicked [arrested] and I am not kidding you, d’you understand me?”

Van driver: “They can do whatever they want and I can’t?”

Policeman: “They’ve got their way of doing it. The way you did it was wrong. You’ve got one second to get back in your van and get out of here.”

Van driver: [bitter] “Freedom of speech.”

This vignette wasn’t law and order in action. It was desperate, craven appeasement. As the bobby put it, “You say things like that, you’ll get them riled.” And we mustn’t get them riled. Let Anjem Choudary and his band of thugs praise mass killings, threaten more attacks and advocate murder by beheading on London streets in broad daylight, but don’t get them riled.

Unfortunately, neither Malkin nor West spells out what they would have done in this situation. Would they have refused to allow the Muslims to demonstrate? That sets off all kinds of questions about when the government can stop demonstrations and when it can’t, and Malkin and West do not address those questions. Would they have had the police step aside and let the demonstration turn violent? What if the police stepped aside and people — Muslim and non-Muslim — were killed? Would Malkin and West have been happy then? They don’t say. They don’t grapple with the hard issues. They just know that Muslims should not be permitted to do things that anger Malkin and West.

Muslims in America seem a lot more docile than Muslims in Europe. Is this because they are less angry than British Muslims? Or is it that they are more afraid of what might happen to them if they speak their minds? If the latter is true, what does that say about Americans? Does it say we are not “appeasers,” or does it say we have less respect than the Brits for freedom of speech? Does suppressing speech make the problem of Islamic extremism go away, or does it sweep Islamic extremism under a rug? What might happen if Muslim extremists demonstrated in New York City with signs that praised the 9/11 terrorists? Would the NYPD be able to keep the peace? Would the NYPD try to keep the peace?

And what are the 13 percent angry about, by the way?

According to this article in the June 22 Economist, Muslims in Europe are angrier than Muslims in America. The article poses various possible reasons for this. But this one was most intriguing to me (emphasis added):

Amid all the confusion, there is one clear trend among European Muslims. Islam is increasingly important as a symbol of identity. About a third of French schoolchildren of Muslim origin see their faith rather than a passport or skin colour as the main thing that defines them. Young British Muslims are inclined to see Islam (rather than the United Kingdom, or the city where they live) as their true home.

It does not help that all Europeans, whatever their origin, nowadays find themselves “identity-shopping” as the European Union competes with the older nation-states for their loyalty. No wonder many young European Muslims find that the umma—worldwide Islam—tugs hardest at their heart-strings.

Hmm, there’s that identity thing again.

The Patients Are Running the Asylum

A while back I wrote a post called “Patriotism v. Nationalism,” which was followed up by “Patriotism v. Paranoia,” “Patriotism v. Francis Fukuyama,” “Patriotism v. Hate Speech,” and probably some other posts.

I bring those old posts up because Christopher Dickey has a splendid article on the Newsweek web site that makes many of the same points. Dickey cites George Orwell’s 1945 essay, “Notes on Nationalism,” and argues that the American Right has become the embodiment of Orwellian nationalism. That is not good.

American nationalism, unlike American patriotism, is different — and dangerous.

The second part of Orwell’s definition tells you why. Nationalism is the habit of identifying oneself with a single nation or an idea, “placing it beyond good and evil and recognizing no other duty than that of advancing its interests.” Patriotism is essentially about ideas and pride. Nationalism is about emotion and blood. The nationalist’s thoughts “always turn on victories, defeats, triumphs and humiliations. … Nationalism is power-hunger tempered by self-deception.”

One inevitable result, wrote Orwell, is vast and dangerous miscalculation based on the assumption that nationalism makes not only right but might — and invincibility: “Political and military commentators, like astrologers, can survive almost any mistake, because their more devoted followers do not look to them for an appraisal of the facts but for the stimulation of nationalistic loyalties.” When Orwell derides “a silly and vulgar glorification of the actual process of war,” well, one wishes Fox News and Al Jazeera would take note.

For Orwell, the evils of nationalism were not unique to nations, but shared by a panoply of “isms” common among the elites of his day: “Communism, political Catholicism, Zionism, anti-Semitism, Trotskyism and Pacifism.” Today we could drop the communists and Trotskyites, perhaps, while adding Islamism and neo-conservatism. The same tendencies would apply, especially “indifference to reality.”

Get this part:

“All nationalists have the power of not seeing resemblances between similar sets of facts,” said Orwell. “Actions are held to be good or bad, not on their own merits but according to who does them, and there is almost no kind of outrage — torture, the use of hostages, forced labor, mass deportations, imprisonment without trial, forgery, assassination, the bombing of civilians — which does not change its moral color when committed by ‘our’ side.… The nationalist not only does not disapprove of atrocities committed by his own side, but has a remarkable capacity for not even hearing about them.”

Hammer. Nail. Head.

It’s this aspect of nationalism that peacemakers in the Middle East find so utterly confounding. The Israelis and the Palestinians, Iraq’s Sunnis and Kurds and Shiites, Iranians and Americans have developed nationalist narratives that have almost nothing in common except a general chronology. “In nationalist thought there are facts which are both true and untrue, known and unknown,” Orwell wrote, in a spooky foreshadowing of Defense Secretary Donald Rumsfeld’s nationalist musings. “A known fact may be so unbearable that it is habitually pushed aside and not allowed to enter into logical processes, or on the other hand it may enter into every calculation and yet never be admitted as a fact, even in one’s own mind.”

I think this tells us a lot about why righties cannot be reasoned with, which is more or less the subject of the three previous posts on this blog. This post, for example, is about the way righties frame arguments to confound any attempt at rational response (quoting Tristero):

Like, “So, would you rather Saddam stay in power?” this is a framing of the issue that provides for not even the hint of an intellectually coherent response, let alone a “dialogue.” It is designed to elicit the narrowest range of acceptable responses, responses that reduce disagreement with Bushism to a quibble.

Or, the way they’re turning agreement with the Hamdan decision into support for terrorists, which is absurd, but righties will cut off their own lips before they’ll admit the point is absurd. A few righties, I believe, know good and well their arguments are absurd but make them anyway, probably because they’ve got a vested interest in righties running things. But the bulk of them really don’t know their arguments are absurd, because they’ve walled off large parts of their brains. As Orwell said, “A known fact may be so unbearable that it is habitually pushed aside and not allowed to enter into logical processes, or on the other hand it may enter into every calculation and yet never be admitted as a fact, even in one’s own mind.”

Gene Lyons:

For years, the idea’s been percolating through the right’s well-organized propaganda apparatus that Democrats aren’t loyal Americans.

Regarding Ann Coulter’s ludicrous book, “Slander,” I once wrote that “the ‘liberal’ sins [she] caricatures—atheism, cosmopolitanism, sexual license, moral relativism, communism, disloyalty and treason—are basically identical to the crimes of the Jews as Hitler saw them.” Michael Savage, Michael Reagan, Sean Hannity, Michelle Malkin, Rush Limbaugh and others peddle the same sterilized American update of an ancient slur. Limbaugh recently called 80 percent of Times subscribers “jihadists.” Now the Bush White House, desperate to prevail in 2006 congressional elections, has taken up the cry. Reasonable people never want to believe that extremists believe their own rhetoric. But quit kidding yourselves. This is mass psychosis. The next terrorist strike, should it happen, will be blamed on the enemy within: treasonous “liberals” who dissent from the glorious reign of George W. Bush. Unless confronted, it’s through such stratagems that democracies fail and constitutional republics become dictatorships.

Have a nice day!

Reactionaries

A commenter who labels himself “r4d20” left comments to the “Being Liberal Doesn’t Mean Being a Patsy” post, here and here, and I want to answer these comments at length because the writer brings up some important points. Beginning with:

Not to be a pendant, but the first step in elevating the culture is to at least get some terms more specific than “righties/lefties”, or “the right/the left”. I understand that its a quick and easy reference point, but I think that excessive use of generalities does interfere with clear thought.

I am a big proponent of using words and phrases with precision, but in our current political culture attempts to define various factions by standard political nomenclature will fail, IMO, because the partisan forces tearing us apart are not fundamentally political forces, but cultural ones.

Once upon a time I referred to righties as “conservatives,” because that’s what they called themselves, but whether they are or are not conservative depends a whole lot on how you define conservative. And that’s a perilous thing to do, because if you go by the bare-bones dictionary definition, “One who strongly favors retention of the existing order; orthodox, traditionalist, etc.,” the next thing you have to do is figure out what “existing order” is to be retained, and that can change over time and from place to place.

According to The Reader’s Companion to American History (Eric Foner and John Garraty, eds., Houghton Mifflin, 1991),

A uniquely American form of conservatism first arose in opposition to the nation’s sense of boundless optimism about human nature under democracy. And for roughly the first two hundred years of the Republic, conservatism was defined politically and culturally by its fears of the political excesses, economic egalitarianism, and cultural vulgarity generated by a democratic society shorn of any aristocratic restraints.

This is from an excellent overview of conservatism in America by Fred Siegel that can be found on this page, but you have to scroll down to get to it. It’s under the “American History” heading, and begins “The Reagan presidency has been hailed as the high point of twentieth-century American conservatism.” To understand fully where I’m coming from here it would be helpful to read the whole thing, but I’m just going to quote a little more, skipping to the 1920s —

According to what came to be known as “constitutional morality,” legislation supporting the right to unionize or limiting children’s working hours was an un-American form of group privilege. Laissez-faire conservatism reached its intellectual apogee in the 1920s. A critic complained that by 1924 you didn’t have to be a radical to be denounced as un-American: “according to the lights of Constitution worship you are no less a Red if you seek change through the very channels which the Constitution itself provides.”

In Europe conservatism was based on hereditary classes; in America it was based on hereditary religious, ethnic, and racial groups. The GOP, a largely Protestant party, looked upon itself as the manifestation of the divine creed of Americanism revealed through the Constitution. To be a conservative, then, was to share in a religiously ordained vision of a largely stateless society of self-regulating individuals. This civil religion, preached by President Herbert Hoover, was shattered by the Great Depression and the usurpation of the government by an “alien” power, Franklin D. Roosevelt, in league with “un-American,” that is, unexceptionalist ideas.

Conservatives were traumatized by their fall from grace. Diminished in place and prestige, they consoled themselves with bizarre conspiracy theories and cranky accusations of communist infiltration. Overwhelmed and resentful, they did not so much address the disaster of the depression as yearn for the days when they were able to run their towns, their businesses, and their workers in the manner to which they had been accustomed. Then, in 1940, just when it seemed they had Roosevelt on the ropes, World War II revived and extended his presidency.

At war’s end conservatives unleashed their frustrations. On the one hand, postwar popular conservatism was based on an anticommunist hysteria that antedated the antics of Senator Joe McCarthy. Politics for the McCarthyites was not so much a matter of pursuing material interests as a national screen on which to project their deepest cultural fears.

From here, Siegel goes on to describe the conservative political revival that began with Barry Goldwater’s presidential bid in 1964 and the conservative intellectuals and activists of the 1960s who called for a “restoration” of pre-New Deal America.

But this new conservatism did not so much win the country over to its perspective as board the empty ship of state vacated by a 1960s liberalism that had self-destructed. Conservatism triumphed because New Deal liberalism was unable to accommodate the new cultural and political demands unleashed by the civil rights revolution, feminism, and the counterculture, all of which was exacerbated by the Kulturkampf over Vietnam.

I agree with Siegel that New Deal liberalism, along with the New Left, had self-destructed by the 1970s, although the New Deal itself has yet to be entirely dismantled. But while “identity politics” and other factors splintered liberalism into thousands of ineffectual pieces, the Right got its act together. Some extremely wealthy right-wingers — Richard Mellon Scaife, Joseph Coors, Lynde and Harry Bradley, and Smith Richardson, among others — provided the seed money for the mighty right-wing think tank-media infrastructure, which you can read more about here. This infrastructure has put control of most of the federal government and news media safely in right-wing hands.

Yet, weirdly, the Right continues to behave as if it were waging a desperate fight against a mythical “liberal elite” that runs everything, in spite of the fact that no such elite exists and that progressivism itself has been cast out of power and left wandering in the wilderness for at least 40 years.

Today you’ve got the “social” conservatives, who want to return to 19th-century cultural mores; the “free market” conservatives, who want to return to the Gilded Age; the “Christian” conservatives who want to return to a theocratic America that never actually existed except in their imaginations; and the neoconservatives, who have taken the notions of American exceptionalism to new and more demented heights. And variations thereof.

Somehow these diverse groups have formed a coalition they label “conservative,” in spite of the fact that they advance contradictory agendas. Contemporary conservatism, for example, advocates restricting civil liberties in the name of freedom and extols small government while building the mightiest military-industrial complex the world has ever seen. About the only thing the various elements of the coalition have in common is that they all hate liberals, meaning not actual liberals but a cartoon straw man that represents liberalism in their minds, one that bears little resemblance to those of us still foolish enough to call ourselves “liberals,” even though doing so is practically asking to be rounded up and shipped out on the first bus to the re-education camps.

This conservatism, IMO, isn’t all that conservative. It’s far too radical, revolutionary even, to be labeled conservative. I think reactionary gets closer to it, although the standard dictionary definition of reactionaries — people who vehemently, often fanatically oppose progress and favor return to a previous condition — only works up to a point. Aggressive imperialism is a bit hard to square with returning to a “previous condition,” for example. To make that work you need to understand their urge to impose American hegemony on the rest of the world as a pro-active isolationism — eliminating the “threat” of foreignness by gettin’ it before it gets us.

In other ways, of course, reactionary works quite well — the stubborn refusal to admit that global warming is really happening, for example.

But ultimately, to paraphrase Siegel, I think the current American Right is all about politics as a national screen on which to project their deepest cultural fears.

And, since we’ve got to call these people something, I say “rightie” works as well as anything else.

In its extreme forms, rightieness is just hate. I mean, what are Michelle Malkin’s or Ann Coulter’s political principles, other than that they hate large groups of people that they associate with “the Left”? The hate comes first; whatever political principles they claim were adopted as props to justify the hate.

The commenter r4d20 continues,

While I choose to register Republican, like many/most people I straddle the line, which means that hardcore lefties call me “right” and hardcore righties call me “left”. According to the current “talking points” I am both a jingoistic warmonger, and a pro-Al Queda traitor – but at least both agree I should be shot 🙂 .

Even as a “Rightie” I have more in common with a “moderate” leftie than with a Christian Conservative. As a “leftie” I have more in common with a moderate rightie than with almost any Anarchist or Socialist.

Yet, somehow, politics on the blogosphere has divided itself fairly neatly into “right” and “left” camps, and all (except, these days, the purer libertarians) know instinctively in which camp they and everyone else should be sorted.

Here on the Left Blogosphere, you’d have a hard time finding an anarchist or genuinely socialist blogger. Most of us bloggers are the political heirs of New Deal Democrats. Most of us hold political positions that would have been considered “centrist” or even moderately conservative years ago. Yet today we’re painted as a radical “leftie” fringe utterly beyond the pale of decent, Gawd-fearing American politics. Much of the Right Blogosphere has utterly slipped its tether to reality, yet it gets called “centrist.”

And these days, a “moderate” is someone who doesn’t know what the hell is going on. If you want to preserve long-established American political processes, if you believe in the rule of law and the Bill of Rights and separation of powers and all that old stuff, you’re a leftie. Unless you just say you believe in those things even while you are trying to destroy them, which would make you a rightie.

But if the moderates on each side have been conditioned to think of all the people on the “other side” as extremist stereotypes then they will naturally choose the extremists of their own side over those of the other. The only winners are the wingnuts who maintain their support out of hyped-up fear of possible doomsday alternatives.

Yes, but the wingnuts really are going to bring about doomsday if we don’t stop them. Fence-straddling is not a sustainable position these days.

Dear Media, Part I: Diagnosis

Stranger of Blah3.com speaks for many of us:

Dear Media,

I hope you all enjoy lying in that bed you’ve made.

All those years of making excuses for George W. Bush’s ineptness, inadequacies, and illegalities have earned you absolutely nothing. You brushed aside his lack of experience and intellectual incuriosity in 1999 and 2000, mostly because you didn’t like Al Gore. Your behavior gave him a much better position from which to steal the 2000 election.

You bought the spin from Bush’s minions, ignoring the crisis that was taking place in Florida after the election. You believed every lie they came up with, from ‘The votes have been counted and re-counted and re-counted’ to ‘Al Gore is trying to steal the election,’ and you decided that letting Bush take office (in the most literal sense possible) was ‘best for the country.’

You papered over the fact that he was scared out of his mind on September 11, 2001 – to the point where he flew to Idaho to hide – in favor of painting him as a ‘resolute leader.’ You swallowed, hook, line, and sinker, every lie that came out of the White House in the run-up to the invasion of Iraq – in many cases embellishing the lies to make them sound more plausible. …

… And after all this, Bush and Cheney and Congress and Coulter and every wingnut pundit, whom you’ve coddled and accommodated every step of the way, show their appreciation how?

They want to muzzle you. They want to imprison you. They want to try you for treason.

Stranger links to an American Prospect article about radio talk show host Melanie Morgan, who is the same raving loon who “debated” the SWIFT program with Al Sharpton on Monday night’s Hardball. TAP quotes Morgan suggesting that New York Times editor Bill Keller should be sent to the gas chamber for treason. She was more moderate on Hardball and was willing to reduce Keller’s sentence to 20 years behind bars.

To be a liberal in America today is to look at news media and despair. Sometime between the Watergate Era and today, the whole bleeping profession of journalism turned into the Right’s Pet Goat. The much compromised New York Times is Exhibit A. You’d think the Bush Administration would be grateful to the Times for its help with the WMDs scam. But no; the Times is now the ur-Goat.

The catastrophe that is contemporary American journalism is described in detail in Eric Boehlert’s new book Lapdogs: How the Press Rolled Over for Bush. I’m not going to repeat Boehlert’s arguments here; many of you know them, anyway. Instead, I want to look at the bigger picture of journalism and politics.

To see the bigger picture, you have to step back from political issues and parties, including our much-beloved debate on whether Democrats are hopeless. Instead, consider the political culture of the United States. I argue that our national political culture is so sick and contaminated that it no longer supports the democratic processes of politics and government. Sheer inertia has kept democracy lumbering along — it takes either a long time or a lot of force to stop a really big mass that’s been in motion for a while. But a political culture utterly inhospitable to rational political discussion, as ours has become, will shut democracy down eventually.

If we’re going to restore the United States to functionality as a democratic republic, our primary goal is to heal the national political culture. Otherwise, it won’t matter which party we support or how many elections we win, because the patient — democracy in America — will still be dying. But if we can heal the culture, the job of reforming other political institutions — like the Democratic and Republican parties — will be easier.

For example, many progressives have concluded it is pointless to support Democrats, because as soon as a Democrat gets inside the Beltway his spinal column is ripped right out of him. Time and time again, we’ve seen Democratic politicians make grand speeches to their liberal constituents, but once we get them elected they do little more than offer ineffectual objections to the ruling right-wing power juggernaut. And we’re all sick of this.

But I say that progressivism’s salvation will not come from any political leader or party, Democrat or otherwise. It will come from media reform. This is true because no matter who we elect, and no matter what progressive legislators might want to accomplish, they are helpless to do much until progressive policies have solid popular support. You build popular support for policies by talking about them to the American people. And for the past fifty years or so, that has meant being able to make your case in mass media, particularly television.

Now, tell me — when was the last time you watched a substantive, factual, civil discussion of progressive ideas on national television?

Take health care, for example. For years, we progressives have wanted some kind of national health care system, maybe single payer, maybe a combination of public and private systems, but something that would scuttle the bloated, failing mess we’ve got now. Many polls indicate that a majority of Americans are deeply concerned about health care in this country. Yet it is next to impossible to present progressive ideas about health care reform to the American public through mass media. Even on those programs allegedly dedicated to political discussion, as soon as a progressive gets the phrase “health care” out of his mouth, a chorus of rightie goons will commence shrieking about socialized medicine! And then the allotted ten minutes for the health care segment is up; go to commercial.

And that’s assuming a real progressive is invited on the program at all.

So even though a majority of the American people sense that something is wrong with our health care system, and think something needs to change, they never hear what the options are through mass media. Probably a large portion of American voters don’t realize that the U.S. is the only industrialized democratic nation with no national health care program. They never hear that, on a purely cost-benefit basis, we have about the worst health care system on the planet. All Americans ever hear is that Canada has national health care and that Canadians have to put their names on waiting lists to get services, and ain’t that awful? OK, but what about the thirty-something other nations with national health care systems that don’t have waiting lists?

Bottom line: The Right figured out how to use mass media to make its point-of-view dominant and shut out the Left. Thus, radical right-wing views are presented as “conservative” and even “centrist,” even though a whopping majority of the American public doesn’t agree with those views. Through media, the radical Right is able to deflect attention away from itself and persuade just enough voters that Democrats are loony and dangerous. And maybe even treasonous.

And if just enough voters aren’t persuaded — well, there are ways to deal with that, too. But media consumers aren’t hearing much about that, either.

Because media is the dominant political force of our time, media reform is an essential part of the cure. It’s not the only part — reform is required along many fronts — but without media reform, we’re bleeped.

So what’s this political culture thing? Genuine representative democracy is more than just elections, as explained in this Wikipedia article. It is a form of government in which “the ability of the elected representatives to exercise decision-making power is subject to the rule of law, and usually moderated by a constitution which emphasizes the protection of the rights and freedoms of individuals and minorities, and which places constraints on the leaders and on the extent to which the will of the majority can be exercised.”

In successful democracies, accountability to the people is critical. Therefore, government must be transparent except when national security requires secrecy, and in that circumstance some form of oversight of those acting in secret must be honored. It is also essential that a large majority of the people respect a social contract in the broadest sense of that term. And a fundamental part of that contract is the implicit agreement that protecting the integrity of the law, and of the institutions and processes of democratic government, comes before winning elections or enacting policies.

As explained nicely by the Wikipedia article linked above,

For countries without a strong tradition of democratic majority rule, the introduction of free elections alone has rarely been sufficient to achieve a transition from dictatorship to democracy; a wider shift in the political culture and gradual formation of the institutions of democratic government are needed. There are various examples, like in Latin America, of countries that were able to sustain democracy only temporarily or in limited form until wider cultural changes occurred to allow true majority rule.

One of the key aspects of democratic culture is the concept of a “loyal opposition”. This is an especially difficult cultural shift to achieve in nations where transitions of power have historically taken place through violence. The term means, in essence, that all sides in a democracy share a common commitment to its basic values. Political competitors may disagree, but they must tolerate one another and acknowledge the legitimate and important roles that each play. The ground rules of the society must encourage tolerance and civility in public debate. In such a society, the losers accept the judgment of the voters when the election is over, and allow for the peaceful transfer of power. The losers are safe in the knowledge that they will neither lose their lives nor their liberty, and will continue to participate in public life. They are loyal not to the specific policies of the government, but to the fundamental legitimacy of the state and to the democratic process itself.

Granted, these ideals have never been perfectly manifested in the American body politic. All human institutions are imperfect, and institutions that survive through many generations will go through cycles of corruption and reform. Often idealistic people will point to the corruptions and the many ways our nation has fallen short of its ideals and argue that the patient isn’t worth saving. I, however, take the Buddhist view that all compounded things are imperfect and subject to decay, but that’s how life is, and it’s our duty — to ourselves, our ancestors, and our descendants — to make the best of it. Not making the best of it is a bad alternative.

Although it’s never been perfect, once upon a time American political culture supported democratic processes, but now it does not. It does not because many of our civic institutions are controlled by right-wing extremists who do not respect the social contract or the values of democracy. Although they pay lip service to the legitimacy of the government and democratic processes, what drives them is the acquisition of power and the implementation of their extremist agenda by any means necessary. If rules must be broken and democratic processes subverted to achieve their goals — so be it.

Paul Krugman recognized what was happening and wrote about it in the introduction to his book The Great Unraveling. He explained that, throughout history, reasonable people accustomed to political and social stability have failed to recognize the danger of emerging radical movements — until the stability is lost. Ironically, Krugman says he came to understand this from reading Henry Kissinger’s Ph.D. thesis. As Krugman explained in a Buzzflash interview,

… reasonable people can’t bring themselves to see that they’re actually facing a threat from a radical movement. Kissinger talked about the time of the French Revolution, and pretty obviously he also was thinking about the 1930s. He argued that, when you have a revolutionary power, somebody who really wants to tear apart the system — doesn’t believe in any of the rules — reasonable people who’ve been accustomed to stability just say, “Oh, you know, they may say that, but they don’t really mean it.” And, “This is just tactical, and let’s not get too excited.” Anyone who claims that these guys really are as radical as their own statements suggest is, you know, “shrill.” Kissinger suggests they’d be considered alarmists. And those who say, “Don’t worry. It’s not a big deal,” are considered sane and reasonable.

Well, that’s exactly what’s been happening. For four years now, some of us have been saying, whether or not you think they’re bad guys, they’re certainly radical. They don’t play by the rules. You can’t take anything that you’ve regarded as normal from previous U.S. political experience as applying to Bush and the people around him. They will say things and do things that would not previously have made any sense — you know, would have been previously considered out of bounds. And for all of that period, the critics have been told: “Oh, you know, you’re overreacting, and there’s something wrong with you.”

The ascension of the radical right occurred over many years, and their takeover of government — a slow-motion coup d’état — happened gradually enough that most of us didn’t comprehend what was happening. America has been challenged by radicalism before, and always it has come back to the center soon enough. (And by “center” I mean the real center, where liberalism and conservatism balance, not the false “center” of today that would have been considered extreme conservatism in saner times.) I do not believe the coup is a fait accompli; the Right is not yet so secure in its power that it has dropped all pretense of honoring democratic political process. They’re still going through the motions, in other words. But this time I do not believe America will come back to the center unless a whole lot of us grab hold and pull at it. Hard.

How do we do that? First, we have to get our bearings and remember what “normal” is, which is going to be hard for the young folks whose memories don’t go back further than the Reagan Administration. Just take it from an old lady — what we got now ain’t normal.

Second, media reform, as I say, is essential, and will be looked at in more detail in Dear Media, Part II, which I hope to have up by tomorrow. I argue that media reform is essential to all other necessary political reform. Blogs and innovations in media technology may prove to be critical to this reform.

Stars in Their Courses

I got into a disagreement with someone yesterday in a TAPPED comment thread on the matter of astrology. Astrology is in the blog buzz these days because someone discovered Jerome Armstrong once practiced political astrology (or still does, but is keeping quiet about it). A rumor that Jerome also used astrology to choose stocks is making the rounds, but this may not be accurate.

Billmon writes,

Not content with picking through Jerome Armstrong’s dirty laundry at the SEC — at a time when he is expressly forbidden from talking about the case — the werebunnies of Right Blogistan and TNR (is there a difference any more?) plus Mickey Kaus, who flunked out of wererabbit basic training, are having themselves a gay old time making fun of Jerome’s interest in astrology, which I gather he has used in the past to pick stocks, or forecast political trends, or both — I’m not clear.

Nor do I particularly want to be. I’m very familiar with the practice of forecasting financial price trends based on charts of what are essentially random numerical patterns. But on Wall Street they call this “technical analysis,” and they pay thousands of guys millions of dollars to practice the art — even though any number of scientific studies have shown that it works about as well as astrology. (If it did work, the technical traders would own the world by now.) So irrational behavior by an ex-stock picker doesn’t seem like much of a scoop to me.

Like I told the commenter at TAPPED, I don’t see the scandal. Practicing astrology may be stupid, or delusional, or crazy, or a great many other adjectives, but by itself I don’t consider it unethical. If, hypothetically, someone were selling financial advice and telling his clients that the advice was based on in-depth analysis of profits or discounted cash flow or some such, but he was really using astrology, that would be unethical. But if he’s upfront about the astrology thing, what the hell. You pay your money and you take your chances. I’ve heard of people who successfully choose stocks by taping the newspaper stock market section to a corkboard and throwing darts at it.

Once upon a time I couldn’t stand to hear anyone talk about astrology without jumping in and proclaiming how dumb it is. But now I am older and either wiser or more demented; take your pick. I have cleared my head of opinions and judgments. If someone tells me he decided not to take a plane flight because there was a Grand Cross over the airport at the time of departure, I no longer feel an urge to lecture him on his credulity. Likewise, if someone tells me he thinks astrology is bunk, that’s fine with me. Whatever.

I’ve never seen empirical evidence that astrology forecasts specific events, like plane crashes or election results, any better than flipping coins or throwing darts. But I’ve known a few people who were deeply into astrology and who were brilliant at using it to predict general trends. In these cases, I suspect the astrologer (consciously or not) uses star charts to jog intuition. In other words, interpreting star patterns might be helping the astrologer access something he already knows, or believes, at some sub-cognitive level.

Knowing something without knowing you know it isn’t as far-fetched as it might sound. Maybe you’ve had the experience of reading a book or hearing a lecture, when something you read or hear causes an understanding, or realization, to pop into the forefront of your brain. And you recognize that this little eureka had been in your head for a while, but it had been a fuzzy thing dangling at the edge of cognition that you’d overlooked. It took someone else’s words to give it clarity and bring it to your full attention.

Another example: These days we nearly always use the word myth as a synonym for fallacy, but myths, it is argued, can be interpreted allegorically as windows to the psyche, or guides to truths that defy articulation. “For the myth is the foundation of life; it is the timeless schema, the pious formula into which life flows when it reproduces its traits out of the unconscious,” Thomas Mann said. Dismissing myths because they aren’t historically or literally true misses the point of them.

I’ve come to appreciate that literalness and truth are not at all the same thing, but I’ve yet to be able to explain why that’s so to someone who doesn’t already get it. The ability to realize truth outside of language or conceptual knowledge seems to come naturally to some people but baffles others, possibly depending on how their brains are wired.

Back to astrology — does “believing in” astrology require believing that events here on earth are caused by the alignments of planets and of stars many light-years away? I don’t think so, but then just because something isn’t literally true doesn’t make it worthless. If interpreting a star chart — or reading tea leaves, or chicken bones, or the I Ching — causes someone to access depths of intuition he couldn’t get to otherwise, I say there’s some value in that.

And if all of this leaves you cold, that’s fine, but you don’t have to get hostile about it. Think of it as a harmless quirk, like saving gum wrappers or eating eggs with ketchup.

This forecast posted in December 2002 was linked to by Garance Franke-Ruta as a shocking example — she called it “Armstrong’s analysis of the causes of 9-11.” However, it’s not about the causes of 9/11. Rather, it’s about the subconscious impulses driving people, mostly President Bush, in a particular direction. The section subtitled “Bush, Republicans, and Varuna” proved to be accurate in many ways. However, I don’t believe it says anything that people weren’t already suspecting in December 2002 without using astrology. And whatever was going on with Pluto and Bush’s south node doesn’t seem to have tripped him up in the 2004 election.

On the other hand, these predictions for likely Democratic presidential candidates, also made in December 2002 by another astrologer, are pretty darn close — well, except for the Kerry section — and I don’t think these predictions could have been constructed from conventional wisdom in December 2002. This is either one amazing astrologer, or she opened the file last year and re-wrote the predictions. I can’t tell. But if this really is what the lady predicted in 2002, look for John Edwards to be a factor in 2008.

Congrats to Kos

You might have noticed the orange animated ad in the left-hand column — “honor Kos for speaking truth to power.” This Thursday night Markos Moulitsas (along with Wynton Marsalis and Anna Burger) is being honored by the Drum Major Institute for Public Policy in New York City. Read more from Jane Hamsher, here.

I realize there’s some ambivalence about Kos among leftie bloggers. See, for example, Nick Bourbaki’s posts at Wampum, here and here. And yesterday I ran across some snarking at Kos in a comment thread at Unclaimed Territory discussing Ned Lamont’s challenge of Joe Lieberman’s Senate seat. This guy, for example,

I hope that your laudable support for electoral challenges to centrist/conservative Democrats extends also to *third-party* challenges of centrist/conservative Democrats. At the moment, progressive third-party voices are being generally shut out of most of the so-called “progressive netroots”… most egregiously at the site of your friend and colleague, Markos.

Unless there’s a D after the name of the candidate in question, Markos would greatly contest what you yourself have just written above: that few things are more constructive than a democratic election where pro-war views get openly debated and then resolved by voters.

The writer goes on to say that he’d been banned from posting at Daily Kos merely for advocating third-party candidates. Maybe, but I know from my own experiences that people who claim they were banned from a site for expressing perfectly reasonable and temperate opinions are usually, um, not telling the whole story. As a blogger who makes robust use of twit filters herself, I support any blogger’s decision to ban anyone from his or her site for whatever reason. And, yes, this includes rightie bloggers who ban lefties. A blog is the blogger’s creation, not a public utility, and bloggers have a right to exercise editorial discretion whether I like it or not.

IMO the commenter quoted above exemplifies the “let’s-keep-shooting-ourselves-in-the-foot” faction of progressivism. Consider: We are up against a big, well-funded, and well-organized extremist right-wing faction that has taken over the White House and Congress and is well on the way toward taking over the judiciary. This faction spouts rhetoric about “freedom” and “democracy” but in fact supports radical theories about the Constitution that have put this nation on the road to totalitarianism. The regime in power has gotten us into one pointless and ruinous war and appears to be preparing to get us into another one. They are threatening the health of the planet by ignoring global warming, and the point at which it will be too late to act is fast approaching. They have strengthened their grip on power by corrupting elections and appropriating news media so that citizens can’t learn the truth. They are strangling our economy with profligate spending combined with irresponsible tax cuts, and every second that passes we are deeper and deeper in debt to other nations, like China.

The house is on fire, in other words. Some of us think our first priority is to put the fire out any way we can. We can argue about what wallpaper pattern would look best in the master bedroom some other time.

If the Democrats can win back a majority in the House this November — or, even better, the entire Congress — the Dems will have some power with which to fight the Right. That doesn’t mean they will, of course; I expect we will need to apply pressure on a future Dem majority to be sure they use their subpoena power (which they don’t have as a minority) and conduct meaningful investigations to expose the Bushies and the extremist Right for the danger to America that they are. But a Democratic majority in even one house will curtail much of the Bush regime’s ability to steamroll over American rights and values any time it pleases.

I want to be clear that I support Democratic candidates in the November elections (most of ’em, anyway) not because I believe they are always right or because I think a Democratic majority in the House will fix all our political problems. I admit that many Dems running for election in November are less progressive than I wish they were. And even if we succeed in taking at least part of Congress away from the Republicans, there will still be a long, hard fight ahead to restore America to anything approximating political health.

But a Dem majority would give us a better position from which to fight and a lot more ammunition to fight with. If we don’t take back part of Congress in November, it means two more years of having no power in Washington at all.

The Bushies can do a lot of damage in two years, folks.

Looking beyond the midterm elections — if we succeed, our next goal as netroots activists should be to increase our influence among the Dems. We must deliver the message to the Democratic Party establishment that it’s time to stop dancing the Clinton triangulation two-step. We must sell progressive policies to the public and pressure Dems in Washington to enact those policies. If we can topple Joe Lieberman, the most egregious of the DINO Bush bootlickers, this would send a clear signal to the Dems that they must reckon with us, and that they can’t take our loyalty for granted. This is essentially the argument made by Kos and Jerome Armstrong in Crashing the Gate.

There’s a lot more to be done to make America safe for progressivism again, such as reforming media so that our messages reach the public without being twisted by the rightie noise machine. Election reform, real campaign reform — all vital goals, and none will be easy to achieve.

But if the Dems don’t succeed in the 2006 midterms, prepare to kiss it all off. That’s reality. And another reality is that until we change the way we conduct elections — allow instant runoff elections, for example — third party candidates will not only lose, they will split the progressive vote and hand elections to Republicans. This has been happening in America since the first political parties emerged, which was while the ink was still drying on the Constitution. I do believe a pattern has been established.
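
For anyone who hasn’t seen how instant runoff actually defuses the spoiler problem, here is a rough sketch in Python with purely made-up numbers. Nothing below comes from any real election; the hypothetical 45/35/20 split is only there to illustrate the arithmetic of vote-splitting.

    # A minimal instant-runoff tally, for illustration only.
    # Each ballot lists candidates in order of preference.
    from collections import Counter

    def instant_runoff(ballots):
        candidates = {c for ballot in ballots for c in ballot}
        while True:
            # Count each ballot for its highest-ranked surviving candidate.
            tally = Counter(
                next(c for c in ballot if c in candidates)
                for ballot in ballots
                if any(c in candidates for c in ballot)
            )
            leader, votes = tally.most_common(1)[0]
            if votes * 2 > sum(tally.values()) or len(candidates) == 1:
                return leader, dict(tally)
            # No majority yet: eliminate the weakest candidate and recount.
            candidates.remove(min(tally, key=tally.get))

    # Hypothetical race: 45 Republican ballots, 35 Democratic ballots,
    # and 20 third-party ballots whose second choice is the Democrat.
    ballots = [["R"]] * 45 + [["D"]] * 35 + [["Green", "D"]] * 20
    print(instant_runoff(ballots))   # ('D', {'R': 45, 'D': 55})

Under straight plurality rules the 45 percent candidate wins; with instant runoff the third-party ballots transfer when their candidate is eliminated, and the 55 percent majority prevails. That, in a couple dozen lines, is the case for changing how we count.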

Where does Kos fit into this? IMO Kos is more of an organizer than a blogger, but that’s OK. The netroots are a cornucopia of great bloggers, but great organizers are harder to come by. I don’t always agree with Kos, but I admire his ability to get in the faces of politicians and media and demand attention. The YearlyKos convention — which was fabulous, IMO, and if they have another one next year I’m already there — was a major step toward giving netroots progressivism real power in the flesh. I couldn’t have done it. I suspect most of us couldn’t have done it. But Kos did it, and he deserves the credit. So, I congratulate Kos for being honored by the Drum Major Institute. I wish him continued success, and I hope more bloggers step out from behind their monitors and follow his lead.

And if we keep fighting, the day will come when progressive goals will be achievable. Goals like providing health care for all Americans and a genuine commitment to reducing global warming will no longer be kept dangling out of our reach by the power of the Right.

Last January I caught some flames with this post, in which I said that too much of the Left was “stuck in a 1970s time warp of identity politics and street theater projects and handing out fliers for the next cause du jour rally.” But for at least forty years — since I was old enough to pay attention to politics — I’ve watched earnest and dedicated liberals stand outside the gates of power and hand out essentially the same fliers for the same causes, year after year, decade after decade. And in most cases we’re no closer to achieving real change than we were forty years ago. On many issues we’ve lost ground. Yet too many lefties (like the commenter above) care more about ideological purity than about accomplishment.

If in-fighting over ideological purity is getting in the way of having the power to enact progressive policies, then the hell with ideological purity. Speaking truth to power is grand, but let’s not forget the ultimate goal is to be power. I believe one of the reasons we have been rendered into a minority is that too many lefties act and think like a minority; we’re perpetually out of power because that’s how we envision ourselves. So even though an overwhelming majority of the American public now agrees with us on Iraq, for example, somehow we’re the extremists, and the hawks — who dominate government and media — paint themselves as mainstream. Righties, on the other hand, maintain total faith that the majority of Americans are with them, even if poll after poll says otherwise. And that faith has empowered them.

We are the mainstream. We are the majority. But to take our rightful place in American politics and government we must start thinking like a majority and acting like a majority. It’s way past time to stop standing outside the gates of power handing out fliers. It’s time to crash the gate.