Stephen W. Browne | Rants and Raves




Why are smart people so stupid?

Some time back, after I had returned to the United States from living in Eastern Europe, I was invited to speak to a local chapter of Mensa about my experience living abroad during an exciting time in history.

Mensa is the international high-IQ society founded in 1946. Its only criterion for membership is an IQ in the 98th percentile. In other words, in a group of 100 people you’re one of the two smartest people in the room.

The first time we tried to get together they sent me to the wrong address. So we rescheduled.

The second time, there was a scheduling conflict they’d forgotten about, so I wound up going out for a beer with the three people who did show up.

So how come the smartest people in the room couldn’t arrange something every Cub Scout den mother does on a regular basis?

I think all of us probably know some pretty smart people who have dumb ideas. And the stereotype of the unworldly impractical genius has been around for a long time.

An article in the New York Times Sunday Review of Sept. 16, by David Z. Hambrick, professor of psychology at Michigan State University, and graduate student Alexander P. Burgoyne, summarizes research that confirms what some of us have suspected for some time.

Smart people can be pretty dumb.

“As the psychologist Keith Stanovich and others observed… some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right?

“Wrong. In a series of studies, Professor Stanovich and colleagues had large samples of subjects (usually several hundred) complete judgment tests like the Linda problem, as well as an I.Q. test. The major finding was that irrationality — or what Professor Stanovich called “dysrationalia” — correlates relatively weakly with I.Q. A person with a high I.Q. is about as likely to suffer from dysrationalia as a person with a low I.Q.”

So it turns out that people with high IQs are just as prone to bias, prejudice, and rationalization as anybody else. No matter how smart we are, it’s difficult to think objectively about things we are emotionally invested in.

It’s depressing.

Though perhaps not surprising. How many smart people do you know who can be spectacularly stupid about, for example, their romantic affairs? Money? Car repairs?

In fact, it seems really bright people are capable of stupidity on a much larger and more harmful scale than your average-bright person.

Worse news: it doesn’t seem that higher education has any effect on how prone to cognitive bias we are. So much for those freshman logic classes.

The good news, Hambrick claims, is that computerized training can effect long-term improvements in people’s ability to think objectively.

Forgive me if I’m skeptical. I haven’t looked at the experimental results in detail, but I’ve often seen a tendency to label something “objective” when what is meant is “agrees with me.”

That, however, could be my own bias in favor of a classical liberal arts education, where logic and rhetoric are taught early. Logic is about objective thinking, rhetoric is about persuasive speaking, and the study of both is about knowing the difference between them.

Perhaps we will find, or rediscover, ways to teach objective critical thinking. One may always hope.

But one thing is for sure: we have every reason to be skeptical about the ability of other people to run our lives for us based on the argument that they are smarter than we are.




May Day

Note: Cross-posted on my professional blog at the Marshall Independent.

Yesterday, May 1, I saw a Facebook post by an academic I’ve known for… a long time. He teaches history at an East Coast college and advertises himself as a “labor historian.”

He, or somebody, had filched the classic “We can do it!” WWII poster of a working woman flexing her bicep and appropriated it to promote International Workers’ Day. He urged everyone to “honor labor.”

Because I get intensely irritated by the kind of intellectuals and academics who would do anything for the working class – except join it – I left a comment.

I said, “Good idea! How about everyone honor labor by listing all the jobs we’ve done that involved demanding physical labor. Mine are: waiter/bartender, garbageman, framing carpenter, bucking hay in season, sewage treatment plant operator, and in between journalism gigs I drove a grain truck for harvest.”

Dead silence.

At any rate, I got curious and looked up a few things about the date. For one, though nobody remembers it now, April 30–May 1 is the ancient Celtic festival of Beltane, which marked the beginning of summer. Great bonfires were built and cattle were driven between them to be purified by the smoke. Everyone would douse their house fires and relight them from the sacred bonfires.

In the 19th century, May 1 was promoted by socialists (my academic acquaintance is a socialist), communists, syndicalists, and anarchists as a day to honor labor. The day was chosen to commemorate the Haymarket Square bombing in Chicago in 1886. (The bombing actually happened on May 4; I don’t know why the date was moved to the first.)

During a demonstration, a bomb was thrown at police by a person or persons unknown, killing seven of them. The police fired on the crowd, killing four.

In the aftermath, eight radicals were tried; four were executed, and one apparently committed suicide in his cell in a particularly grisly fashion, with explosives.

For well over a century this was considered the judicial murder of innocent men for the crime of holding unpopular opinions – until historian Timothy Messer-Kruse dug up a great deal of evidence showing that the trial was quite fair by the standards of the time, and that if any innocent people were executed, it was because their lawyers were more interested in making points than in, oh, say, preparing a defense. You know, that thing lawyers are supposed to do?

At any rate, eight years later, in the aftermath of the Pullman Strike of 1894, President Grover Cleveland signed the bill declaring the first Monday in September Labor Day, unofficially marking the end of summer. The date was chosen specifically to avoid any association with May 1.

Nonetheless, May 1 remains a labor holiday in over 80 countries worldwide.


Over at the Wall Street Journal is an interview by Bari Weiss with Bernard Lewis, who at age 95 is still sharp as a tack and the preeminent scholar of the Islamic world.

The article is entitled ‘The Tyrannies are Doomed,’ which gives you Lewis’ opinion in a nutshell. Read it anyway; there’s a lot of good stuff in it, starting with the obvious truth that while the tyrannies may be doomed, there’s no guarantee that anything better will replace them.

Well, I say obvious truth, but evidently it isn’t so obvious to a fair number of people.

“And yet Western commentators seem determined to harbor such illusions. Take their treatment of Sheikh Yusuf Qaradawi. The highly popular, charismatic cleric has said that Hitler “managed to put [the Jews] in their place” and that the Holocaust “was divine punishment for them.”

“Yet following a sermon Sheikh Qaradawi delivered to more than a million in Cairo following Mubarak’s ouster, New York Times reporter David D. Kirkpatrick wrote that the cleric “struck themes of democracy and pluralism, long hallmarks of his writing and preaching.” Mr. Kirkpatrick added: “Scholars who have studied his work say Sheik Qaradawi has long argued that Islamic law supports the idea of a pluralistic, multiparty, civil democracy.””

Heavy sigh. The New York Times again…

There is some fascinating stuff on how traditional institutions moderated the power of Islamic rulers throughout history, until seriously weakened by modern technology. Lewis cautions against imposing an Anglo-American model of democracy where it doesn’t fit into the local political culture, and cites post-WWI Germany as an example of a bad fit.

He also has some interesting things to say about Women’s Lib for the Islamic world, about which, by chance, Kathleen Parker also has a few things to say this week. See her endearingly entitled column, “Women aren’t pet rocks.”

According to Lewis:

“My own feeling is that the greatest defect of Islam and the main reason they fell behind the West is the treatment of women,” he says. He makes the powerful point that repressive homes pave the way for repressive governments. “Think of a child that grows up in a Muslim household where the mother has no rights, where she is downtrodden and subservient. That’s preparation for a life of despotism and subservience. It prepares the way for an authoritarian society,” he says.

Amen. And see the Parker article to read how George and Laura Bush, whatever missteps George’s administration may have made, have always realized this and continue to work for women’s emancipation in the Islamic world to this day. Something that counts for zero among left-wing American feminists, who evidently think a western woman’s right to an abortion trumps an eastern woman’s right not to be genitally mutilated, beaten, or murdered for getting uppity. And by the way, women in Islamic countries can’t get abortions either.

The really important point Lewis makes is that in the transition to a free society, elections should be last in order.

Others have made this point as well. Thomas Sowell has said the rule of law must be established before elections take place. Milton Friedman used to point out that Hong Kong as a Crown Colony was free, but definitely not a democracy.

And any anthropologist should be able to tell you that if you have a state composed of tribal/ethnic groups, the state will become the possession of the largest group, if it has a majority, or of the largest coalition of tribes with a common interest. In that case, if the state is a major distributor of wealth (such as oil revenues), the permanent minority may see no alternative but violence: to seize the state or to secede from it. The rule of law must be established to prevent a newly established government from reverting to feudalism – the default state of civilization – or to chaos.

Yet I think there is something beyond the first free and fair elections, a point at which everybody stops holding their breath and dares to hope freedom may have gained a sure foothold in their country. I saw this in Poland in the first years after the fall of communism; otherwise it might never have occurred to me.

It’s not the first free and fair election that matters. It’s the first election in which the party in power loses and steps down of its own free will, reasonably confident its members will not be prosecuted – or executed.

I think that’s why we in America are so reluctant to begin criminal prosecutions against officials of past administrations, even in the face of some pretty obvious criminality.

Everybody can think of their own examples. I’m among those who would like to see Janet Reno, and very possibly Hillary Clinton, face charges, and I mean capital charges, for the murder of those harmless religious lunatics in Waco, Texas.

But the trouble is, different people have different views about who should be prosecuted for what. For example, soon after Obama took office there was some loose talk about prosecuting certain people for renditions and such – before the usefulness of renditions was discovered by the administration.

Perhaps it’s best not to open that can of worms, even if we have to grit our teeth and let some pretty flagrant injustices pass unavenged.




On war of ideas part 2: 9/11 as theater

A New Paradigm

A scant three years after the abolition of the USIA, on September 11, 2001, Islamic terrorists struck in the heart of America, destroying the twin towers of the World Trade Center and killing more people than the attack on Pearl Harbor, which brought the US into the Second World War. It was soon discovered that the majority of the hijackers came from Saudi Arabia, a country closely bound to the US by trade, defense treaties, and the huge number of students it sends to study in American universities.

That, and the mobs in the streets of the Middle East dancing and cheering ecstatically, brought home in the most dramatic way possible that American power and cultural influence did not coincide with American popularity.

Soft power had become so identified with fighting the Cold War that few Americans noticed that, with the advent of the information revolution, soft power was becoming more important, not less.

It took the September 11 attacks to remind the United States of this fact. But although Washington has rediscovered the need for public diplomacy, it has failed to master the complexities of wielding soft power in an information age (Nye: 2004: 18).

Almost immediately a startling variety of interpretations were offered. Startling because by the standards of any previous war that began with an attack on American territory, soul searching over the reasons why we were at war seemed as important as the fact that we were at war.

If indeed we were at war. Some public figures on the Left such as Noam Chomsky and Ward Churchill proclaimed that the attack was just retribution for an immoral, imperialist American foreign policy. More importantly, they did so with impunity. Though evoking some public censure, their jobs were secure and they were certainly not arrested or imprisoned for sedition nor were they threatened by mob violence, as happened to German-Americans in the First World War or members of the American Nazi Bund during the Second.

This does not suggest the behavior of a nation that believes itself to be at war. Others urged that the attacks be dealt with as a criminal justice matter. (I received an email communication to this effect from spokesmen for a libertarian organization within days of the event.) On the other end of the political spectrum, Norman Podhoretz proclaimed it to be the first attack on American soil of World War IV (Commentary: 2004).

So is this “war on terror” really a war or something else? If so, is there a propaganda front?

The Cold War notion of public diplomacy was found to be totally inadequate – or perhaps it is more accurate to say, that it was stood on its head. Nor has a new paradigm emerged. Christopher Ross wrote shortly after the events of 9/11, “The degree of apparent hostility to the United States and the depth of unfamiliarity with U.S. society – its values, accomplishments, and aspirations – that recent events have brought into dramatic relief have surprised even those who work in foreign affairs” (2002: 80).

This misses the point entirely. The attacks were not planned and executed by men unfamiliar with US culture and society, they could not have been. The 9/11 hijackers were familiar enough with US society to function within it for years while they scouted the ground, made their preparations and got their flight training at American aviation schools.

The mobs that danced in the streets were familiar enough with American pop culture, and the jihadists have proven themselves adept at using New Media technology such as digital video cameras, computer editing, and the Internet. Osama bin Ladin taunts the US from his hiding place in professional news-quality videos. Al Qaeda makes and markets DVDs of the murder of captives throughout the Arab lands.

During the Cold War, the peoples of the Soviet empire, to the extent they had any accurate knowledge of American society, longed for a comparable standard of living and lifestyle. In contrast, the jihadists are most often affluent and educated members of their own societies who are intimately familiar with American culture and values – and loathe them.

Ron Robin wrote, “By all accounts, contemporary public diplomacy appears trapped in a time warp… The dismembering of national narratives – the result of what Paul Bove has described as the “transformation from territory-based power to network-based power” – has yet to affect U.S. information management. The fact that the bipolarity of the cold war has not been transformed into a unipolarity of a hegemonic America, but rather into the “advent of heteropolarity” characterized by “the emergence of actors that are a different kind… connected nodally rather than contiguously” still eludes public diplomacy…”

The principal strategy of Cold War public diplomacy was the inundation of target populations with information, mostly because their adversaries restricted public access to media beyond carefully monitored official channels. “Fifty years ago,” observes Joseph Nye, “political struggles were about the ability to control and transmit scarce information.” Such strategies have little bearing in a media age dominated by “the paradox of plenty,” in which “a plentitude of information leads to a poverty of attention” (2005: 3-4).

The US is the premier military and industrial power in the world, one that resistance movements in developing countries have no realistic hope of overcoming militarily, no matter how extensive the damage they do or how often the US retreats from any given theater of operations. The US and Europe are together the primary producers of media content in the world, the greatest contributors to that “paradox of plenty”. The Islamic jihadists have no realistic hope of overcoming the West militarily (though a long-term demographic strategy may well overcome Europe in the future) or as a producer of media content anytime in the near future.

After 9/11, while various experts and pundits were debating whether this was an act of war or a crime – and if war, what kind – the composer Karlheinz Stockhausen may have stumbled on an important insight, and was vilified for it. He called 9/11 “the greatest work of art of all time”.

“Despite the repellent nihilism that is at the base of Stockhausen’s ghoulish aesthetic judgement, it contains an important insight and comes closer to a genuine assessment of 9-11 than the competing interpretation of it in terms of Clausewitzian war. For Stockhausen did grasp one big truth: 9-11 was the enactment of a fantasy – not an artistic fantasy, to be sure, but a fantasy nonetheless” (Harris: 2002: 3).

The Islamic jihadists have mastered the technology of New Media, but any new entrant into the media market has to contend with that “poverty of attention” caused by the information flow from the West. This they have overcome by turning acts of war into grand theater.

Another cultural analogy that suggests itself is the American Indian custom of ‘counting coup’. Among the Plains Indian peoples, who considered warfare a manly sport, prestige and honor were gained by daring acts such as riding into the midst of one’s enemies and striking one in an insulting fashion, or sneaking into their camp to steal their horses (their most prized possessions). Afterwards the warrior who had counted coup would recount his deeds to an audience in his tribe as a kind of performance art, accompanied by song, dance, pantomime, etc.

Arab culture, like Plains Indian culture, is considered by social scientists to be a ‘macho’ or ‘honor culture’. Such cultures are characterized by display behavior: the acting out of one’s pride – and of rage at insults to it. Modern media has provided a world stage for display behavior, and modern technology has made it more destructive than ever before. I suggest that terrorist attacks on the West are conceived in the spirit of performance art, or of counting coup. Though terrorism has produced real benefits in terms of concessions from the West, that too is secondary to the satisfaction of expressing rage and taking revenge for wounded honor. The mighty West is humbled by the hit-and-run tactics of the jihadist warriors – and in full view of the world audience. This makes the Islamic jihadists some of the foremost media content providers in the world, making up in drama what they lack in quantity.

“The terror attack of 9-11 was not designed to make us alter our policy, but was crafted for its effect on the terrorists themselves: It was a spectacular piece of theater. The targets were chosen by al Qaeda not through military calculation – in contrast, for example, to the Japanese attack on Pearl Harbor – but entirely because they stood as symbols of American power universally recognized by the Arab street. They were gigantic props in a grandiose spectacle in which the collective fantasy of radical Islam was brought vividly to life: A mere handful of Muslims, men whose will was absolutely pure, as proven by their martyrdom, brought down the haughty towers erected by the Great Satan. What better proof could there possibly be that God was on the side of radical Islam and that the end of the reign of the Great Satan was at hand?” (Harris: 2002: 8).

If this analysis has any merit at all, then all previous paradigms of public diplomacy are useless in this case. Demonstrations of the superior power of the West are offset by even a single successful terrorist act, the image of which is spread throughout the world by the media. The superior wealth and standard of living of the West are interpreted as corruption and contemptible weakness. Offers to share the largess of the West have the opposite effect to that intended and are invariably seen as infuriatingly patronizing: giving gifts is the privilege of a superior in a tribal honor culture.

Ultimately the Cold War was won because enough information about the greater standard of living in the West reached the Soviet bloc that even the Communist elites wanted to “eat at the same table” (in the words of a student of mine in Poland). This motivation does not apply to the Islamic jihadists. And while it is probably true that a majority of Arab Muslims who are not jihadists would prefer not to live under the repressive autocracies of the Arab countries, we have seen that 1) they have yet to demonstrate that they can free themselves from the most brutal of those regimes without help, and 2) receiving that help is humiliating to them and inculcates a desire for revenge.

There has yet to be discovered a paradigm of propaganda/public diplomacy for this new kind of war: a new way to communicate directly with the populations that supply recruits for the jihadists, while bypassing the governments that provide covert funding and support for them. The propaganda/public diplomacy that helped bring down the Soviet Union addressed a population starved for information and unhappy with the media provided by their states. Arabic media, by contrast, is a rapidly growing business, popular throughout the Arab lands and increasingly in the lands of the Arab diaspora. Communism attempted to impose an artificially designed ideology on cultures that were basically Western to begin with, where Western media acted as a subversive influence. Islamic jihadism is an organic outgrowth of ancient indigenous cultural patterns that are pre-Islamic in their roots and are now reinforced by modern media technology.

If, as Clausewitz said, war is diplomacy by other means, then perhaps public diplomacy is war by other, less lethal means. The West faces the challenge of arriving at a new paradigm of ‘public diplomacy’, or unabashed ‘propaganda’, or even ‘cultural imperialism’. If that challenge is not met, it may well be that total war and the brute simplicity of Caesarism are the only alternatives.



On war of ideas Part 1

In an email exchange with Col. Gordon Cucullu I said (to his hearty agreement), “Though I firmly believe that military strength is essential to our survival, in the long run victory will depend on how well we explain ourselves.”

So… I’m going to do something I don’t usually do – well, actually never. I’m going to post an academic paper I did about two years ago. Among other reasons, I made a few points that I want to bring up for discussion (at the end of part 2, you’ll have to wait.)

This proves that 1) I can so write in a pompous, stuffy style, and 2) I read a book or two every now and again. So, in two parts, here is:

International Politics as Theater: Soft Power, Propaganda and Public Diplomacy
Stephen W. Browne


With the fall of the Soviet Union in the late 20th century, the United States was recognized as the sole “hyperpower”. Paradoxically, with this came an increased, often painful awareness that military and economic power has limits. American military power was not able to achieve a victory in Vietnam when there was a rival superpower, nor has it so far achieved a victory in Iraq, in spite of the immense disparity in power. In both of these cases far less powerful actors were able to craft appeals both to the international community and parts of the American public that could be persuaded to sympathize with their cause, or at least to disapprove of American actions.

Nor should it come as a surprise that American power is regarded with suspicion and hostility, even among American allies. American power is awesomely dangerous and has no serious rival at present. And no matter how much the US protests its benign intentions, even allied countries have reason to worry that they have no counterbalancing power of their own, nor one they can appeal to for support. And even granted good intentions, American administrations change regularly and often display a worrisome inconstancy of foreign policy. As one practitioner of public diplomacy put it, “the United States, as the world’s dominant power, will inevitably be accused of heavy-handedness and arrogance” (Ross: 2002: 82).

Since the 1960s, the phrase “soft power” has been used to describe the power of mass persuasion and cultural influence as an alternative to force. It was defined by Joseph S. Nye in his book “Soft Power: The Means to Success in World Politics” (2004) as “the ability to get what you want through attraction rather than coercion”. And when force is resorted to, it is more and more recognized that the “decent respect to the opinions of mankind” referred to in the Declaration of Independence demands that the US make a good-faith attempt to explain its actions to the court of public opinion – our own, our allies’, and even our enemies’.

US citizens are asked to support with their lives and treasure the US’ projection of power, and to assume the risks of retaliation from abroad and the long-term effects of a huge debt burden. US allies are well aware that they are also subject to retaliation if they support US actions, and are less able to respond to it. Their currencies are often tied to ours, they may be holders of US government debt, and American involvement elsewhere means fewer resources are available to devote to concerns more important to their security. Our enemies must be made to understand exactly which of their actions will or will not invite consequences, of what kind and to what extent, if there are ever to be grounds for future negotiations – even negotiations for their surrender – and so that American use of force is not seen as capricious and unpredictable by our allies and potential allies.

In the Second World War, this was unabashedly called propaganda. Perhaps at no time since have the issues at stake been so clearly explained to the populations of the Allies, of their enemies, and of the powers that were persuaded to remain neutral. Reasons and justifications for all subsequent wars and minor conflicts waged by the US have been, in comparison, poorly articulated for audiences both at home and abroad. President George W. Bush admitted as much when he stated, “We have to do a better job of telling our story” (Wolf and Rosen: 2005).

Definitions: propaganda and public diplomacy

The American Heritage Dictionary of the English Language (online edition) defines propaganda as:

“Noun: 1. The systematic propagation of a doctrine or cause or of information reflecting the views and interests of those advocating such a doctrine or cause. 2. Material disseminated by the advocates or opponents of a doctrine or cause: wartime propaganda. 3. Propaganda Roman Catholic Church A division of the Roman Curia that has authority in the matter of preaching the gospel, of establishing the Church in non-Christian countries, and of administering Church missions in territories where there is no properly organized hierarchy.

Etymology: Short for New Latin Sacra Congregatio de Propaganda Fide, Sacred Congregation for Propagating the Faith (established 1622), from ablative feminine gerundive of Latin propagare, to propagate.”

This definition is morally neutral and seems not very different from the noun persuasion and the verb persuade, applied to a mass audience. Persuade is defined as:

“Verb: To induce to undertake a course of action or embrace a point of view by means of argument, reasoning, or entreaty: “to make children fit to live in a society by persuading them to learn and accept its codes” (Alan W. Watts). See Usage Note at convince.”

If there is a difference, it lies in the strict definition of “persuade” as using “argument, reasoning, or entreaty”. Propaganda is more broadly understood to include the use of music, visual art, and theater (live and film, both fiction and non-fiction) to appeal to emotions as well as reason – and, critically, not to categorically exclude the use of lies and half-truths.

However, during the Cold War “propaganda” came to mean, in popular usage, something like “political lies”. In the 1960s the term “public diplomacy” was promoted as an alternative. It was first used in the modern context in 1965 by Edmund Gullion, dean of the Fletcher School of Law and Diplomacy at Tufts University. Gullion later founded the Murrow Center of Public Diplomacy. A Murrow Center brochure gives a definition of the term as understood by Gullion and his successors:

“Public diplomacy… deals with the influence of public attitudes on the formation and execution of foreign policies. It encompasses dimensions of international relations beyond traditional diplomacy; the cultivation by governments of public opinion in other countries; the interaction of private groups and interests in one country with another; the reporting of foreign affairs and its impact on policy; communication between those whose job is communication, as diplomats and foreign correspondents; and the process of intercultural communications.” (Quoted in Cull.)

This definition is wordy and a bit euphemistic. Wolf and Rosen go directly to the point, “…to the extent that the behavior and policies of foreign governments are affected by the behavior and attitudes of their citizens, public diplomacy may affect governments by influencing their citizens” (2005).

Public diplomacy is contrasted with traditional diplomacy in that:
The practitioners of traditional diplomacy engage the representatives of foreign governments in order to advance the national interest articulated in their own government’s strategic goals in international affairs. Public diplomacy, by contrast, engages carefully targeted sectors of foreign publics in order to develop support for those same strategic goals (Ross: 2002: 75).

‘Public diplomacy’ also broadens the definition of propaganda to include information and cultural exchange programs such as the Fulbright Scholarship programs designed to “understand and influence foreign publics by familiarizing them with America and its policies, institutions, and people” (Roberts: 2005: 131) and the export of mass media entertainment.

Wolf and Rosen’s definition included non-governmental organizations and actors as well as governments. This became important with the rise of modern revolutionary organizations seeking to found new states and terrorist groups not openly affiliated with any state. For example, Yassir Arafat practiced public diplomacy so successfully that he was invited to address the United Nations with the honors of a head of state and became the single most frequent official visitor to the White House during the Clinton administration.

Ancient Origins

That a great deal of governance and diplomacy takes place in public rather than within the institutions of the state is not a new phenomenon particular to democracies. From the earliest civilizations, rulers have used public pageant and ritual to legitimize their power to their own people and overawe visiting foreigners.

In the 6th century BCE, the Athenian city-state invented theater at about the same time it invented democracy. Solon, who instituted jury trial and broadened public participation in government, and Thespis, the first actor, were contemporaries and undoubtedly acquainted.

Public performances originally consisted of a performer reciting long epic poems and stories for an audience limited to the number of people who could crowd around and hear him. Performances capable of reaching larger audiences were created by using a mass chorus in an amphitheater with favorable acoustics. This can be fairly called the origin of mass media. The innovation of Thespis was to create the role of the actor, a performer who spoke the words as if he were the person who was supposed to have said them (Brockett and Hildy: 1999: 13-33).

From the beginning, theater was a public venue where issues important to the life of the state were presented. “The stage drama, when it is meant to do more than entertain – though entertainment is always one of its vital aims – is a metacommentary, explicit or implicit, witting or unwitting, on the major social dramas of its social context (wars, revolutions, scandals, institutional changes).” (Rosenbloom: 2002: 283).

In 415 BCE during the Peloponnesian War, the Athenian army massacred every male of military age on the island of Melos and sold the women and children into slavery. Within a year Euripides, who had participated as a soldier in the destruction of Melos, staged his play ‘The Trojan Women’, a sympathetic portrayal of the anguish of the enslaved survivors of a city’s destruction by Greeks. Throughout Athens’ existence as a free state, the theater dealt with controversial issues and public figures, public scandals, class conflicts, and jury trials (Rosenbloom: 2002).

What is not clear is the extent to which the theater affected and informed visiting foreigners, but it may plausibly be presumed to have had some effect. Many of the plays of Athenian writers have survived and been widely imitated and translated to new media, suggesting that they speak to issues of lasting importance and universal significance. The Trojan Women, for example, was made into a film in 1971 that was widely seen at the time as a commentary on the Vietnam War.

Modern History

The earliest use of the term ‘public diplomacy’ found so far was in the London Times of January 1856, in a criticism of US President Franklin Pierce: “The statesmen of America must recollect that, if they have to make, as they conceive, a certain impression upon us, they have also to set an example for their own people, and there are few examples so catching as those of public diplomacy.” The term here appears to be used in the sense of ‘civility and calm’ and their effect on the publics of both parties to a conflict or negotiation.

Abraham Lincoln used public diplomacy in the modern sense during the Civil War, when the English upper classes were still hostile to the very existence of the United States and felt a kinship of blood and aristocratic ideals with the Confederate elites. At one point there was a very real danger of British intervention on the side of the South. Lincoln appealed directly to the workingmen of Manchester in letters published in 1863, and sent speakers to tour England making the case for the North. George Train, correspondent for the New York Herald, made numerous speeches on behalf of the Union – something that would be considered questionable by today’s understanding of journalistic ethics (Knightley:2004:34-35).

Perhaps the greatest coup of public diplomacy in Lincoln’s administration, though, was the Emancipation Proclamation. Critics have pointed out, with the benefit of hindsight, that it had almost no effect on the lives of slaves at the time of its issue, since it specifically declared free only those slaves in territory under Confederate control and not those in non-secessionist slave states on the border. However, its effect on international relations was to proclaim to the peoples of England and Europe that the aim of the war was universal emancipation, thus putting any government tempted to offer aid to the South in the position of effectively endorsing slavery.

In a congressional debate in 1871, Representative Samuel Cox (D, NY) declared that he believed in “open, public diplomacy” while objecting to secret intrigues to annex the Dominican Republic. The term became widely used during the First World War, as did the term ‘propaganda’. But while ‘propaganda’ was used in the sense of ‘mass persuasion’, ‘public diplomacy’ was still used to mean diplomacy conducted between states, but without secrecy. For example, Woodrow Wilson, in the opening of the ‘fourteen points’ speech of January 8, 1918, spoke of “open covenants of peace, openly arrived at” (Cull).

By the First World War, America occupied the position of the ‘swing vote’ in any worldwide conflict – a force that determined the outcome of the war when the great powers were stalemated. By the Second World War the US was the essential power, without whose military and industrial capability the victory of the Allies was by no means certain. In each case, the opposing powers used propaganda and public diplomacy to try to sway the people of America to demand that the government stay out of, or enter, the war.

Before America’s entry into the war, the Office of the Coordinator of Inter-American Affairs was established to counter Nazi propaganda in Latin America. Roosevelt also established the office of the Coordinator of Information (COI) in 1941, which was primarily concerned with intelligence operations but also included a Foreign Information Service (FIS) which, significantly, was headed by a playwright, Robert Sherwood. After America entered the war the FIS started the Voice of America (VOA), which broadcast in German, French, English and Italian. Soon the functions of the FIS were split off from the COI into what became the Office of War Information (OWI).

After the end of the Second World War, the OWI was abolished and certain functions, such as the VOA, were transferred to the State Department. However, the US quickly became involved in a nuclear standoff with the Soviet Union, as both powers claimed spheres of influence and vied to bring non-aligned parts of the world into their orbits. Nuclear war being unthinkable, efforts to influence the publics of various countries became of primary importance. In 1946 Congress authorized the Fulbright programs, in 1948 the Smith-Mundt Act created a new set of information and cultural programs, and in 1953 the United States Information Agency (USIA) was created (Roberts:2005:132).

The USIA operated throughout the Cold War, employing the VOA, various specialty publications, libraries, films and speakers. At the end of the Cold War the function was perceived to be obsolete and funds started to dry up. In the decade after 1989, USIA funding was cut by 10 percent, leaving the agency with only 6,715 employees, as opposed to 12,000 in the 1960s (Nye:2004:17).

What passed almost unnoticed was that public diplomacy efforts aimed at the Arab countries, once part of an effort to keep them out of the Soviet orbit, were also cut. By 2001 only 2% of Arabs listened to VOA, and foreign exchanges dropped from 45,000 in 1995 to 29,000 in 2001 (Nye:2004:18). (These were not really “exchanges”; they were mostly one-way. American students were rarely sent to study in Arab countries, with the exception of Lebanon during times of stability.)

In late 1998 the USIA was abolished and the VOA transferred to the State Department (Roberts: 2005: 134).

Next: Part 2, 9/11 as theater




Ahmedinejad and Columbia

Posted at The Right Angle
— 09-24-2007 @ 04:19 PM

I admit to being somewhat torn about Ahmedinejad speaking at Columbia. Like Cal Thomas in today’s front-page article at Human Events, I’m irritated that he got invited to Columbia and evidently spoke without interruption or heckling, while conservatives, border-security advocates, etc., are either disinvited or get the “heckler’s veto”.

On the other hand, I think our people should hear him.

People in this fat, happy, lucky country of ours just don’t believe in evil anymore. Notice how people either avoid using the word – or use it in such a loose sense that it doesn’t mean anything.

Our people need to hear it, see it and get close enough to look into the eyes of a man who would kill you without a second thought – for what? Having different opinions about religion? Refusing to acknowledge the superiority of his civilization? Any reason that you could comprehend at all?

The problem with our country today is that we have had the good life for so long that we’ve forgotten there really are people like that in the world – and that a small but significant number of our own people find the idea of that kind of power exciting.


Dr. Thomas Sowell is one of those authors whose laundry lists I’d read. Reading A Conflict of Visions was one of the “Ah-ha!” moments of my life.

Sowell is an economist, newspaper columnist and Fellow at the Hoover Institution. He is a prolific writer on economics, public policy, history, culture and the politics of race. His opinions are often controversial and he has strong detractors and supporters. Agree or disagree, he is an opinion leader of considerable influence in our society today.

In observing arguments for and against a wide variety of positions, Dr. Sowell reports that he noticed that in many cases participants seemed to be arguing not so much against each other as past each other. In other words, each person was arguing not against the others’ position but against what he perceived that position to be, which was often far different from the actual position held.

Over time he refined his observations into the theory expressed in A Conflict of Visions – Ideological Origins of Political Struggles (Basic Books, 2002). I believe this book offers insights critical for understanding the major ideological conflicts within Western civilization, with specific application to the controversies concerning academic and journalistic bias.

His thesis is that prior to paradigms, world-views, theories or any rationally articulated models there is an underlying vision, defined (quoting Joseph Schumpeter) as a “pre-analytic cognitive act”. Sowell further defines a vision: “It is what we sense or feel before we have constructed any systematic reasoning that could be called a theory, much less deduced any specific consequences as hypotheses to be tested against evidence. A vision is our sense of how the world works.”

Visions are a sense of the possibilities of human reason and the power to act purposefully to achieve desired ends, and are broadly divided into Constrained and Unconstrained. An unconstrained vision sees articulated reason as powerful and potent to shape human society; a constrained vision sees human beings as more limited by human nature and natural law.

Dr. Sowell concedes that visions are rarely pure but range from strongly to weakly constrained or unconstrained. People may hold one sort of vision in a certain sphere of opinion and another in a different sphere; there are hybrid visions (Marx and John Stuart Mill are given as examples); and people sometimes change predominant visions over their lifetimes.

It is important to note that he does not equate constrained and unconstrained visions with the Left/ Right model of the political spectrum, nor do they strongly reflect the Libertarian/ Authoritarian dichotomy. An unconstrained vision characterizes the Utopian Socialists of the early nineteenth century (such as Fourier) but is also strongly expressed by William Godwin, considered by many to be the founder of modern Anarchism, in his Enquiry Concerning Political Justice.

The unconstrained vision is more often characteristic of those who would use the coercive power of the state to effect great changes in the structure of society and human nature, but it cannot be assumed that a constrained vision leads to a blind defense of the status quo. He gives the example of Adam Smith, an exemplar of a strongly constrained vision who was nonetheless an advocate of sweeping social changes such as the abolition of slavery and an end to mercantilist policies.

Once grasped, Dr. Sowell’s theory makes sense of some seeming inconsistencies and contradictions in both Left and Right positions.

For example, though there is a tendency for the constrained vision to predominate among the politically Conservative and free market advocates, it is not absolute or consistent. A Conservative may argue for the superior efficacy of market processes to serve the social good (as opposed to purposeful direction of the economy) but fail to see the market for illegal drugs as subject to the same laws of supply and demand as other commodities or consider the argument that the process costs of drug prohibition may be higher than the social costs of drug addiction. In fact, the phrase “consider the argument” is misleading. It is possible that the argument simply does not exist in his perceptual universe and is interpreted as advocacy for drug use.

On the other end of the political spectrum, a thinker such as Paul Ehrlich (in The Population Bomb) may argue from the highly constrained view of Thomas Malthus on population and food resources, combined with an unconstrained view of the ability of the state to effectively control population and allocation of resources for the general good of mankind.

And we see on both the Left and the Right visionaries holding strong beliefs about the ability of humans to deliberately shape culture to reflect whichever set of values their respective advocates hold. Though much twentieth-century experience has shown how limited the human ability to design culture as if it were an engineering project is, and how disastrous the attempts often are, men and women of unconstrained vision persist in their advocacy of policies intended to rid society of gender-defined roles on the one hand or of behavior considered “vice” on the other.

So the question arises, if the concept of the contrasting visions is hedged about with so many qualifications, is it at all useful in categorizing belief systems or explaining behavior?

I believe it is highly useful. In Western civilization there exists no serious argument about the desirability of the conditions expressed by the words “freedom” and “equality”. Yet in the West we find that whenever advocates of various causes argue for their sides, their definitions do not coincide, i.e., they argue past each other.

Advocates of redistributionist policies, affirmative action to achieve more socioeconomic equality and a high degree of taxation and market regulation are seen as tending towards totalitarianism by advocates of a less intrusive government.

Contrariwise, advocates of leaving the pursuit of the social good to voluntary and market processes are seen by political opponents as apologists for powerful and rapacious economic elites in their drive to impose a quasi-royal authority on society via economic coercion.

For those who see government as a powerful engine for social engineering, it is desired results that matter. If it is possible for the state to eliminate poverty and ensure socio-economic success for historically disadvantaged groups, then it follows that it is immoral not to do so. Arguments that the goals lie outside the state’s competence, or that process costs are too high, or that the attempt itself is counterproductive will simply not register, and almost inevitably must be interpreted in terms of ulterior motive.

Thus a TV journalist can make a parenthetical remark on a broadcast about how African-Americans are still not as “free” as Whites in the US. One who considers freedom to be the absence of legal coercion might ask how they are not free today, when all forms of legal discrimination have been abolished by Supreme Court decisions and federal law. The answer would reflect the definition of “freedom” as opportunity, a definition that conflates “poor and disadvantaged” with “unfree”.

The definition that limits freedom to a relationship among men in society where physical force or fraud is made illegal, with no further attempt to redress inequalities of wealth, education, opportunity, etc., is sometimes derided as “freedom to starve”.

Likewise the condition called “equality” is seen by those with opposing visions as either a process or a result, leading them to almost diametrically opposite interpretations of the term. To someone of unconstrained vision who views equality as a result, the socioeconomic lagging of certain groups behind others is prima facie evidence of externally imposed inequality (such as persistent discrimination) in society. To someone who views equality as the absence of legally imposed barriers to opportunity, the outcome is the result of values and choices, and is irrelevant to the questions of justice raised by people of unconstrained vision.

Those with a constrained vision tend to regard socioeconomic inequalities between individuals and groups as the inevitable result of inborn human variations in ability, different cultural indoctrination in values that promote or retard economic success and individual choices. Those of unconstrained vision tend to regard them as the result of artificially imposed constraints and when inequalities persist beyond the removal of obvious constraints will keep looking for them rather than change their model of causation.

Dr. Sowell has elaborated this theory far more than can be covered in a short review. He examines in detail visions of justice, power and equality and the difference between visions and paradigms, values and theories.

What is important to the problem of both academic and journalistic bias is how contrasting visions lead to unconscious assumptions about how the world works, and how those assumptions affect the interpretation of events. For those of unconstrained vision, though socioeconomic equality may be a strongly held value, they are nonetheless going to tend strongly toward intellectual elitism. If articulated reason is held to be the most powerful force for the social good, then it must follow that society should be led by the most advanced and progressive thinkers. Those who view the collective wisdom of individuals operating within their own spheres of experience as superior to the ability of others to direct their destinies will be seen as self-interested reactionaries and apologists for injustice.

Those who see themselves as being in the intellectual vanguard of progress will tend to be strongly attracted to the fields of teaching, liberal arts, humanities, and journalism, and moreover, will tend to regard journalism as an extension of the teaching profession.

Unconstrained visions flourish in the absence of deep experience. In business, the natural sciences and engineering, theories about the way things ought to work (within their sphere of activity) are constantly tested against the way they do in fact work: profitability, repeatable experiments and bridges that don’t fall down all serve as reality checks against extending theory further than is warranted by the facts.

An academic environment tends to insulate against experience, and journalism, by the nature of the news cycle, tends to expose practitioners to a superficial kind of experience – most especially the newsreader “talking heads”, who are basically presenters rather than researchers.

The consequences of the predominance of this vision among many academics and journalists are subtle and powerful and may include:

*Dismissal of other points of view as unworthy of reporting rather than attempting to refute them, not from motives of conscious fraud but simply from failure to take them seriously, often because of…

*Attribution of motive. It is noteworthy how often arguments give the “real” motive of the opposing point of view – the one thing that cannot be known for certain. Motives can be strongly inferred only by a ruthlessly honest appraisal of one’s own nature – but it is seldom the case that a partisan for a particular point of view argues that “his motive is probably thus, because that is what I experience in myself.”

*Unsupported parenthetical remarks among university lecturers and telejournalists. A broadcast from location often cannot be edited due to time constraints, and it is interesting to note how often a sentence of unsupported comment can be slipped into the narrative of events.

*The use of ad hominem attacks (both Direct and Circumstantial) on someone’s credibility, probably coming from the unconscious assumption that since articulated reason can show the way to the social good, conclusions about how to achieve it must be consistent among reasonable people. Disagreements about means and ends are seen as coming from ulterior motives, villainy or stupidity.

Dr. Sowell sees the theory as explaining a lot about the ideological struggles of the past two centuries – and sees no end in sight for the conflict of visions. However, an appreciation of the role of visions in shaping worldviews can help those who disagree make sense of opposing views, and shows us that opposing views are not capriciously chosen and do not necessarily stem from ulterior motives, but are internally self-consistent within the framework of the underlying vision. One may even hope that this appreciation may lead at least to genuine argument of the points at issue rather than character assassination and attribution of rapaciously self-interested motives.

It is fairly obvious that the constrained vision is behind much economic thinking. Economics is after all fundamentally about the way that human beings allocate finite resources. It is not clear that Dr. Sowell is making a blanket condemnation of the unconstrained vision though. He has noted that in the years since he first published, Malthus (on the constrained side) has been proven consistently wrong and he has credited both William Godwin and Ayn Rand (both exponents of the doctrine of the godlike power of human reason) as contributing to the evolution of modern libertarian thought. Possibly a certain element of the unconstrained vision serves to fire the imagination and may be necessary for motivating the spirit of social reform. Only when carried to extremes does it become a demand that society be everywhere remade to conform to a vision of perfection.

It also seems evident that though America was founded by men of largely constrained vision, there have been elements of both visions in our national culture from the beginning. The Founding Fathers did in fact design our federal institutions and were quite aware that they were creating a new social order by an act of will. However, they did so with a realistic appraisal of human nature, careful research of historical confederations and built upon local institutions that had been in operation for nearly two centuries. Since our beginnings American culture has reflected both utopian and pragmatic visions, a pattern that shapes our political discourse to this day.


The following chart is drawn from some of the major points of Dr. Sowell’s theory of visions. Since it is a collection of very short abstractions, responsibility for how well it represents the author’s thought rests with me.

Constrained Vision (CV):
Sees human nature as fixed, unchanging, selfish and ambitious, and as something that must be subordinated to society to some extent.

Unconstrained Vision (UV):
Sees human nature as malleable and perfectible, whose uncorrupted form will be expressed in the good society.

CV: Freedom is defined as the absence of coercion by other human beings.
UV: Unfreedom is seen as the absence of opportunity.

CV: Emphasis on process costs. Seeks optimum trade-offs.
UV: Emphasis on motives and desired results. Seeks solutions.

CV: Sees tradition as expressing the accumulated experience of the culture.
UV: Sees tradition largely as outmoded superstition.

CV: Sees articulated reason as less important than “distributed knowledge” expressed in market processes. Emphasis on experience.
UV: Sees articulated reason as powerful and effective. Emphasis on logic.

CV: Seeks the social good in making allowances for human nature, such as checks and balances in government, using mutual jealousy as a counterbalance against ambition and greed on the part of the powerful.
UV: Seeks the social good in the elevation of an enlightened and progressive leadership.

CV: Preference for evolved systems.
UV: Preference for designed systems.

CV: Characterized by the belief that the evils of the world can be explained by inherent characteristics of human nature. War and crime may be rational, if immoral, choices.
UV: Characterized by the conviction that foolish or immoral choices explain the evils of the world. War and crime are seen as aberrations.

CV: Tends to compare the status quo with worse alternatives.
UV: Tends to compare the status quo with hypothetical perfection.

CV: Exemplary thinkers: Adam Smith, Thomas Hobbes, Edmund Burke, The Federalist, Thomas Malthus, de Tocqueville, Oliver Wendell Holmes, F.A. Hayek, Milton Friedman…
UV: Exemplary thinkers: William Godwin, Jean-Jacques Rousseau, Thomas Paine, Condorcet, Fourier, Harold Laski, Thorstein Veblen, John Kenneth Galbraith, Ronald Dworkin…


“This theme is so prevalent, and so obvious, that even though you can see where I am going with it — and hate the inevitable conclusion — you aren’t going to dispute the core fact. You have to sit there and accept one of the most galling things that a bunch of dedicated individualists can ever realize — that you were trained to be individualists by the most relentless campaign of public indoctrination in history, suckling your love of rebellion and eccentricity from a society that — evidently, at some level — wants you to be that way!” [3] (The Matrix: Tomorrow Might Be Different, David Brin)

David Brin here points out something that an outside observer, the hypothetical “man from Mars”, might consider both glaringly obvious and seriously weird.

Every society has rebels and cynics, but ours has institutionalized rebellion as normal. If you don’t think so, try calling any American, taken at random, a “conformist” and see how he reacts. I’m betting that the mildest response will be vague discomfort, defensiveness and a feeling of having been insulted.

How do we do this? How do we socialize our children into the meme of Otherness/ tolerance/ suspicion of authority?

How many of you remember MAD magazine, before it was possessed by the Devil (a.k.a. AOL/ Time-Warner)? If you do, you know what I mean. If your memory is vague you might go to the Wikipedia entry on MAD. If your only knowledge of MAD is from the post-William M. Gaines era – you have my sympathy.

MAD was born from the death of the EC horror comics, the sole survivor of a once-mighty empire of violence-porn that William Gaines built on the religious comic book company he inherited from his father. Hounded by a congressional investigation, Gaines shut down the horror line and converted MAD (originally in comic format) into a magazine.

MAD specialized in even-handed satire of EVERYTHING. Gaines assembled a wonderful team of artists and writers and let them run wild. He used to say, “They create the magazine, I create the atmosphere.”

“The usual gang of idiots” were of all kinds of political persuasions. When I asked about them at the Journal of Madness, they told me that perhaps the only thing the original bunch had in common was that they were mostly WWII veterans who had, in their youth, seen the world descend into madness.

Later they were joined by like-minded artists such as Antonio Prohias (creator of Spy vs. Spy), a prominent journalist in Cuba who fled to the US when Castro took over. He wandered into the MAD offices with samples one day, and never wandered out.

MAD has to rate as one of the most successful magazines of the 20th century. Consider that during Gaines’ lifetime it never accepted advertising of any kind. The magazine supported itself solely on subscriptions, newsstand sales and the very limited tie-ins that Gaines allowed.

The effect on us as kids is incalculable. We loved the puncturing of adult hypocrisy and the wordplay. To this day there are a huge number of tunes that evoke for many of us, not the composer’s lyrics, but the MAD parody of them.*

MAD lampooned every sitting president without discrimination, virtually every top-rated movie and TV show, and every genre stereotype in both popular and highbrow culture. I believe they were sued only once, and eventually movie and TV stars didn’t feel they had arrived until they’d been roasted in MAD – after which the custom was to send in a picture of oneself with the issue one appeared in, for the letters column.

The satire was sharp and biting, but like Spielberg’s Indiana Jones series (a take-off on the pulp adventure genre), it was often a loving appreciation as well. You could enjoy the original story and love the parody too.

That’s all gone now. Gaines and the original gang died or retired. AOL/ Time-Warner appointed an editor in the 90s who thought MAD should tap into urban hip-hop culture – and saw the circulation figures drop precipitously. It has never recovered, and now accepts advertising.

A few years ago I took this thesis to the American Studies Conference in Minsk, Belarus. I ran off numerous samples of covers and articles from my complete MAD CD collection (from the start to 1998). It was hard to explain out of its cultural context. (It is interesting to note that foreign editions of MAD have done well in only a few other countries; the failures outnumber the successes.)

Anyhow, there were a few American academics there, including a distinguished poet/ professor and an anthropologist teaching in Belarus for a year. When my presentation came up they were in the audience, and I thought, “Oh my God, these are real scholars. They’re going to crucify me!”

Well, what actually happened was that as I was passing around the samples, they were jumping up and down in their seats going, “Tell them about Alfred E. Neuman for President!”

* Since I am again living in Oklahoma, near our football stadium, I frequently have this going through my head on game days when the band is playing:

“Oh-h-h-h-h Oh-seven is the greatest spy there is today!
Though the Empire’s gone, he keeps right on, so you’d better not get in his way!
Oh-h-h-h-h Oh-seven we adore his looks and manly build,
When the going’s rough, he’s got the stuff, and he never lets himself be killed!

We know in a fight he will win, ’cause he wins every fight he is in,
And that is why-y-y-y-y, when bullets start to fly-y-y-y-y
You’ll hear us crying, you’ll never die oh-oh-seven,
Oh oh-seven, our spy!”




David Brin’s Otherness

Years ago, in a short essay by science fiction author David Brin, I encountered an idea that was to become one of the foundations of my world-view. As with Thomas Sowell’s ‘A Conflict of Visions’ (which I will post about anon), after reading it I was never the same again, and I have been digesting the implications ever since. I am not at all sure that Dr. Brin would like where I’ve taken this idea, but then, if it is in fact a valid insight into reality, that kind of thing happens.

David Brin is a scientist and author with a Ph.D. in astrophysics. As a scientist he has worked as a physics professor and a NASA consultant. As an author he is known mainly for his science fiction novels, one of which, The Postman, was made into a movie by Kevin Costner. A few years ago he published a non-fiction book, The Transparent Society, which attracted a lot of attention for its startlingly different approach to the problems of surveillance technology in a free society. In it he postulates that the extension of surveillance of public places, and of access to data banks, by an ever-increasing number of people and institutions need not be an Orwellian nightmare if there is a corresponding extension of accountability and transparency. This was neatly expressed by the title of one of his articles in Salon, We Will Watch the Watchers (which can be accessed from his website – highly recommended).

Regarding his writing and speaking career, one cannot help but wonder if this is a new model for the career of a public intellectual in the Internet age.

Dr. Brin’s career as a public intellectual is interesting in that he seems to have done an end run around the traditional routes to pundit status. He has never been a journalist, social scientist nor served in government. He started by writing entertaining stories with provocative ideas. In collections of his short stories he included a few essays exploring interesting ideas and began publishing essays and articles in web and print publications[1]. He maintains a busy schedule of speaking engagements and interviews and a web site where his articles are posted and discussion forums provided.

His ideas are characterized by technological optimism, fierce anti-elitism and confidence in the ability of ordinary people to govern themselves. Politically he identifies himself as Libertarian, but is obviously of an atypical kind. He advocates political alliances with Democrats rather than Republicans and extols the self-described Moderate majority of the population as the most sensible.

One of his most provocative ideas is the concept of The Dogma of Otherness and its implications for the issue of unconscious journalistic and academic bias.

The Dogma of Otherness is part of Dr. Brin’s simplified cultural morphology, which divides world cultures into five types: 1) feudalism, 2) machismo (characteristic of the Latin and Arabic cultures), 3) paranoia (Russia, with its heritage of on average two incredibly destructive invasions per century), 4) Eastern collectivism (stable and sane – at the cost of the complete unimportance of the individual), and 5) the culture that originated in the West and developed to its most extreme in the United States: Otherness.

In his article, The Dogma of Otherness Dr. Brin describes (in composite form) the genesis of the idea while on a speaking tour. Because he has written about dolphins, questions usually arise concerning dolphin intelligence. He replies that after initial optimism, he was convinced by the evidence that dolphins are not intelligent on anything like a human level. Objections arise, “You can’t know that!” “If we can’t communicate with them it must mean we’re not smart enough!” “But…but there may be other ways of dealing with the world intelligently than those we imagine!” “Those problems the dolphins had to solve were designed by human beings, and may miss the whole point of cetacean thought! In their environment they’re probably as smart as we are in ours!”

Dr. Brin repeats the reasons he became convinced, until he gives up in the face of the audience’s absolute refusal to concede the validity of the evidence. He points out that he gets this reception from every audience of non-scientists and thinks he has realized why: it has to do with the core cultural assumptions of American society. Every culture has them; they are socialized into the young of every culture and nation. Some call them dogmas, some call them zeitgeists, and we have ours as well. “But I am coming to see that contemporary America is very, very strange in one respect. It just may be the first society in which it is a major reflexive dogma that there must be no dogmas!”

“Think about it: ‘There’s always another way of looking at things’ is a basic assumption of a great many Americans.” Someone replies, “Well, isn’t it true? There is always another way!” “Of course there is… or at least I tend to think so. I like to see other viewpoints. But you see, I was brought up in the same culture as you were, so it’s no surprise I share your dogma of otherness.”

He describes what follows in the discussion: he talks about how unique this orientation is in the history of the world. Someone accuses him of cultural chauvinism: “What’s so special about our culture?” “You’re doing it again!” he cries. He points out that there may indeed be something to be learned from other points of view, but then again that could just be a bias imposed by our cultural conditioning.

Further objections follow: “All right, so that’s just our way of looking at things. But you can’t say it’s actually better than any other way… Other peoples have their own cultural assumptions, of equal value.” Further discussion ensues, leading to the core contradiction: Otherness holds that all points of view may be valid for the culture (or even the individual) that holds them; other cultures think that this belief is self-evidently insane and/or evil. They cannot both be true; one or the other must be, and recognizing this involves making a judgment (at least implicitly) about which is better, more true or more worthy to prevail.

To elaborate further would involve quoting the entire (admirably succinct) essay – and by this time the academic reader is probably nodding his head in recognition. The attitude is most noticeable in the doctrine of cultural relativism in the social sciences: all cultures are equally valuable and worthy to survive and to suggest otherwise is bigotry and racism. [2]

Dr. Brin suggests that this attitude evolved in our culture of immigrants. America has become a mix of more peoples and cultures than probably any other in the history of the world – with none that had a strong claim to right of precedence, and that to get along peacefully we had to develop tolerance to a degree beyond any other previous civilization. He does not claim that everyone everywhere in America possesses this attitude in the same degree, or that it is possessed only in America, simply that it is characteristic of American culture to a degree beyond any other place at present.

“The Dogma of Otherness is a worldview that actually encourages an appetite for newness. A hunger for diversity. An eagerness for change. Tolerance, naturally, plays a major role in the legends spread by this culture. (Look at the underlying message contained in most episodes of situation comedies!) A second pervasive thread, seen in the vast majority of our films and novels, is suspicion of authority…

“What we can say, nevertheless, is that Otherness has become powerful in the official morality of most western societies. Look at the vocabulary used in most debates on issues concerning the public. So-called ‘political correctness’ can be seen in ironic light, as a rather pushy patriotism in favor of the tolerance meme! But even the other side often wraps itself in phrases like ‘freedom,’ or ‘color blindness,’ or ‘individual rights.’

“Even more important, though, is the fact that millions accept the deeply utopian notion that our institutions must be improvable, and that active criticism is one of the best ways to elicit change.” (The Meme Wars)

The irony of this is that American culture socializes its members into an attitude that each of us and a few like-minded others are unique in possessing this attitude of tolerance and suspicion of authority.

“It is a smug cliché — that you alone (or perhaps with a few friends) — happen to see through the conditioning that has turned all the rest into passively obedient sheep. …

“Ah, but here is the ironic twist. Look around yourself. I’ll bet you cannot name, offhand, a single popular film of the last forty years that actually preached homogeneity, submission, or repression of the individual spirit.

“That’s a clue! In fact, the most persistent and inarguably incessant propaganda campaign, appearing in countless movies, novels, myths and TV shows, preaches quite the opposite! A singular and unswerving theme so persistent and ubiquitous that most people hardly notice or mention it. And yet, when I say it aloud, you will nod your heads in instant recognition.

“That theme is suspicion of authority — often accompanied by its sidekick/partner: tolerance.

“Indeed, try to come up with even one example of a recent film you enjoyed in which the hero did not bond with the audience in the first ten minutes by resisting or sticking it to some authority figure….

“This theme is so prevalent, and so obvious, that even though you can see where I am going with it — and hate the inevitable conclusion — you aren’t going to dispute the core fact. You have to sit there and accept one of the most galling things that a bunch of dedicated individualists can ever realize — that you were trained to be individualists by the most relentless campaign of public indoctrination in history, suckling your love of rebellion and eccentricity from a society that — evidently, at some level — wants you to be that way!” [3] (The Matrix: Tomorrow Might Be Different, David Brin)

I believe that Dr. Brin is on to something here. The idea is hard to grasp for some, but once grasped anyone can test it for themselves by observing the culture around them, for example by trying that experiment with movies, TV shows and popular literature. This may be one of the most important insights into understanding our own culture in recent years.

It is obviously going to be unsettling to some. It supports the idea of American, and Western, exceptionalism – which contradicts the basic vision of Otherness itself! This kind of condition is, of course, the primary causative factor in Cognitive Dissonance theory.

There is no doubt about whose side Dr. Brin is on. He holds with a society where progress is achieved through openness and constant criticism. In The Postman he has the main character tell a story about the Americans, who used to accuse themselves of all kinds of terrible crimes – but that this was only their way of making themselves better.

I believe that Dr. Brin’s model is well worth considering, with some caveats.

Our core cultural assumptions, as everyone’s do, lead us to look at the world from a certain perspective, but also have their own unique blind spots, which we must make a serious effort to see around.

At present, America is attempting to do something that has never been achieved before – with no historical examples to indicate whether or not it will continue to succeed in the long run. We have a republican form of government with a population of heterogeneous origins, now at 300 million and continually growing through immigration and a birthrate far healthier than Europe’s. Prior to the US Constitution, political philosophers such as Montesquieu believed that a republican government must necessarily be on a small scale. Surely we are going to need the feedback of constant self-criticism if we are to chart a course into an unknown and uncertain future.

There is indeed a lot of self-criticism in our society, but much of it is neither legitimate nor productive of any real self-examination. Dr. Brin has not, to my knowledge, dealt with what might be called xenophilia, the hallmark of many radicals who hold that the good society is found elsewhere – usually in some truly awful tyranny, which they strangely never seem to emigrate to, or even bother to visit in most cases. Xenophilia is not respect for other ways but a positive loathing of one’s own, and it is manifestly quite common among the very classes which have most benefited from American society. Obviously our culture produces discontents among its most privileged members that we only dimly understand.

(See my posts on ‘Western Civilization and its Discontents’ and ‘Aeyrheads’.)

At an extreme, the respect-for-all-cultures vision that Otherness promotes leads to a reluctance to evaluate others for fear of seeming “judgmental” (a capital offense in the social sciences). The problems with this attitude include:

1) One cannot avoid making judgments; it’s part of our basic cognitive processes. If one tries, then the judgments made will be unconscious and thus difficult to deal with rationally.

2) If we are unwilling to consider that our culture has achieved something uniquely valuable, we will be unable to examine rationally exactly what it is we are doing right, how it might apply to others and how it might be improved further.

3) Contrariwise, if we are unwilling to honestly criticize the failures and shortcomings of others, how will we learn from them?

These extremes of the Dogma of Otherness do not necessarily represent fatal flaws in the concept. But it does seem possible that most people may not have the insight necessary to appreciate the difference between being objective but evaluative, and being judgmental and biased – and this is rather an elitist point of view, which is of course contradictory to the Dogma of Otherness itself.

Dr. Brin makes an excellent case for the Dogma of Otherness being among the core assumptions of American, and increasingly Western Civilization and shows how unique this is among other civilizations both contemporary and historical. This worldview is undoubtedly being spread throughout our culture by our academic establishment, our news and entertainment media and throughout Western culture by American media hegemony.

The above caveats and reservations about Otherness lead to one important question. Though the assumptions of Otherness may be necessary to maintaining a society as large and as heterogeneous as our own, might this not also lead to something analogous to a breakdown of our society’s immune system – something like a case of cultural AIDS? If we do not consider our own society as in some way unique, worthy and better than any alternative, from where will come the will to defend it against those that would attack and destroy it from within or without?

Whether the contradictions in our core cultural assumptions will ultimately tear apart our culture remains to be seen. This has happened to many other cultures throughout history and perhaps we are not so special after all.

Note: the essays containing the elaboration of the idea of The Dogma of Otherness can be found in:

Otherness, David Brin, Bantam Spectra Books, 1994:
“The Dogma of Otherness”
“The Commonwealth of Wonder”

And on David Brin’s website, as cited above.

[1] Sometimes for free just to have a forum to address a specific audience. I have published in one of the same magazines as Dr. Brin and can testify that they do not pay and sometimes edit at their whim.

[2] If this characterization seems extreme, I’d note that during my studies in anthropology I brought up the question of Thuggee on more than one occasion. Thuggee (origin of the English word thug) was a Hindu cult that worshipped the goddess Kali. Their religious devotion took the form of joining parties of travelers on the road, making friends with them and, at a given signal, strangling them with silk scarves and robbing their corpses. These were not simple brigands but a fully developed culture with customs, rituals, rites of passage for their children and an elaborate taboo structure. When the British discovered their existence in the 19th century, an estimated 40,000 people were disappearing on the roads in India.

In the name of objectivity and being non-judgmental, more than one anthropology professor defended their right to practice their culture unmolested by British imperialism. None considered the possibility that this attitude was inherently racist, holding the lives of Indians as insignificant compared to upholding the right of a predatory culture to exist. I have repeated the experiment many times and found that Western social scientists will almost always either frantically avoid making a judgment or come down on the side of defending a moral obscenity.

[3] I used this theme at a presentation at the 11th American Studies Conference in Minsk, Belarus with the example of MAD magazine as one of the means of acculturating American youth into an attitude of tolerance and suspicion of authority through even-handed satire.


My wife came home from a parent-teacher meeting the other day, mad as hell.

Why? Because the whole meeting was in Spanish. Both my wife and I can follow Spanish a bit, but she’d have been totally lost if she hadn’t known what the agenda of the meeting was.

Our boy goes to Head Start preschool at a local church. We wanted him to have a social life, and admittedly to get him out of the house for a few hours during the day. He was born just two weeks shy of the cutoff that would get him into regular kindergarten, so this was a good alternative for a poor grad student. He goes to preschool with a bunch of mostly Mexican kids and a very few Anglos. Each class has two teachers, one of them Spanish-speaking, and all notices are in English and Spanish.

Well, how is it working?

In a word, it isn’t. The Spanish-speaking kids aren’t learning English – and our boy isn’t learning Spanish beyond a few words. One committee is chaired by a woman who doesn’t speak English at all, so the English-speaking parents just get left off the phone tree and don’t hear about crucial events.

My wife is not a native speaker of English, but she speaks it better than a great many who are. My Polish is not up to the level of her English, but I did learn the language well enough to get around the country by myself and communicate for all practical purposes, and though I doubt I’ll be invited to lecture at a university in Polish any time soon, I have had compliments on my accent. Monika gets vexed that other people resident in the US don’t learn English at least well enough to function in society without special help.

So what does work?

My wife’s best friend is a Mexican woman who doesn’t get out of the house a lot, so she’s happy to have Monika over so she can practice her English. And I think it takes some of the stress off her to know that it’s a second language for Monika too. With her mother in the house helping with the kids, naturally the children didn’t learn English. That is, until her eldest daughter went to elementary school. She picked it up in two months. There is no bilingual education at her school.

My boy’s playmates at home are from Kenya and Sri Lanka, respectively native speakers of Swahili and Sinhalese. They speak English perfectly.

It’s called “total immersion,” and that doesn’t mean the thing Baptists do. Kids are like language sponges: throw them into the linguistic environment and “poof,” they learn to speak it. My boy understood English from the beginning because he heard his mother and me speak it, but spoke Polish by preference (and his grandmother also had a lot to do with that). After a few months in America he finally got that nobody understood him and started speaking English – literally overnight. It was like flipping a light switch: one day he was an English speaker, just like that. Now we have to work at keeping his Polish up to speed.

So since we know what works, why are they trying to reinvent the wheel? Well, perhaps the fact that total immersion just happens, and doesn’t require a paid specialist has something to do with it. And some folks just can’t accept that good things happen without their help.

Funny thing: a while ago I had this conversation with a professor who mentioned that he or somebody in his family was involved in ‘bilingual education’ programs of some sort or other. I mentioned that I’d heard it was pretty much considered a disaster in California. He said, “No,” and gave a longish explanation about how it either hadn’t been done right or had been sabotaged. Now here’s the funny (or tragicomic) part: he knew very well that I have a bilingual household and that we are raising our kids as English and Polish speakers. Did he think to ask how my wife and kid learned English? Did he think to ask even a single question about our experience that might be relevant to the issue? Do I even have to answer that question?


Tips on learning a language.

When I went to Poland, I found out that Polish has a really complicated grammar – and that’s not just a point of view thing. I told one of my high school classes once, “Wow, Polish grammar is really complicated, but then I suppose you think the same thing about English.” They looked blank for a moment, then one replied, “Oh no Steve, English grammar is much simpler than Polish.”

I later found that it’s a trade-off in some ways. English grammar is more complicated than Polish in the verb tenses, the conditionals and the negative prefixes. (To give you an idea, Polish has two: “nie,” which answers for no, not, un-, in-, im-, a-, ab-, etc., and “bez,” which is a prefix but covers the English suffix -less, or “without.”) Polish is more complicated in that it has a case structure, i.e. every noun and adjective has several different forms depending on gender and on whether it is used as a subject, direct or indirect object, location, instrumentality, etc.

So how to deal with this if you go to live in another country, or are just travelling? A Polish philosopher gave me this advice, “Steve, just ignore the case endings. Everybody will know what you mean anyway.”

In language teaching (or “applied linguistics,” as we like to call it because it sounds more important) we call it the difference between ‘fluency’ and ‘accuracy.’ Fluency is the ability to understand and make yourself understood. Accuracy is getting it exactly right according to the local rules of grammar, syntax and usage. I speak Polish fluently but not accurately.

Unfortunately, formal language courses in America make people feel insecure because they concentrate on getting it just right for tests. Vocabulary is where it’s at if you want to be understood, folks. Learn a lot of words; worry about getting the grammar right later.

So what words? The first words you should learn are: please, thank you and excuse me. These go far. Then learn the numbers – very useful in shops, and that’s where you’re going to be doing a lot of practicing. As Kipling said, there are few linguistic barriers between a willing buyer and a willing seller. So next you might learn, “Please, can I see that?” (point).

Another tip: say it confidently. It’s amazing how people just don’t hear your mistakes if you speak with an air of confidence, just like you know what you’re doing.

The rewards are great: most people really warm to someone who tries to learn their language even a little and are extravagant in their compliments. (Well, except the French. They insult you for the way you speak their language, and now they wonder why French isn’t the universal language any more.)

And the nice thing about teaching English in Poland was that unlike the French and Germans, Poles always knew that Polish wasn’t going to be the universal language. Mostly they just thank God it isn’t Russian.
