What Went Wrong

July 7th, 2002

Sometimes a cup of coffee represents no more than a refreshment. Other times, coffee fills up more than a little cup of irony. In 1000 AD, an Arab would indulge in a cup of sweetened coffee from Ethiopia. Islamic culture introduced coffee to the world. Without its discovery, there would be no Starbucks and Seattle would be much more laid back.

By the eighteenth century, Europeans had found they could grow the “devil’s drink” more cheaply in their colonies than it could be produced in Ethiopia. In What Went Wrong?, the eminent authority on the Middle East Bernard Lewis points out that by the eighteenth century a typical Turk or Arab would sip coffee imported from Dutch Java or Spanish America. For Lewis, coffee serves as a metaphor for the decline in the dominance of the Islamic world. This decline and its impact on the Western world is the theme of Lewis’s book.

If one visited the world at the beginning of the second millennium, the case could easily be made that the Islamic World was the most powerful, dynamic, advanced and progressive culture on the planet. The Islamic World extended into Europe from both the east and the west, controlled all of Saharan Africa, the east coast of Africa, and the Arabian Peninsula, and reached east into Asia. Much of ancient Greek knowledge had been assimilated, and new insights were gained through contact with Chinese civilization.

Islamic culture was self-confident and thus was, for its time, tolerant of other ideas and faiths. Although attempts at proselytization were not tolerated, Christians were permitted to practice their faith, and Jews prospered and occupied positions of prominence.

The Islamic powers had reason to be confident. They dominated the world militarily, eventually expelling the Crusaders from the Holy Land.

Somewhere around the time of the European Renaissance, it became apparent even to Muslims that something was changing. European science and technology were improving, largely due to the imperatives of trade and the exploration of the New World. The Reformation in Europe loosened the hegemony of thought and placed a new emphasis on individualism.

Gradually, Europeans pushed back the geographic limits of the Islamic World, largely expelling it from Europe. As the military superiority of the West improved, Muslims found it essential to adopt Western military weapons and tactics, but felt no real necessity to incorporate large elements of Western culture. Much of the Islamic World believed it could modernize without Westernizing. Ultimately, even these efforts failed.

In the late eighteenth century, Napoleon with superior arms blasted into Egypt and quickly overcame a power at the core of the Islamic World. Ultimately, Napoleon was forced to leave, not by an Islamic power, but by another Western one. Soon much of the Islamic world would become part of the colonial empire of one or another European power.

Today, much of the Islamic world is in poverty, ruled by tyrannical and oppressive leaders, and is technologically and economically far behind the West. If it were not for Western addictions to oil and opium, there would be little of export value from the Islamic World. Muslims also cannot fail to notice that the Eastern powers, Japan and the rest of the Pacific Rim, have somehow been able to embrace Western economic culture and in some cases even surpass Western powers in terms of prosperity.

Of course, as Lewis points out, the human response to this conspicuous decline is to ask “Who did this to us?”

According to Lewis, for a long time, the Islamic world blamed the invasion of the Mongols in the thirteenth century. However, this explanation is unpersuasive given that many Islamic cultural achievements came after the expulsion of the Mongols.

Many in the Islamic World blamed Western imperialism, particularly by the British and the French. However, this explanation begs the question. It was the decline of economic and military power that allowed Western imperialism to succeed.

After the formation of a tiny Israeli state in the center of the Middle East, the Muslim world tried to blame Zionism for its humiliation. As Lewis puts it, “… it was humiliating enough to be defeated by great imperial powers of the West; to suffer the same fate at the hands of a contemptible gang of Jews was intolerable. Anti-Semitism and its image of the Jew as a scheming and evil monster provided a soothing antidote.” The Jews were to blame.

Tolerance has come full circle. For much of the last millennium the treatment of Jews in the Islamic World was far more exemplary than their treatment by Christendom. Now Jews are hated in much of the Islamic World. It is ironic that perhaps the best indicator for the success and prosperity of a society and culture may be its treatment of Jews. If this historically persistent minority is tolerated, it implies that the dominant culture is sufficiently self-confident and prosperous that it sees no threat in the acceptance of Jews. It is an unmistakable sign of decline when this tolerance is abandoned.

Most recently, some in the Islamic world have blamed Western culture, and its chief symbol, the United States, for undermining Islamic religious values. Islamic fundamentalists seek to explain the decline of the Islamic World by the abandonment of traditional Islamic practices.

The true reason for the decline of Islamic civilization has been its growing calcification and its refusal to recognize the importance of the individual freedom necessary for a modern economic state. Interestingly, it may have been the early phenomenal success of Islam that cemented it into rigid religious structures. At the outset, Islam spread quickly and relentlessly. Within a century of Mohammed’s death, Islam extended from Spain to the Caucasus. Islam washed over other local religions like a tidal wave and immediately dominated religious and government structures. Indeed, there was no perceived difference between the civic culture and religious culture. The law was Islamic law and carried the weight of Allah’s authority.

The Christian and Jewish religious traditions arose in a culture of the oppressed. The Jews were enslaved in Egypt and the early Christians were persecuted by the Roman Empire. Mohammed was triumphant on Earth, while Christ was put to death by local government authorities. From the outset, Christians and Jews realized that the kingdoms of the Earth and religious authority were not co-extensive. In the West, secular and religious power was often allied and sometimes synonymous, but the idea of two separate spheres of authority was at least possible. After the Reformation and a series of religiously-based European wars, it became evident that some mutual distance between the state and church could provide lasting peace. Indeed, with the rise of commercial society, the question of religious affiliation diminished in importance.

The laws and institutional arrangements of man were thus accepted as largely empirically based. What worked to produce civility and prosperity was sufficient. Arrangements could be temporary and flexible according to the needs of the time. Appeals to immutable religious authority were not necessary. This emancipation of the individual and associations of individuals to seek their own goals ignited the growth in wealth and military power that has left much of the Islamic world behind.

There is nothing inherent in the Islamic faith that prevents it from embracing Western culture. The relative prosperity of Turkey is a consequence of its adoption of Western economic and cultural institutions. Nonetheless, there remains a broad sympathy in the Islamic World with the notion that freedom, particularly the freedom that separates civil from religious authority, makes Western powers debauched and self-indulgent. The ultimate fall of these powers under the weight of their own decadence, in the minds of some, will mark Islam’s return to ascendancy.

However, as Lewis concludes:

“If the peoples of the Middle East continue on their present path, the suicide bomber may become a metaphor for the whole region, and there will be no escape from a downward spiral of hate and spite, rage and self-pity, poverty and oppression, culminating sooner or later in yet another alien domination-perhaps from a new Europe reverting to old ways, perhaps from a resurgent Russia, perhaps from some expanding superpower in the East. But if they can abandon grievance and victimhood, settle their differences, and join their talents, energies, and resources in a common creative endeavor, they can once again make the Middle East, in modern times as it was in antiquity and in the Middle Ages, a major center of civilization. For the time being, the choice is theirs.”

Unlawful Combatants

June 25th, 2002

It is always amazing how many who do not care one whit about the constraints on First Amendment rights implicit in contemporary “campaign finance reform,” or about limitations on peaceful protests around abortion clinics, or who insist on the narrowest possible interpretation of the Second Amendment, manage to get their shorts tied up in a rigid knot about the detention of illegal combatants associated with Al Qaeda. There are certainly serious civil rights issues that need to be addressed, but there remains a strange and unsavory readiness to rush to the defense of only those who hate America.

Some try to invoke Pastor Martin Niemöller’s warning:

“First they came for the Jews
and I did not speak out
because I was not a Jew.” …
“Then they came for me
and there was no one left
to speak out for me.”

However, even this sound observation can be misapplied. There are also times they come for thugs; there are also times they come for murderers; and there are also times when they come for the evil. We should be able to discern the difference and speak up for those who come to protect us.

The Bush Administration is faced with an awkward situation. It is charged with fighting a war that sometimes takes place on American soil against enemy soldiers who do not conveniently, and according to the laws of war, wear uniforms. These “illegal combatants” fall into an unfamiliar legal no man’s land. They are not quite prisoners of war since they are not part of a regular armed force. They do not even have the formal “ranks and serial numbers” normally required of prisoners. At the same time, these people are not mere criminals, but part of an enterprise that is at war with the United States. The sooner the United States makes a formal declaration of war, the easier it will be to sort out the legal categories.

Americans feel uncomfortable, and justly so, when arbitrary executive authority is used to detain people, even extremely dangerous people. While there is little evidence that the Bush Administration has abused its authority in this matter, there is always a danger of tyranny when one branch of government can act solely and unilaterally to detain people. In Ex Parte Quirin, decided in 1942, the Supreme Court invoked common law practices to empower the government to try un-uniformed Nazi saboteurs (one of whom was an American citizen) in military tribunals. The court was silent about indefinite detention of similar illegal combatants. Yet, under the Ex Parte Quirin doctrine the government will probably be able to hold indefinitely people like Jose Padilla who were likely conspiring to engage in terrorist activity. Nonetheless, there is a more appropriate and Constitutionally regular way to hold illegal combatants.

The US Constitution has made provision for dangerous situations where conventional and important legal protections might need to be modified. Article I of the US Constitution provides that:

“The privilege of the writ of habeas corpus shall not be suspended, unless when in cases of rebellion or invasion the public safety may require it.”

Obviously, in cases of “rebellion or invasion”, other measures can be taken and Congress should make a provision for dealing with these new illegal combatants in a thoughtful and formal way. Consider the following proposed steps:

  1. Via legislation, Congress should provide the temporary authority for the executive branch to detain those who it has strong reason to believe are part of the foreign network at war with the United States. The legislation should make clear the level of proof required for this detention.
  2. Congress should provide for a special court with the sole purpose of supervising this detention. Members of this court could be cleared to review classified information. Every six months (or whatever time period Congress specifies), the executive branch must re-make the case for continued detention to this special court.
  3. Congress should place a time limit on this legislation so that this special executive power does not continue indefinitely and so that the specific provisions of the legislation can be reviewed and modified as necessary.

These legislative steps would not only protect the country, but also ensure that anyone who is detained is held under the review and care of all three branches of government. Importantly, these special provisions would be temporary in nature.

It is time for Congress to act in order to protect Americans and American liberties and to avoid the unnecessary distraction of constant arguments over future detentions.

Dishonesty in the Service of Higher Goals

June 9th, 2002

“There may be honest differences of opinion as to government policies; but surely there can be no such difference as to the need of unflinching perseverance in the war against successful dishonesty.” — Theodore Roosevelt.

The goal of the 1973 Endangered Species Act was “to provide a means whereby the ecosystems upon which endangered species … depend may be conserved.” If the Secretary of the Interior determines that the habitat or range of an endangered species is threatened by human activities, then those activities can be curtailed. In practice, the act has been both praised and criticized. It has maintained habitat for endangered species, but has sometimes done so at the cost of jobs for humans. The reduction of federal lands available for logging because these lands encroached on the habitat of the endangered spotted owl is one of the most famous and controversial applications of the act.

Successful application of the Endangered Species Act depends on the unbiased identification of habitats crucial to the survival of endangered species. The importance of this responsibility makes recent events disturbing.

The Canadian lynx is endangered and the government is trying to assess its range. The locations of lynx hair samples found in the wilderness are an important means for this assessment. A number of Federal and State employees were discovered taking hair samples from captive lynx and submitting these as if they had been obtained from the wild. Some of those involved claimed they were just trying to test the accuracy of the lab to which the samples are sent, but such tests were outside the specific study protocols. That explanation has the foul stench of deception. Some believe the fudging of data as a means to prevent development is widespread. The Washington Times cites a retired Fish and Wildlife Service biologist as saying, “I’m convinced that there is a lot of that going on for so-called higher purposes.”

In Piper, Kansas, just outside of Kansas City, biology teacher Christine Pelton determined that 28 of 118 students were guilty of plagiarism on a biology assignment. Pelton awarded the students a zero on the assignment. After protests by parents concerned about the effect of poor grades on the competitiveness of their children in the college admissions race, the school board investigated. The Board found that the students had indeed plagiarized material, but thought the punishment too severe. The Board then directed Pelton to raise the grades of the affected students. Pelton resigned in protest.

The Enron energy company and its accounting firm, Arthur Andersen, engaged in misleading, dishonest, and perhaps illegal accounting practices to hide the true financial status of the company. The result was company bankruptcy and the decimation of the savings of investors and employees. The efforts to untangle the web of deception woven by this dishonesty are still not complete.

These and other instances point to a growing cultural acquiescence to dishonesty, particularly when honesty and integrity prove to be inconvenient. There always seems to be an easy justification or rationalization. The dishonesty is always in service of a higher cause, whether it is the environment, economic advantage, or simply higher school grades.

It would be easy and perhaps a little too much fun to blame the spread of dishonesty on former President Bill Clinton’s studied dissembling in a federal civil rights case. While Clinton may have raised deceit to a high art form, he was not the cause, at least not the sole cause, of the cultural acceptance of dishonesty. The trend existed before Clinton’s presidency.

If we want to look at intellectual sources for this tolerance, we might try to blame Friedrich Wilhelm Nietzsche. He argued that truth may not be as absolute and universal as we had supposed. However, most people do not read Nietzsche, and fewer accept his analysis.

Unfortunately, conservatives and free market libertarians may be partially to blame for tolerance of dishonesty. One of the virtues of free markets is to tame and sublimate otherwise aggressive animal impulses into constructive market competition. This virtue can simultaneously be a vice. Markets are notoriously amoral schemes aimed only at practical measures of success. Only the result is important. Such a neglect of means relative to ends diminishes the importance of means. It habituates the public to an ethos of indifference to honesty and integrity.

In order for free market societies to successfully exist, they cannot rely on markets as the only instructors of morals and molders of temperament. Governments are instituted to provide an honest legal structure within which free transactions can confidently take place. However, the cost and awkwardness of prosecutions means that such enforcement will be reserved for only the most egregious violations.

By-and-large, enforcement and self-control must arise organically from the culture. Shame and embarrassment in the face of peers can act as powerful social constraints. Intermediary institutions, religious, private and civic, can engage in moral instruction and nurture our better natures.

Those who believe in the power of markets, who believe that countries and economies should be organized and directed by free markets, have an additional obligation to ensure the character of their citizens and the success of those intermediary institutions that form such character. Not surprisingly, social institutions that rely on individual self-direction and self-rule are dependent upon the wise and honest use of this freedom.

Declaring War on Iraq

June 2nd, 2002

“The Congress shall have Power … to declare War, grant Letters of Marque and Reprisal, and make Rules concerning Captures on Land and Water.” — United States Constitution.

In the immediate aftermath of the terrorist attacks that destroyed the twin towers of New York City’s World Trade Center and demolished a wing of the Pentagon, the moral authority to take military action to stop those responsible from future attacks was clear. Sure, there were a few of the “Blame America First” temperament, who wanted to know what the US did to make these terrorists hate us. Fortunately, the voices of those who habitually make excuses for mass murders were few and isolated. The moral authority to respond to the terrorists was quickly followed by the legal authority to do so.

On September 18, 2001, only seven days after the attacks, Congress passed the joint “Authorization for Use of Military Force” resolution. The resolution gave direct authority for the president to act militarily. Specifically, Congress resolved:

“That the President is authorized to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations or persons.”

The Al Qaeda organization hosted by the Taliban in Afghanistan was quickly identified as the proximate group responsible for seizing the planes and turning them into weapons. The US government requested that the Taliban hand over the Al Qaeda leaders. When the Taliban refused to cooperate, it demonstrated its complicity with the attacks on the US. The original “Authorization for Use of Military Force” resolution clearly sanctioned the subsequent US military defeat of the Taliban.

Nine months after September 11, the case for attacking Iraq is not nearly so clear. For a while, there was the suspicion that Iraq may have cooperated with and directly helped Al Qaeda. While this may indeed be the case, the public evidence for it is not clear. As a consequence, the authorization given by Congress for action may not be sufficient to cover efforts to militarily overthrow the Baghdad regime.

It is very possible and perhaps probable that Iraq is providing critical aid and funding to a loose network of anti-US terrorist groups. It is very conceivable that Iraq, which has used weapons of mass destruction against its own people, is sufficiently malicious to provide some of these weapons to terrorist groups willing to use them against the American homeland. The case for attacking Iraq as a necessary measure to preempt future attacks on the US may be strong and compelling, but it has not been made publicly.

Presidents sometimes see Congress as an unfortunate impediment to the efficient execution of foreign policy and sometimes it is. On occasion, representatives and senators act like 535 secretaries of state raising a cacophonous din, confusing allies and adversaries alike. Nonetheless, a premeditated attack to depose the government of a sovereign state, months, or perhaps years after provocations is war and ought to be declared so by Congress.

President George Bush can probably stretch the “Authorization for Use of Military Force” resolution to authorize an attack against Iraq without much public objection. It is not to his benefit to do so. Asking Congress for additional authorization will impose an important discipline. It will force Bush to unambiguously articulate the rationale for military action against Iraq. It will force Bush to clearly define objectives and measures of success. It will also compel Congress to affirm Bush’s judgment in a way that will unite the country and make it difficult for others to second-guess Bush later.

There is a legitimate concern that in articulating and describing the perceived threats from Iraq, important intelligence and means of collecting intelligence might be compromised. However, the intelligence committees in both houses of Congress are capable of dealing with secret information and reporting summary findings to the remainder of Congress.

Countries that boast rule by consent of the governed should not easily shed their ideals under serious challenges. Returning to Congress for additional authorization to attack Iraq will be a difficult challenge for Bush’s leadership. However, by engaging Congress Bush will provide an important precedent for future Presidents contemplating military action. Indeed, such a precedent may prove to be an important Bush legacy.

Goodbye to Stephen Jay Gould

May 26th, 2002

The popular vision of scientists as wise, well-read, erudite, Renaissance men is largely a throwback to the nineteenth century, at least as portrayed in the movies. The truth is that most scientists have narrow fields of expertise. Many scientific disciplines are so academically demanding and time-consuming that scientists generally have time for little else than the study of their fields. The scientist with a deep knowledge and appreciation of history and literature is rare. Rarer still is a scientist who can speak and write eloquently for the lay audience about topics as disparate as the Flamingo’s Smile, Leonardo’s Mountain of Clams and the Diet of Worms, or Crossing Over: Where Art and Science Meet. Harvard paleontologist Stephen Jay Gould, who recently succumbed to a rare cancer, was just such a scientist.

Although his primary expertise was in West Indian snails, Gould became an important voice in many national debates. Gould was blessed with the dual gifts of raising the level of discussion and of reducing the temperature of discourse. His 1981 book, The Mismeasure of Man, proved to be an important cautionary tale. In the book, Gould warned of the inherent difficulty of, and the failed efforts at, classifying human intelligence with a single numerical measure, the so-called Intelligence Quotient, or IQ. There is still considerable controversy about measures of intelligence and the final chapter has not yet been written. In 1994, when Richard Herrnstein and Charles Murray published The Bell Curve, suggesting correlations between race and measures of intelligence, Gould was a voice of calm and studied critique. However, Gould’s most important contribution in this debate was to explain how even well-meaning and honest scientists can inadvertently bend interpretation of the data to fit preconceived notions and prejudices. It is not that science cannot be neutral; it is just that scientists must be vigilant to maintain objectivity.

Gould had his flaws. His politics were left of center and, worse yet, he was a life-long New York Yankees fan. Yet these could be overlooked in light of his persistent honesty in pursuit of the truth.

Gould was always something of an agnostic with respect to religion, but he never carried an anti-religious chip on his shoulder in the manner of scientists such as Carl Sagan. Indeed, later in life, in his book Rocks of Ages: Science and Religion in the Fullness of Life, Gould seemed to reach a permanent accommodation with religion, arguing that religion and science address different human needs and realms. Gould asserted the Principle of NOMA, Non-Overlapping Magisteria. Specifically:

“[the] magisterium of science covers the empirical realm: what the universe is made of (fact) and why does it work this way (theory). The magisterium of religion extends over questions of ultimate meaning and moral value.”

Nonetheless, Gould’s field was evolutionary biology, and it is here that he was most vocal. He was one of the first to recognize that biological evolution may not proceed at a slow, regular pace, but rather in fits and spurts as species respond to dramatic changes in the environment, a process known as “punctuated equilibrium.” For example, after an asteroid impact, species may quickly change until a new equilibrium is reached.

More recently, Gould argued that biological evolution is not directed and that evolution does not necessarily imply progress. Evolution is popularly portrayed as a progression from simplicity to complexity, culminating in humans. If you measure evolutionary success by ubiquity, Gould argued, bacteria are the evolutionary success story, not humans. Humans are a recent evolutionary development, whereas bacteria have existed for billions of years and will likely outlast humans. There is no inherent reason why humans were an inevitable development. A few environmental changes here or there, fewer or more asteroid impacts, and humans would not exist now.

To some, Gould’s argument diminishes the dignity of humans by portraying them as mere evolutionary accidents. Yet if, indeed, the emergence of an intelligence conscious enough to consider its place in the universe is a rare random event, that would seem to make its development even more precious. If, in the enormous universe, conscious intelligence is not inevitable, if four billion years of development on Earth does not guarantee a sentient and self-aware species, it seems like an awful waste of time and space.

If humans are an unlikely accident, then any single human is far rarer. Given the 23 chromosomes donated by each parent, over 8 million (2^23) genetically different combinations are possible from each parent alone, so any set of parents could produce a vast number of genetically different children. In one such accident in 1941, Stephen Jay Gould was born, and we have all been the better for this rare conjunction of genetic material. We are all diminished by the all-too-early loss of his voice.

Carter in Cuba

May 17th, 2002

Ex-presidents devote themselves to a variety of pursuits upon leaving the world’s most powerful office. Gerald Ford and George Bush (41) focused on private pursuits and enjoying their families, especially their grandchildren. For excitement, Ford plays golf and Bush jumps out of planes. Nixon spent most of his ex-presidency enduring the shame of resignation and writing books on foreign policy, desperately trying to achieve the status of elder statesman. Unfortunately, the oldest American to leave the presidency, Ronald Reagan, is suffering from Alzheimer’s disease. Reagan is on what he described as “the journey that will lead me into the sunset of my life.” It is too soon to say definitively how Clinton, who has many years ahead of him, will spend his post-presidency years. The only certainty is that he will earn a lot of money. Because of his charitable work on behalf of Habitat for Humanity, Carter has often been held up as the best ex-president. In 1980, the country was obviously convinced that this was an apt role for Carter, so it hastened his transition with a landslide vote intended to make Carter an ex-president.

Carter also has the annoying habit of complicating the foreign policy of his successors by independently consulting with foreign leaders. Cavorting with despots is a particular favorite activity of Carter’s. He apparently believes that others are enlightened by the beacon of his own virtue.

Carter’s recent trip to Castro’s Cuba is illustrative. On balance, Carter’s trip was probably salutary. He met with and encouraged dissident groups and emphasized America’s commitment to freedom and democracy. He gave an address at the University of Havana that was broadcast on Cuban state-controlled radio and television. To his credit, Carter delivered his address in Spanish.

Nonetheless, the opportunity to give such an address is rare. Measured against what could have been, Carter’s speech was a disappointment. It is not that Carter’s speech was a bad or inappropriate one, but rather that it could have been so much better, so much more memorable, and so much more important.

Carter’s tone was one of moral equivalency between the world’s dominant democracy and an island ruled by a thug. Americans may be freer, but heck, Cubans are lucky enough to have socialized medicine. It is sort of like arguing that Mussolini may have had some human rights problems, but gee, the trains ran on time. Carter believes in a variant of the “I’m OK, you’re OK” foreign policy. Carter worried that Americans and Cubans suffered a “misunderstanding,” as if our differences are minor and inconsequential.

Contrast this speech with a speech given at Moscow State University in 1988 by Ronald Reagan. While in his speech Carter dutifully mentioned human rights violations, Ronald Reagan explained how political and economic freedoms were not just another choice, but essential to the dignity of man. While in Cuba, Carter described how Americans are free to start their own businesses. Reagan made heroes out of entrepreneurs by calling them “explorers of the modern era…with courage to take risks and faith enough to brave the unknown.” Carter extended the hand of friendship to the Cubans. Reagan directed the Soviets to a higher calling hoping that “freedom…will blossom forth …in the rich fertile soil of your people and culture.”

It is not so much that Reagan employed soaring rhetoric and powerful imagery while Carter’s language was more pedestrian. There is something more fundamentally different. Carter is apologetic about America. Reagan saw America as a “shining city on a hill.” Reagan always sought to be as good a president as his country deserved. Carter sought to make his country as good as he perceived himself to be. Reagan believed in America, Carter believes in his own rectitude. In his righteousness, Carter squandered a unique opportunity to call Cubans to freedom and to make a powerful demand for Castro to free his people. It is unfortunate that Carter delivered such a forgettable speech.

Thoughts of a Father on Father’s Day

May 16th, 2002

The statistics are clear to everyone with even an approximation of an open mind. The presence of a father in the home is highly correlated with the well-being of children. Children fortunate enough to have both a father and a mother in the home perform better in school, are healthier, are less likely to live in poverty or commit suicide, and are less likely to become involved in drugs than children raised by a single parent. On Father’s Day, it is important to emphasize the importance of fathers.

This strong positive social effect does not necessarily mean that the relationships between fathers and sons will always be smooth and easy. Indeed, the father-son relationship can be complex and define the way both interact with the rest of the world.

In an article in National Review, Mark Goldblatt ruminates over the relationship with his own father. He argues that the natural competition between fathers and sons explains why “Sons are their fathers’ only natural predators.” Goldblatt suggests that the following dilemma confronts fathers and their sons: if a son fails to become as accomplished as his father, the father is disappointed; if, on the other hand, the son is more successful, the father is left with sour envy. It is as if the success of the son somehow acts as a reproach of the father. It is almost certain that Goldblatt’s generalization is overly tainted by the relationship he describes with his own father. Goldblatt’s father lacked a college education and was apparently unsure of his own intelligence. The young Goldblatt was afflicted with the natural arrogance of youth and confident in his own abilities. Perhaps the young Goldblatt even deliberately aggravated his father’s sensitivity. As Goldblatt explains,

“[My father] looked up from dinner one evening and said, `You probably think you’re smarter than me, don’t you?’ So I glanced up at him and replied, `No, not really.’ This was a lie: Of course I was smarter than he was! The issue had been settled so long ago in my mind that I thought he was asking a trick question.”

Goldblatt’s story is a sad one. Apparently, he has spent the time since his father’s death trying to reconcile himself to the relationship he had with his father.

However, the fallacy of the father’s dilemma as posed by Goldblatt lies in the assumption that the successes and challenges faced by sons can be separated from those of the father. If a son is very successful, then the father is successful as well. If a son is struggling, then the father shares in the struggle.

It is not that fathers fail to compete with children. Fathers should compete with their sons (and daughters) as a means to build up the competence and confidence of their children, but not as a way to demonstrate their own vigor and superiority. Children need the challenges posed by parents to develop a sense of their limits and strengths. But their successes and failures are shared by their parents.

Nonetheless, when children get a little too confident, it is wise to provide them a little perspective. To my children, I often find myself paraphrasing the words of Sir Isaac Newton. If my children can see farther than their parents, it is because they stand on the shoulders of giants.

The Weakness of an International Criminal Court

May 12th, 2002

It is politically convenient for Democrats to characterize President George Bush as a single-minded right-wing ideologue. The truth is that Bush is an ideological conservative, but also a temperamentally moderate practical politician, very disposed to tack with the prevailing political winds. While he is focused on the pursuit of terrorists, on just about every other issue Bush has shown a readiness to compromise with political adversaries in Congress when necessary. Bush signed an Education Bill that was far shorter on reform than he would have wanted. Bush signed a Campaign Finance Reform Bill he believes may violate the First Amendment. More recently, Bush has promised to sign a budget-busting agriculture bill aimed by Democrats and Republicans at purchasing contested Senate seats in the Midwest.

It is, therefore, pleasing that the Bush Administration has decided to eschew the easy path and renounce United States support for the International Criminal Court. The Administration accurately argued that, “…the International Criminal Court is built on a flawed foundation. These flaws leave it open for exploitation and politically motivated prosecutions.”

Even President Clinton recognized the shortcomings of the agreement, but characteristically tried to have it both ways. He signed the agreement on December 31, 2000, but did not submit the treaty for ratification in the Senate knowing it faced defeat. Clinton played to European opinion, while avoiding any political price at home.

The International Criminal Court (ICC) is a standing court ostensibly designed to prosecute those guilty of genocide and crimes against humanity. It is likely to be yet another European bureaucracy, headed by a prosecutor accountable only to himself, designed more for political posturing than to address serious prosecutions. At best, the ICC is unnecessary and at worst, it could make more difficult the transition from authoritarian or totalitarian regimes to more democratic ones.

There are many despots and mass murderers who, in a perfect world, could and should be subject to prosecution, but so long as they remain in their own countries they are unlikely to ever be punished. The organizer of the mass murder at New York’s World Trade Center, Osama bin Laden, and North Korean leader Kim Jong Il are not particularly worried about prosecution by any standing international court. In other cases, when an overwhelming military victory makes it possible to seize persons involved in war crimes or genocide, the formation of ad hoc courts of jurisdiction has not been a problem. The International Military Tribunal held in Nuremberg, Germany following World War II was sufficient to try captured Nazi leaders.

In some cases, the presence of a standing court could prolong the tenure of despotic regimes. For example, it is certainly the case that Augusto Pinochet, who ruled Chile from 1973 to 1990, could be convicted of leading a cruel and murderous regime. However, the settlement that ushered in a democratic government promised amnesty to both the military government and anti-government rebels. Without the amnesty, or with the threat of prosecution by a third-party international court, the Chilean military government might have found it in its interest to hold out longer to avoid prosecution and punishment. In such a case, the price of a standing international court might be unnecessarily prolonged suffering.

In the United States, it took the prosecution of Democrats and Republicans by various independent prosecutors to convince both parties that an unregulated prosecutorial office is open to political abuse. There can be no doubt that the European-dominated ICC will be subject to the same political imperatives. Given the European culture and the broad language of ICC protocols, which allows prosecutions for such vague crimes as imposing “mental harm,” one can envision such a court prosecuting US officials for genocide in allowing capital punishment, for collateral damage in Afghanistan, or for the suffering caused by the embargo against Iraq. Israelis would face prosecution for anti-terrorist activities.

At the same time, Europeans are too busy sunning themselves on vacations to Cuban beaches to ever bother prosecuting Fidel Castro for four decades of oppression and murder. Europeans are too dependent on drinking at the spigot of Iraqi oil to prosecute Saddam Hussein for his use of chemical weapons against Iraqis. Even if the court could bring itself to prosecute the leaders of such regimes, without the ability to enforce its decisions the prosecutions would be fruitless.

Yes, it is heartening that Bush sees through the posturing and moral chest-beating of European and American supporters of the ICC and has refused to follow the fantasy. Without the US, the ICC will just become another small and irrelevant bureaucracy providing lifetime employment for another generation of European intellectuals.

Letting Standards Fall

May 5th, 2002

One bit of conventional wisdom holds that an academic degree from Harvard is a ticket to a life of affluence and ease. Regardless of the accuracy of that observation, it appears that admission to Harvard as an undergraduate is now virtually a guarantee of excellent grades. Students overwhelmingly accumulate A’s and B’s and a remarkable 90% graduate with honors. The honors citation has increasingly become a way to identify the few poor students who don’t receive honors rather than a means to focus particular tribute upon outstanding students.

Patrick Healy of the Boston Globe interviewed Trevor Cox in his senior year at Harvard. Only in his last year was Cox finally challenged by the work on his senior thesis. Cox explained, “I’ve coasted on far higher grades than I deserve… It’s scandalous. You can get very good grades, and earn honors, without ever producing quality work.”

A few professors at Harvard have attempted to maintain an island of integrity in the onrushing torrent of easy A’s. Professor Harvey C. (“C-minus”) Mansfield, who teaches Government 1061, was a notoriously hard grader in comparison to his colleagues. Actually, his grading policy had remained constant over time while policies had loosened around him. Students were torn: if they took Mansfield’s course, they might be challenged, but only at the cost of hurting themselves in the class-rank competition with other students.

In an effort to strike a compromise, Mansfield now awards two grades. The official grade for the transcript is in keeping with the easy grading policies of his colleagues. The second grade tells students what they truly deserve. For the official record, only 27% of his students received a B or lower. The overwhelming majority were awarded a B-plus or above. For the second grade, only 15% of his students earned a grade higher than a B.

There are many reasons for grade inflation in academia, particularly at Ivy League schools. Part of it began during the Vietnam era when high grades helped students remain in school and retain an academic deferment from the draft. Interestingly, the largest jump in grades occurred when the average SAT scores dropped.

In 1969, Harvard made a bold effort to admit additional minority students. The African-American enrollment in the freshman class doubled from 60 to 120. SAT scores for entering freshmen dropped, yet the fraction of grades of B or higher increased 10%. Not only were professors making allowances for a new set of less academically prepared students; out of fairness, they made it easier for other students as well.

The University of California system is preparing to embark on a similar reduction of standards in the service of attempts to change the demographics of enrollment. Richard Atkinson, the president of the university system, released a report arguing that the SAT tests should no longer be required for admission.

Atkinson frets that when he visited a private school he observed, “students studying verbal analogies in anticipation of the SAT.” Some observers might be heartened by diligent students improving their verbal skills. Atkinson, who is more astute in these matters than most of us, concluded, “America’s overemphasis on the SAT is compromising our educational system.” Obviously, we must free students from the oppressive yoke of verbal analogies.

How about this for a verbal analogy? Atkinson is to academic excellence what rust is to metal. Atkinson is a corrosive force that, if left unchecked, can undermine the academic integrity of the University of California system.

There really are two not-so-attractive reasons for eliminating SAT scores as a requirement. The first is to allow the University of California school system to make admissions decisions based on skin color and ethnic background without the interference of academic preparedness as an inconvenient constraint. Ironically, the second reason some schools have eliminated SAT’s as an academic requirement is to increase the apparent (though not actual) academic selectiveness of the university. When SAT’s are optional, the better students tend to be the only ones who submit them. The average SAT scores (among those reporting, of course) will increase. When various college ranking services rank colleges by the SAT scores of incoming freshmen, schools that eliminate SAT’s as a requirement are at an advantage. Dickinson College reportedly had a 60-point increase in the average SAT score of incoming freshmen when the test was made optional.

The truth is that Atkinson of the University of California and Harvard University need to recognize that no amount of fudging the results of the educational process at the tail end is sufficient. Indeed, such efforts are probably counterproductive. There is a real and urgent problem with educational opportunity for minority students. The sooner we grant such students the means to escape failed public school systems, the sooner the University of California system, Harvard University, and other schools can return to the celebration of academic excellence rather than avoiding its consequences.

References:

  • Healy, Patrick, “Harvard’s Quiet Secret: Rampant Grade Inflation,” Boston Globe, October 27, 2001.
  • Arkin-Gallagher, Anna, “California SAT Decision Sparks Controversy,” The Yale Herald, May 3, 2002.

Pondering the Infinite

April 28th, 2002

Pondering the infinite is an activity usually relegated to undergraduate philosophy students, particularly in their sophomore year. Physicists often spend their time reducing physical phenomena that are, for all practical purposes, infinite, like the size of the universe, to comprehensible descriptions. Mathematicians are perhaps the most facile in dealing with and manipulating concepts of infinity. For a mathematician, it is a simple matter to specify a mathematical surface that is infinite in area but encloses a finite volume. In other words, mathematicians can conceive of a shape that one could fill with paint but not paint. It is only recently that people in computer science have considered quantities and qualities which, if formally finite, may prove to be practically infinite.

In 1965, Gordon Moore, one of the founders of Intel, observed that the number of transistors on an integrated circuit had grown from one in 1959 to 32 in 1964 and 64 in 1965, and extrapolated that transistor density was doubling every 18 to 24 months. This is the narrow statement of Moore’s Law. The more general statement of Moore’s Law is that computing power doubles every 18 to 24 months.
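
To make the doubling rule concrete, here is a minimal Python sketch that treats Moore’s Law as simple exponential growth. The 1965 count of 64 transistors and the Pentium 4’s 42 million transistors are the figures cited in this essay; the 18- and 24-month doubling periods are the bounds quoted above, and the projection is only a back-of-the-envelope extrapolation, not a claim about how the industry actually grew.

    # A back-of-the-envelope sketch of Moore's Law as simple exponential growth.
    # The 1965 figure (64 transistors) and the Pentium 4 figure (42 million)
    # come from the essay; the 18- and 24-month doubling periods are the
    # bounds usually quoted for the law.

    def project(initial, years, doubling_period_years):
        """Project a quantity that doubles every doubling_period_years."""
        return initial * 2 ** (years / doubling_period_years)

    years = 2001 - 1965                  # from Moore's 1965 data point to the Pentium 4 era
    slow = project(64, years, 2.0)       # about 17 million with a 24-month doubling
    fast = project(64, years, 1.5)       # about 1.1 billion with an 18-month doubling
    print(f"{slow:,.0f} to {fast:,.0f} transistors")

    # The Pentium 4's 42 million transistors falls between the two bounds,
    # closer to the slower, 24-month doubling rate.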

The latter formulation of Moore’s Law has been given more depth by the MIT-educated computer scientist, entrepreneur and writer Ray Kurzweil. He has tracked the growth of computer power from the electromechanical punch card tabulators used in the 1890 census to Pentium 4 processors with 42 million transistors. Kurzweil foresees accelerating increases in computer power past the physical limits of silicon-based devices as manufacturers employ more exotic bio-chemical technologies.

Much thought has been given to whether Moore’s Law can really exceed the limits posed by silicon-based technology and the ever-increasing capital costs required to construct chip-manufacturing plants. Additional consideration has been given to what this increased computer power can be used for. Kurzweil is not shy about predicting a future with machines that are more intelligent than humans and computer implants interfaced to human minds. While increases in computer capacity have proven to be more persistent than anyone has a right to expect, predictions about the future abilities of artificial intelligence have a notorious record of over-optimism.

What has not received much thought is the rapid increase in data storage. Writing in American Scientist, Brian Hayes explains how recent changes in technology are actually increasing the rate of growth in disk storage. A large disk on a personal computer today holds about 120 GBytes. Technologies in the laboratory presently achieve storage densities equivalent to disks with 400 GBytes of storage.

At the present rate of increase, personal computer disks will reach 120 TBytes (120,000 GBytes) in size in ten years. Even if the growth rate decreases by 60 percent, the 120 TByte level will be reached in 15 years. What are we to do with this storage capability? Is natural American acquisitiveness sufficiently great to use up this space?
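
The projection is easy to check: growing from 120 GBytes to 120 TBytes is a factor of one thousand, or roughly ten doublings of capacity. A minimal sketch follows, assuming capacity doubles at a fixed interval; the twelve- and eighteen-month intervals are illustrative assumptions, not figures taken from Hayes’s article.

    import math

    # Growing from a 120 GByte disk to a 120 TByte disk is a factor of 1,000,
    # or about log2(1000) = 10 doublings of capacity.
    doublings_needed = math.log2(120_000 / 120)

    for doubling_period_years in (1.0, 1.5):
        years = doublings_needed * doubling_period_years
        print(f"doubling every {doubling_period_years:.1f} years -> "
              f"120 TBytes in about {years:.0f} years")

    # Doubling every twelve months gives roughly the ten-year figure above;
    # a slower, eighteen-month doubling stretches the wait to about 15 years.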

Recently, MP3 digital music files have been filling disks, especially in college dorms. However, as Hayes points out, if you stored enough music to listen to different songs 24 hours a day for an 80-year lifetime, you would barely fill a third of a 120 TByte disk. Even this assumes that storage technology would remain fixed over the 80-year lifetime.

Digital photographs are a new source of data filling up disks. Assuming each such photograph requires 1 MByte of storage and assuming an itchy shutter finger producing 100 photographs a day — certainly a well-documented life — less than 3% of the 120 TBytes would be filled.
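
Both of Hayes’s examples follow from the same back-of-the-envelope arithmetic. Here is a minimal sketch; the one-MByte photographs at 100 per day are the figures given above, while the assumption of roughly one MByte per minute of MP3 audio (a bitrate of about 128 kbps) is an added estimate, not a figure from the article.

    # Rough lifetime-storage arithmetic for the two examples above.
    # The 1 MByte-per-photo and 100-photos-per-day figures come from the essay;
    # the 1 MByte-per-minute MP3 figure assumes a bitrate of about 128 kbps.

    DISK_MB = 120 * 1_000_000        # a 120 TByte disk, in MBytes
    DAYS = 80 * 365.25               # an 80-year lifetime, in days

    music_mb = DAYS * 24 * 60 * 1.0  # continuous music, 24 hours a day
    photo_mb = DAYS * 100 * 1.0      # 100 one-MByte photographs a day

    for label, used in (("music", music_mb), ("photos", photo_mb)):
        print(f"{label}: {used / 1_000_000:5.1f} TBytes, "
              f"{used / DISK_MB:.0%} of the disk")

    # Music fills roughly a third of the disk (about 42 TBytes);
    # photographs fill less than 3 percent (about 2.9 TBytes).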

Fundamentally, storage of video is the only data source likely to fill 120 TByte disks. Even so, with capacities growing beyond 120 TBytes over our lifetimes, we likely face the prospect of being able to store far more data than we can generate. It is roughly comparable to having an attic that is growing so fast that we cannot fill it fast enough.

It seems that if we are having problems filling up new disks over a lifetime, the only solution is to increase lifetimes.

References:

  • Fixmer, Rob, “Internet Insight, Moore’s Law and Order,” eWeek, April 15, 2002.
  • Hayes, Brian, “Terabyte Territory,” American Scientist, 90, 212-216, May-June 2002.