
Showing papers in "Policy Review in 2002"


Journal Article
TL;DR: European intellectuals are nearly unanimous in the conviction that Americans and Europeans no longer share a common "strategic culture"; the European caricature at its most extreme depicts an America dominated by a "culture of death," its warlike temperament the natural product of a violent society where every man has a gun and the death penalty reigns.
Abstract: IT IS TIME to stop pretending that Europeans and Americans share a common view of the world, or even that they occupy the same world. On the all-important question of power -- the efficacy of power, the morality of power, the desirability of power -- American and European perspectives are diverging. Europe is turning away from power, or to put it a little differently, it is moving beyond power into a self-contained world of laws and rules and transnational negotiation and cooperation. It is entering a post-historical paradise of peace and relative prosperity, the realization of Kant's "Perpetual Peace." The United States, meanwhile, remains mired in history, exercising power in the anarchic Hobbesian world where international laws and rules are unreliable and where true security and the defense and promotion of a liberal order still depend on the possession and use of military might. That is why on major strategic and international questions today, Americans are from Mars and Europeans are from Venus: They agree on little and understand one another less and less. And this state of affairs is not transitory -- the product of one American election or one catastrophic event. The reasons for the transatlantic divide are deep, long in development, and likely to endure. When it comes to setting national priorities, determining threats, defining challenges, and fashioning and implementing foreign and defense policies, the United States and Europe have parted ways. It is easier to see the contrast as an American living in Europe. Europeans are more conscious of the growing differences, perhaps because they fear them more. European intellectuals are nearly unanimous in the conviction that Americans and Europeans no longer share a common "strategic culture." The European caricature at its most extreme depicts an America dominated by a "culture of death," its warlike temperament the natural product of a violent society where every man has a gun and the death penalty reigns. But even those who do not make this crude link agree there are profound differences in the way the United States and Europe conduct foreign policy. The United States, they argue, resorts to force more quickly and, compared with Europe, is less patient with diplomacy. Americans generally see the world divided between good and evil, between friends and enemies, while Europeans see a more complex picture. When confronting real or potential adversaries, Americans generally favor policies of coercion rather than persuasion, emphasizing punitive sanctions over inducements to better behavior, the stick over the carrot. Americans tend to seek finality in international affairs: They want problems solved, threats eliminated. And, of course, Americans increasingly tend toward unilateralism in international affairs. They are less inclined to act through international institutions such as the United Nations, less inclined to work cooperatively with other nations to pursue common goals, more skeptical about international law and more willing to operate outside its strictures when they deem it necessary, or even merely useful. (1) Europeans insist they approach problems with greater nuance and sophistication. They try to influence others through subtlety and indirection. They are more tolerant of failure, more patient when solutions don't come quickly. They generally favor peaceful responses to problems, preferring negotiation, diplomacy, and persuasion to coercion.
They are quicker to appeal to international law, international conventions, and international opinion to adjudicate disputes. They try to use commercial and economic ties to bind nations together. They often emphasize process over result, believing that ultimately process can become substance. This European dual portrait is a caricature, of course, with its share of exaggerations and oversimplifications. One cannot generalize about Europeans: Britons may have a more "American" view of power than many of their fellow Europeans on the continent. …

377 citations


Journal Article
TL;DR: The treaty establishing the International Criminal Court (ICC) created what many describe as the most important international institution since the United Nations over the opposition of the most powerful nation in the world; the United States has declared that it will not participate in or be bound by the court in any way.
Abstract: IN APRIL 2002, delegates from 66 nations and dozens of nongovernmental organizations (NGOs) gathered at United Nations headquarters in New York to celebrate the ratification of the treaty creating the International Criminal Court (ICC). In the back of the room, the chair reserved for the delegate from the United States stood empty, and in a subsequent letter the Bush administration confirmed that the U.S. would not participate in or be bound by the court in any way. The treaty establishing the court entered into force on July 1, 2002, thereby creating what many describe as the most important international institution since the United Nations over the opposition of the most powerful nation in the world. The celebratory spirit in New York was nothing compared to the delegates' reaction in Rome in the summer of 1998. After five weeks of negotiation over the ICC treaty, the United States was clearly frustrated with the power politics and maneuvering of a group that called itself "like-minded" states and their collaborators, the NGOs. On the final day, the U.S. called for a vote and found itself on the losing side by a stunning 129-7. Normally reserved diplomats broke out in cheers and chants, accompanied by rhythmic stomping and applause. Yes, the treaty creating an International Criminal Court had been approved, but more than that, the "new diplomacy" had won a major victory over the United States. Debuting as the "Ottawa Process" in 1996, the new diplomacy successfully led a fast-track campaign of NGOs and small and medium-sized nations to a treaty banning anti-personnel land mines. The bold break from traditional processes, the innovative methodology, and the amazing speed of these efforts won widespread attention, as well as a share of the 1997 Nobel Peace Prize for the NGO leader, American Jody Williams. Still, there were unique features to developing the land mine treaty that did not seem easily replicable, and few could foresee that the Ottawa Process might be the first act of a major new diplomatic drama. With Act Two, the establishment of an International Criminal Court, under its belt, the new diplomacy has now moved from its Ottawa debut to the center stage of the diplomatic world in Rome and New York. It is time for a critical review of its performance, including an understanding of its actors and methods, how others including the United States might interact with it, and what the future for the new diplomacy may hold. The end of Cold War diplomacy THE END OF THE Cold War and its predictable structure of international relations set the stage for new forms of diplomacy. From the close of World War II to the fall of the Berlin wall, the great powers that opposed Hitler dominated the diplomatic stage. In a bipolar world based on ideology, the opposing forces lined up in conventional ways, with both military and diplomatic battles fought between states. Even the structure of international organizations such as the United Nations bore the stamp of the great powers, with the five permanent members of the Security Council able to veto proposals not consonant with their national interest. The Cold War drama generally pitted the U.S. versus the Soviet Union, often involving surrogate states. In his 1992 State of the Union address, President George H.W. Bush took note of the changing global scene, boasting that the United States was now the world's "sole and preeminent power" and the "undisputed leader of the age."
Familiar, perhaps even comfortable, with a bipolar world, experts began the search for the next American rival. Much attention focused on China, though it seemed to be some distance away from superpower status. Others wondered whether Europe, beginning to band together for trade and monetary policy, might form an influential bloc. Most assumed that the U.S. alone would dominate the post-Cold War world or that, ultimately, alliances or other countries would rise to challenge its leadership. …

56 citations


Journal Article
TL;DR: For 50 years, the United States and its European allies cooperated in a grand strategic venture to create a democratic, peaceful, prosperous continent free of threats from within and without; that task is now approaching completion even as new threats emerge from beyond the continent.
Abstract: FOR 50 YEARS and more, the United States and our European allies cooperated in a grand strategic venture to create a democratic, peaceful, prosperous continent free of threats from within and without. At the dawn of a new century, that task is approaching completion. This autumn both NATO and the EU are likely to launch so-called "Big Bang" rounds of enlargement, encompassing up to seven and 10 countries, respectively. If successful, these moves will help lock in democracy and security from the Baltic to the Black Sea. Relations between Russia and the West are also back on track. Russian President Vladimir Putin has opted to protect Moscow's interests by cooperating with the U.S. and Europe rather than by trying to play a spoiler role. The certitude of that decision and, above all, the depth of Moscow's commitment to democracy at home remain open questions. But Putin's turn to the West has further reduced the risk that Russia might again become a strategic adversary and has instead opened a window to put the West's relations with Russia on a more stable and cooperative footing. There is still work to be done. Not all of the European democracies are fully functional and not all of the European economies are prosperous. Completing Central and Eastern Europe's integration will take time even after they join NATO and the EU. Balkan instability has been stemmed but the underlying tensions are not yet resolved. Ukraine's westward integration and that of Russia will remain works in progress for years to come. And the West is only waking up to the challenge of the Caucasus and Central Asia. But the key cornerstones of a new, peaceful European order are in place. The grand strategic issues that preoccupied statesmen and strategists for the second half of the twentieth century -- Germany's internal order and place in Europe, the anchoring of Central and Eastern Europe to the West, and the establishment of the foundation for a democratic Russia to integrate itself with Europe -- have been or are in the process of being largely resolved. Europe today is at peace with itself and more democratic and secure than at any time in history. If Harry Truman and his European counterparts could look down upon us today, they would no doubt be proud of what has been accomplished in their names. Unfortunately, there is bad news too. The extraordinary accomplishment of the Atlantic alliance does not mean that America and Europe are now safe and secure. Success on the continent has been matched by the emergence of new threats from beyond. September 11 has brought home what a number of strategists have been predicting for years -- that the new century would usher in new, different, and potentially very dangerous threats to our societies. On the verge of eradicating the danger to our societies from intra-European war and thermonuclear exchanges, we are faced with new scourges -- terrorism, weapons of mass destruction, mass migrations, rogue and failed states, and the threat of disruptions to the economic lifelines of the world. September 11 has become a symbol and metaphor for the new perils looming on the horizon. No one can doubt that Osama bin Laden would have used weapons of mass destruction on September 11 if he had had them. We know that al Qaeda and similar groups are trying to obtain such weapons and will, in all probability, use them if they succeed. The odds of their success are too good for comfort.
Indeed, the likelihood of weapons of mass destruction being used against our citizens and societies is probably greater today than at any time since the Cuban missile crisis. While America is the target of choice for these terrorists, Europe may not be far behind. It was certainly no accident that the United States was struck September 11, but it is not much of a stretch to imagine a similar attack on Europe in the future. There is already ample evidence of past terrorist plots by these groups on the continent. …

20 citations


Journal Article
TL;DR: Invoking Feynman's notion of "cargo cult science," the author argues that educational research sometimes adopts the outward form of science without burrowing to its essence, outlines some fundamental reasons why it has not provided dependable guidance for policy, and suggests how to repair what it lacks.
Abstract: "We really ought to look into theories that don't work, and science that isn't science. I think the educational ... studies I mentioned are examples of what I would like to call cargo cult science. In the South Seas there is a cargo cult of people. During the war they saw airplanes with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head for headphones and bars of bamboo sticking out like antennas -- he's the controller -- and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land." Richard P. Feynman, "Cargo Cult Science," Surely You're Joking, Mr Feynman!: Adventures of a Curious Character (Norton, 1985). AFTER MANY YEARS of educational research, it is disconcerting -- and also deeply significant -- that we have little dependable research guidance for school policy. We have useful statistics in the form of test scores that indicate the achievement level of children, schools, and districts. But we do not have causal analyses of these data that could reliably lead to significant improvement. Richard Feynman, in his comment on "cargo cult science," identifies part of the reason for this shortcoming -- that while educational research sometimes adopts the outward form of science, it does not burrow to its essence. For Feynman, the essence of good science is doing whatever is necessary to get to reliable and usable knowledge -- a goal not necessarily achieved by merely following the external forms of a "method." The statistical methods of educational research have become highly sophisticated. But the quality of the statistical analysis is much higher than its practical utility. Despite the high claims being made for statistical techniques like regression analysis, or experimental techniques like random assignment of students into experimental and control groups, classroom-based research (as contrasted with laboratory research) has not been able to rid itself of uncontrolled influences called "noise" that have made it impossible to tease out the relative contributions of the various factors that have led to "statistically significant" results. This is a chief reason for the unreliability and fruitlessness of current classroom research. An uncertainty principle subsists at its heart. As a consequence, every partisan in the education wars is able to utter the words "research has shown" in support of almost any position. Thus "research" is invoked as a rhetorical weapon -- its main current use. In this essay I shall outline some fundamental reasons why educational research has not provided dependable guidance for policy, and suggest how to repair what it lacks. On a positive note, there already exists some reliable research on which educational policy could and should be based, found mainly (though not exclusively) in cognitive psychology. 
In the end, both naturalistic research and laboratory research in education have a duty to accompany their findings with plausible accounts of their actual implications for policy -- as regards both the relative cost of the policy in money and time and the relative gain that may be expected from it in comparison with rival policies. Including this neglected dimension might wonderfully concentrate the research mind, and lead to better science in the high sense defined by Feynman. A tale of two studies THE NOVEMBER 2001 issue of Scientific American includes an article called "Does Class Size Matter? …

17 citations


Journal Article
TL;DR: Britain alone among America's European allies had both the capability and the will to assist in the war in Afghanistan, and British public support for military action ran high; yet the author argues that these exploits mattered little to the overall outcome of what has been America's war.
Abstract: THE FORMER U.S. AMBASSADOR to the Court of St. James, Ray Seitz, recalls in his autobiography preparations for President Bill Clinton's first meeting with Britain's then-prime minister, John Major. Sitting in the Oval Office, the president was reminded by one of his aides to mention the magic phrase "special relationship." "Oh yes," said Clinton. "How could I forget?" And he burst out laughing. The events of September 11 cast a different light on this joke, as they did on the frivolity of the rest of the Clinton years. Ten days after al Qaeda's attacks on New York and Washington, President George W. Bush told Congress that America had "no truer friend than Great Britain." This was more than a gracious compliment to British Prime Minister Tony Blair, who was listening in the gallery. Americans, including America's commander in chief, were moved by London's response to the attacks on the United States. Everyone (bar Saddam Hussein) had condemned the loss of civilian life. But whereas, for example, the Belgian foreign minister -- acting as president of the European Union -- was soon talking of "limits to [EU-U.S.] solidarity," Blair's anger was unmistakable and his robustness unwavering. The British prime minister spoke of "barbarism" and "shame for all eternity....Are we at war with the people who committed this terrible atrocity? Absolutely." Blair's sentiments were palpably sincere. But they also reflected his intuitive grasp of Britain's national mood: The truth is that the nation, as a whole, clearly did exhibit far stronger ties of sympathetic solidarity with America than did anyone else. According to a Daily Telegraph/Gallup poll taken in early October, 70 percent of Britons supported military action against Afghanistan: They did so even if it meant large numbers of Afghan civilian casualties and despite the risk of substantial numbers of British troops being killed or wounded. So when the following month a Newsweek poll asked Americans whether particular countries had done enough to support the U.S. during the crisis, Britain earned the highest approval ratings. Blair's own popularity in the United States soared to levels rarely if ever enjoyed by a foreign statesman. Equally significant, American conservatives previously unenamored with the British prime minister were suddenly effusive with praise. All of which was highly gratifying to enthusiasts of the Anglo-American "special relationship" -- a somewhat embattled political minority on both sides of the Atlantic in recent years. But, viewed objectively, the practical results of the revivified "special relationship" have turned out to be meager, in some ways plain disappointing. Capabilities and will OF AMERICA'S EUROPEAN allies only France and Britain possessed a significant capacity to assist in the war on terrorism, and only Britain had the will. A British task force was accordingly deployed in the Gulf; British submarines fired Tomahawks against Taliban targets on two occasions. Within Afghanistan, members of Britain's SAS regiment -- without doubt the most skilled special service forces in the world -- performed taxing and dangerous tasks with great success, notably in attacking the al Qaeda training camp outside Kandahar and in hand-to-hand fighting in the Tora Bora region. British forces are still involved in mopping-up operations against the enemy. The pity is that from first to last these exploits have mattered little in the overall outcome. This has been America's war, and the U.S.
has fought it according to its own battle plan and almost entirely with its own resources. The military cooperation of the Central Asian states of Uzbekistan and Tajikistan -- frontline but hardly first-rate powers -- probably mattered more at the crucial stages than that of Britain. Moreover, the significance of the 200 or so British soldiers sent to fight by America's side in Afghanistan is put into some perspective by the British Foreign Office's estimate that about the same number of British citizens were engaged on the side of al Qaeda. …

11 citations


Journal Article
TL;DR: Just as Montezuma was at a loss to know what to make of Cortes's arrival and fell back on familiar categories, Americans confronted the enigma of the September 11 attacks by reducing it to something already known; naming the event "9-11" did not answer the great question of what it all meant.
Abstract: "KNOW YOUR ENEMY" is a well-known maxim, but one that is difficult to observe in practice. Nor is the reason for this hard to fathom: If you are my enemy, it is unlikely that I will go very much out of way to learn to see things from your point of view. And if this is true even in those cases where the conflict is between groups that share a common culture, how much more true will it be when there is a profound cultural and psychological chasm between the antagonists? Yet, paradoxically, this failure to understand the enemy can arise not only from a lack of sympathy with his position, but also from a kind of misplaced sympathy: When confronted by a culturally exotic enemy, our first instinct is to understand such conduct in terms that are familiar to us -- terms that make sense to us in light of our own fund of experience. We assume that if our enemy is doing x, it must be for reasons that are comprehensible in terms of our universe. Just how unfortunate -- indeed, fatal -- this approach can be was demonstrated during the Spanish conquest of Mexico. When Montezuma learned of Cortes's arrival, he was at a loss to know what to make of the event. Who were these white-skinned alien beings? What had they come for? What were their intentions? These were clearly not questions that Montezuma was in a position to answer. Nothing in his world could possibly provide him with a key to deciphering correctly the motives of a man as cunning, resourceful, and determined as Cortes. And this meant that Montezuma, who, after all, had to do something, was forced to deploy categories drawn from the fund of experience that was ready-to-hand in the Aztec world. By a fatal coincidence, this fund of experience chanced to contain a remarkable prefiguring of Cortes -- the myth of the white-skinned god, Quetzalcoatl. And, indeed, the parallels were uncanny. But, of course, as Montezuma eventually learned, Cortes was not Quetzalcoatl, and he had not appeared on the coast of Mexico in order to bring blessings. We should not be too harsh on Montezuma. He was, after all, acting exactly as we all act under similar circumstances. We all want to make sense of our world, and at no time more urgently than when our world is suddenly behaving strangely. But in order to make sense of such strangeness, we must be able to reduce it to something that is not strange -- something that is already known to us, something we know our way around. Yet this entirely human response, as Montezuma learned to his regret, can sometimes be very dangerous. An act of war? ON SEPTEMBER 11, 2001, Americans were confronted by an enigma similar to that presented to the Aztecs -- an enigma so baffling that even elementary questions of nomenclature posed a problem: What words or phrase should we use merely to refer to the events of that day? Was it a disaster? Or perhaps a tragedy? Was it a criminal act, or was it an act of war? Indeed, one awkward TV anchorman, in groping for the proper handle, fecklessly called it an accident. But eventually the collective and unconscious wisdom that governs such matters prevailed. Words failed, then fell away completely, and all that was left behind was the bleak but monumentally poignant set of numbers, 9-11. But this did nor answer the great question: What did it all mean? In the early days, there were many who were convinced that they knew the answer to this question. 
A few held that we had got what we had coming: It was just deserts for Bush's refusal to sign the Kyoto treaty or the predictable product of the U.S. decision to snub the Durban conference on racism. Others held, with perhaps a greater semblance of plausibility, that the explanation of 9-11 was to be sought in what was called, through an invariable horticultural metaphor, the "root cause" of terrorism. Eliminate poverty, or economic imperialism, or global warming, and such acts of terrorism would cease. …

10 citations


Journal Article
TL;DR: In the United States, Islam is the most heavily male religion, with roughly two men for every woman; the largest number of Muslim immigrants derive from three main sources: South Asia, Iran, and the Arabic-speaking countries.
Abstract: OUR RESPECTIVE BOOKSHELVES groan under the weight of books bearing titles like Islam and the West, The Future of Islam and the West, and The Islamic World and the West. What is striking about these books -- all quite recently written and published -- is the anachronism of their geographic premise. With millions of Muslims now living in the West, especially in North America and Western Europe, the old dichotomy of Islam and the West exists no more. This presence of Muslims in the West has profound importance for both civilizations involved, the Western and the Islamic, and has a potential for both good and ill. Indeed, looking ahead, it is hard to see any other cultural interaction quite so fraught with implications as this one. As has become evident of late, a vast number of Muslims, those living in Europe and the Americas no less than those elsewhere, harbor an intense hostility to the West. For most Muslims, this mix of envy and resentment remains a latent sentiment, but for some it acquires operational significance. Merely to conjure the names of Ayatollah Khomeini, Muammar Qaddafi, Saddam Hussein, and Osama bin Laden is to convey the power of this hatred, its diverse ideological roots, and its power to threaten. Their counterparts also live in the West, where they have a unique inclination not just to disrupt through violence but also to challenge the existing order. Will this challenge be contained or will it bring yet greater problems, including violence? This essay focuses on just one portion of Western Islam, namely those Muslims who live in the United States and who are either immigrants or their descendants (hereafter referred to as "Muslim immigrants"). It does not deal with the other major component, the converts, nor does it deal with other Western countries. Demography and geography The first challenge in studying Muslim immigrants in the United States is counting them. By law, the official census cannot count adherents of a religion, and Muslims are too few to show up reliably in most survey research. In addition, there are questions about whom to count (do Ahmadis, legally not considered Muslims in Pakistan, count as Muslims in the United States?). Taking these and other complications into account, a statistical picture is emerging that points to a total Muslim population in the United States of about 3 million, of which immigrants make up two-thirds to three-fourths. Accepting that this number is necessarily rough, it does point to somewhat over 2 million Muslim immigrants, or slightly less than 1 percent of the U.S. national population. Immigrant Muslims are ethnically extremely varied, coming from virtually every country where Muslims live, or well over a hundred countries in all. Symbolic of this diversity, Los Angeles alone boasts such exotic food fare as the Chinese Islamic Restaurant and the Thai Islamic Restaurant. The largest number of immigrants derive from three main sources: South Asia, Iran, and the Arabic-speaking countries. The single largest group of Muslim immigrants are those from South Asia (meaning Bangladesh, India, and Pakistan), followed by perhaps 500,000 Iranians and 400,000 from the Arab countries. Shi'is, who make up about 10 percent of the worldwide Muslim population, probably comprise about the same percentage of the U.S. Muslim population. Like most immigrant communities, Muslims are considerably younger than the national average and heavily weighted toward males. 
Indeed, Islam is the most male religion in the United States, with roughly two men for every woman. There are many reasons for this imbalance, some of them concerning the mostly African-American convert population, others having to do with the general pattern of immigration in which men move to an area before women follow them. Other factors pertain to the specifics of Muslim immigration; for example, several thousand former soldiers of the Iraqi army who defected during and after the Gulf War were settled in the United States. …

9 citations


Journal Article
TL;DR: In the aftermath of the September 11 terrorist attacks on the United States, numerous Chinese web users gloated in chat rooms over America's national tragedy; declaring that the attacks were payback for America's imperialistic foreign policy, they rejoiced at the sight of the "world's policeman" being dealt a colossal blow.
Abstract: IN THE AFTERMATH of the terrorist attacks on the United States on September 11, numerous Chinese web users gloated in chat rooms over America's national tragedy. Declaring that the attacks were payback for America's imperialistic foreign policy, they rejoiced at the sight of the "world's policeman" being dealt a colossal blow. To be sure, these Chinese were not the only ones who displayed little sympathy for America's grief. Most notably, Palestinians in the West Bank celebrated by passing out candy to children and dancing in the streets. Yet gloating from the Chinese remains deeply disturbing, as these are the very people on whose behalf U.S. policymakers have claimed to seek freedom and democracy in the past 12 years. That the gloating comes from the Chinese internet generation is even more unsettling, for this small but rapidly growing population has been widely hailed by the Chinese and U.S. governments as the bright future of a more modern, more open, and more liberal twenty-first century China. At this time of persistent national soul-searching about the nature and merits of U.S. foreign policy, a close examination of the grave disconnect between Washington and the people of China is sorely needed. A more (and less) Americanized China EVER SINCE THE government of China opened fire on peaceful demonstrators demanding democracy at Beijing's Tiananmen Square on June 4, 1989, American criticism of an authoritarian Chinese regime that has been reluctant to democratize has been a constant. Policymakers left and right have claimed that by fighting for liberty and democracy in China, not only are they upholding values and principles upon which this country was founded, but they are also fighting for the Chinese people who cannot and perhaps dare not speak up against their own government. As Rep. Henry Hyde, Chairman of the House International Relations Committee, said, "We shall remain with [the Chinese people] until they are free, however long the struggle." As it turns out, the Chinese people, in no uncertain terms, have repeatedly said, "No, thank you." In just the past couple of years, a number of spontaneous outbreaks of anti-Americanism in China have given voice to this sentiment. In May 1999, when NATO bombed the Belgrade Chinese embassy in what Americans called an accident, massive anti-American riots erupted throughout China. The destruction of American property, physical and verbal intimidation of Americans, and protests led by the chant of "Down with the USA" paralyzed major Chinese cities for days. This past April, when an American EP-3 surveillance plane and a Chinese fighter plane collided during what Americans referred to as routine intelligence gathering near the south China coast, Chinese on the street and in internet chat rooms threatened to "teach the United States a lesson" in "World War III." The gloating on the internet post-September 11 emerged as the latest manifestation of pent-up Chinese frustration with the United States. It is difficult for Americans to understand Chinese hostility toward them. After all, it was less than 13 years ago that students and workers piled into Beijing's Tiananmen Square demanding a free and liberal society modeled after the United States. Since then, economic liberalization has brought about an ever more American look and feel to China. In a country where everyone used to wear drab Mao suits colored in only gray, blue, and black, the Chinese now sport Nike shoes, NBA T-shirts, and Levi's jeans. 
McDonald's, KFC, and Pizza Hut decorate corners of Chinese cities, and products manufactured by Kodak, Coca-Cola, and Procter & Gamble are used in urban households throughout China. Many Chinese also have become increasingly "Americanized" themselves: working in American-based multinationals, seeking and receiving American venture capital funding for businesses, studying abroad in the United States, watching American movies, reading American news sources online, and admiring American popular culture icons from Madonna to Michael Jordan. …

8 citations


Journal Article
TL;DR: The war on terrorism is a new variation of the old war against the anti-democratic "isms" of the previous century: defending liberty at home and spreading liberty abroad was the task during World War II, was again our objective (or should have been) during the Cold War, and must be our mission again.
Abstract: THE IMMEDIATE RESPONSE of President Bush and his administration to the September 11, 2001 terrorist attacks against the United States was superb, both purposeful and principled -- a military, political, and diplomatic success. But what comes next? In his State of the Union address, Bush suggested specific targets of future phases of the war -- the "axis of evil" of Iraq, Iran, and North Korea. But what has been missing in the discussion of the second stage (and perhaps the third, fourth, and fifth stages) of the war on terrorism is an articulation of the general principles that will guide policy in difficult times ahead. The new threat to American national security and the American way of life is no less threatening than such earlier challenges as the defeat of fascism in Europe and imperialism in Japan during World War II, or the containment and ultimate destruction of world communism during the Cold War. A grand vision of the purposes of American power is needed not only to shape strategy, but also to sustain support from the American people and America's allies. During the twentieth century, the central purpose of American power was to defend against and when possible to destroy tyranny. American presidents have been at their best when they have embraced the mission of defending liberty at home and spreading liberty abroad. This was the task during World War II, and it was again our objective (or should have been the mission) during the Cold War. It must be our mission again. In fact, the war on terrorism is a new variation of the old war against the anti-democratic "isms" of the previous century. Adherence to a liberty doctrine as a guide to American foreign policy means pushing to the top of the agenda the promotion of individual freedoms abroad. The expansion of individual liberty in economic and political affairs in turn stimulates the development and consolidation of democratic regimes. To promote liberty requires first the containment, and then the elimination, of those forces opposed to liberty, be they individuals, movements, or regimes. Next comes the construction of pro-liberty forces, be they democrats, democratic movements, or democratic institutions. Finally comes the establishment of governments that value and protect the liberty of their own people as the United States does. Obviously, the United States does not have the means to deliver liberty to all subjugated people around the world at the same time. And the spread of liberty and democracy will not always be simultaneous. In some places, the promotion of individual freedoms must come first, democratization second. Nonetheless, the spread of liberty should be the lofty and broad goal that organizes American foreign policy for the coming decades. By defining the purposes of American power in these terms, American foreign policymakers achieve several objectives not attainable by narrower or less normative doctrines. First, the liberty doctrine, like containment during the Cold War, is useful in clarifying the relationship between often very different policies. Toppling Saddam Hussein does in fact have something in common with providing education to Afghan women, and a liberty doctrine allows us to see it clearly. Second, the liberty doctrine properly defines our new struggle in terms of ideas, individuals, and regimes -- not in terms of states. Allies of liberty exist everywhere, most certainly in Iran and even in Iraq.
Likewise, not all the enemies of liberty are states; they also include non-governmental organizations like al Qaeda. Third, the liberty doctrine provides a cause that others -- allies of the United States as well as states, movements, and individuals not necessarily supportive of all U.S. strategic interests -- can support. For example, the Iraqi regime constitutes an immediate threat to American national security but does not pose the same threat to France or Russia. …

7 citations


Journal Article
TL;DR: The tax bite in the United States is one-third of gross domestic product (GDP); in the Western European democracies, the tax take reaches up to 50 percent.
Abstract: IT IS SAID THAT TAXES are the price we pay for a civilized society. In modern times, this has meant more and higher taxes, rarely fewer and lower taxes. The tax bite in the United States is one-third of the gross domestic product (GDP). In the Western European democracies, the tax take reaches up to 50 percent. It was not always so. At the turn of the twentieth century, the tax bite in the United States was a low 10 percent of GDP. And even that level was high by the standards of the American colonies. The first few generations of immigrants who settled the American colonies paid only those taxes that were necessary to provide security against internal and external enemies, a system of courts and justice, prisons, roads, schools, public buildings, poor relief, and churches in some colonies. This consumed no more than a few percentage points of their income. Moreover, the early settlers sought to minimize, avoid, and evade those modest taxes to the maximum possible extent. Only in wartime were they amenable to higher taxes, after which taxes were rolled back to the previous low level. The early colonists did not flee Europe to pay high taxes in the New World. Prelude to the American colonies BEGINNING IN THE fifteenth century, dreams of gold, silver, spices, and other trading opportunities motivated European adventurers to explore and claim tracts of land in Africa, Asia, and the Americas for their sovereigns and hefty rewards for themselves. The Portuguese, Spanish, French, Dutch, Swedes, Danes, and English engaged in a great land rush. The Portuguese and Spanish divided up most of Latin America. The French seized portions of Canada and several Caribbean islands. The Swedes and Danes briefly occupied parts of Delaware and several Caribbean islands. Dutchmen briefly ruled New York and settled two groups of Caribbean islands. The English (British after union with Scotland in 1707) established colonies along the Atlantic Coast stretching from Newfoundland to South Carolina (founding Georgia in the eighteenth century) along with Bermuda and numerous Caribbean islands. Most Americans know the story of the first colonial settlement in Jamestown, Virginia, in 1607, and that of the Mayflower Compact and the New Plymouth settlement in 1620. Apart from stories of personalities, religious disputes, immigration from Europe, and the beginnings of slavery, the public is less conversant with the subsequent development of the American colonies from the founding of Jamestown and New Plymouth until the opening salvo of the French and Indian Wars in 1754. Although numerous economic accounts have been written about the early colonies, few are explicitly concerned with taxation. Most books and articles deal with daily economic life. Taxation became a central theme only from the end of the French and Indian Wars up to the Declaration of Independence. To this day, there is no single comprehensive volume on taxation during the colonial period. To understand "no taxation without representation" and Americans' skepticism of taxes requires a more comprehensive review of colonial taxation than the Stamp Acts and the Boston Tea Party. This article is the first in a series examining the colonial roots of American taxation. This essay reviews the first century of colonial taxation in America. Others will survey 1700 through the French and Indian Wars and the years leading up to the Declaration of Independence, the Articles of Confederation, and, finally, the Constitution of the United States. 
Taken together, these essays will show the limited scope and low rates of taxation, the response of the colonists to taxation, and the purposes on which public funds were spent between 1607 and 1783, a period encompassing 176 years. Founding and growth of the American colonies THE FOUNDING AND growth of the original American colonies were a slow process. …

7 citations


Journal Article
TL;DR: The September 11 terrorist attacks provided a grim but important test of the resilience of American power, and the U.S. economic, military, and political reaction to the attacks has strengthened American dominance of the international system.
Abstract: THE END OF THE twentieth century has been characterized by the exceptional strength of American power in the international system. In all of the major indices of power -- economic, military, and political strength -- the United States weighs in the heaviest, although the European Union as a whole now holds a slightly larger share of the world's social product than the U.S. The U.S. has the largest and most dynamic economy, the most fearsome and adaptive military, and a public opinion surprisingly cohesive in supporting the government's choices about America's role in the world. While the country doesn't lack political schisms or divisive social issues, it is fundamentally centripetal in its character. These advantages were all likely to reinforce American power in the international system as globalization advances. The September 11, 2001, terrorist attacks on the United States provided a grim but important test of the resilience of American power across these sectors. America's economic, military, and political reaction to the attacks has strengthened U.S. dominance of the international system. The strengths of the U.S. are oddly self-reinforcing in all three aspects of power and have demonstrated a resilience and sturdiness in reaction to the challenges since September 11. However, international responses to the acceleration of American power, while largely positive, suggest the possibility of an anti-American backlash. Outside the U.S., a feeling of dependence combined with uncertainty over the motivations of American power breeds reservations and resistance. It can be difficult even for well-intentioned foreign leaders, operating within their own political systems, to guarantee full-scale allegiance. For the U.S., the question is how best to use and to preserve its near-hegemonic power; for others, how to deal with a degree of U.S. dominance that is reminiscent, on a global level, of America's role within the West after 1945. No other nation but the United States would have been able to give a similarly forceful, coherent, and successful answer to the existential challenge of September 11. As world reaction demonstrates, American leadership is not only valued but actively sought in response to the severe threats to national and international security that have become apparent in recent months. We believe that American power could be most effectively perpetuated and expanded, and the interests of America's allies best served, by engaging in the construction of international institutions and practices more closely aligned to emergent security challenges. Current alliances and institutions are not ideally suited to the task. At the same time, the expanse of American power and the constraining efforts of allies are leading America to forgo international cooperation -- to its own detriment. The critical tests of American power in the twenty-first century will be whether the U.S. has the vision to lead the design of an international architecture that fosters American interests and the interests of other states, as did the post-World War II architecture, and whether it can do so in ways that share the burden of sustaining this order so that it becomes self-reinforcing. Economic power EXPECTATIONS OF A widespread and deep global recession were the conventional wisdom among economic forecasters on September 11. 
The prospect of a "perfect storm" seemed conceivable: a fragile American economy tilting into recession, weakness in other major G-7 economies, capital flight from the U.S., insufficient liquidity to assure settlements, impediments to the freedom and efficiency of commerce brought about by increased security measures, and clumsy leadership all combining catastrophically to damage confidence in the U.S. economy, extending and deepening the global recession. An economic blow to the American economy seemed, from the targeting of the World Trade Center, an important objective of the terrorists and one with reasonable prospects of success. …

Journal Article
TL;DR: Opponents of school choice warn that even if private schools were required to take a certain percentage of disabled students, they "tend not to provide needed services for children with special education needs or for children who speak English as a second language."
Abstract: IF THE OPPONENTS of school choice could have their way, the national debate over the use of public money to subsidize private schooling would turn on the subject of special education. With research demonstrating the overall success of school voucher programs in Milwaukee and Cleveland, and with the constitutional issue of public funding of religiously affiliated schools headed for resolution in a seemingly God-tolerant Supreme Court, defenders of the educational status quo have been reduced to fanning fears that government support of greater parental choice would transform public schools into dumping grounds for difficult-to-educate students. Sandra Feldman, president of the American Federation of Teachers, repeatedly warns that, with private education more accessible to the poor and middle class, good students will "flee" to independent and parochial schools, leaving behind those kids who are physically and emotionally handicapped, are hyperactive, or have been involved with the juvenile justice system. "[P]rivate schools don't have to take [the learning-disabled]," agrees Tammy Johnson of the liberal activist group Wisconsin Citizen Action, so public schools would be left "to deal with those children." Even if private schools were required to take a certain percentage of disabled students, adds Rethinking Schools, an online publication of teachers opposed to school choice, they "tend not to provide needed services for children with special education needs or for children who speak English as a second language." NAACP president Kweisi Mfume predicts that the true cost of private education will always exceed what the government can afford to cover, so "those in the upper- and middle-income brackets will be helped the most as long as their kids don't have personal, behavioral, or educational challenges that cause the private school to pass them by." Given the large number of parents who have come to rely on special education services provided through America's public schools, this strategy of conjuring a worst-case scenario for learning-disabled students would at first appear a promising one. According to the Seventeenth Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act, over 5.37 million children - percent of American students diagnosed with "special needs" - currently participate in public school special education programs; their parents, many of whom have become adept at using the legal system to access an estimated $32 billion in annual services, are a potent political force. The vast majority of these parents have come to believe that their own son or daughter benefits most from being educated in the same classes as normal students - a remedial philosophy known as "inclusion" - and would vigorously oppose any policy that threatens to isolate special-needs children in separate schools for the learning-disabled. The argument that school choice must inevitably create special education ghettos would appear to have been strengthened by the recent adoption of market-based education reforms in New Zealand. In the late 1980s, that country's Labour government undertook a sweeping reorganization of its highly centralized education system, replacing the Department of Education and its 4,000 employees with a new Ministry of Education staffed by only 400 people and putting each local school under the control of a community board of trustees. At the same time, the government abolished school zoning, allowing children to transfer freely between schools,
even to private schools, at state expense. A recent book on these New Zealand reforms by school choice opponents Edward Fiske and Helen Ladd, When Schools Compete: A Cautionary Tale, makes much of a flaw in the initial legislation, which permitted the more popular public schools to reject students who would be costly to educate or whose disabilities might drag down the test averages. …

Journal Article
TL;DR: For more than 40 years, the U.S. defense community held a shared view that the overriding purpose of strategic nuclear forces was to deter war with the Soviet Union, a task that had to be accomplished in the face of an overwhelming Soviet conventional capability and under conditions dictated by the presence of vulnerable allies close to Soviet territory.
Abstract: THE COLD WAR CONSENSUS on the role of nuclear arms in American national security has dissolved, a casualty of the demise of the Warsaw Pact and the Soviet empire itself. In place of a threat posed by an adversary commanding superior conventional forces, the United States now faces the prospect of multiple potential opponents with variable motives, shifting sources of conflict, and evolving alliance relationships. In this environment, even assuming sharply lower levels of nuclear warheads, the U.S. needs a more flexible nuclear doctrine, based on approaches that simultaneously assure friends of a steadfast U.S. security commitment, prevent prospective enemies from pursuing weapons of mass destruction, deter direct threats against America's interests and allies, and promise the defeat of any attack. These new realities dictate a more nuanced role for nuclear weapons, both in terms of the capabilities we pursue and the scenarios governing their use, even as we retain an unmistakably robust, diversified, balanced, and flexible nuclear force structure. The end of Cold War deterrence FOR MORE THAN 40 years, the U.S. defense community held a shared view regarding the purposes for which this country maintained strategic nuclear forces. The overriding purpose of the forces was to deter war between the United States and the Soviet Union. This had to be accomplished in the face of an overwhelming Soviet conventional capability and under conditions dictated by the presence of vulnerable allies close to Soviet territory. As a result, U.S. forces had to be positioned forward to defend those allies, a situation that made them vulnerable to a Soviet offensive. Because it was difficult to have confidence in Western conventional defenses, it was necessary to threaten the Soviet Union with the possible use by the United States (and later by Great Britain and France) of nuclear weapons, including escalation up to a massive strike on the Soviet homeland. American nuclear forces needed to be of sufficient size and robust character such as to impose on the Soviet leadership the unassailable fact that no conflict with the United States could end with anything less than unacceptable damage to the Soviet Union. Once the Soviet Union acquired nuclear weapons, it was also necessary to convince Moscow that it could not hope to gain an advantage by their use. American retaliation had to be assured, even in the face of a "bolt-out-of-the-blue" attack by the Soviet Union. For this reason the United States invested in the now familiar triad of strategic bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles, along with the early warning and command, control, and communications (C3) that guarded against surprise attack. In addition, the United States developed and deployed an array of tactical and theater nuclear weapons. The purpose of these was to ensure that at any point in the conflict, the United States had a credible escalatory option. Over the past decade, the strategic rationale that guided the development of U.S. nuclear forces throughout the Cold War has been slowly eroding. The collapse of the Warsaw Pact and the demise of the Soviet Union ended the conventional threat to America's European and Asian allies. No longer did the United States need a stout ladder of escalation based on directly linking conventional defenses to the massive U.S. strategic nuclear capability.
Without the threat of conventional conflict and first-use of nuclear weapons by the United States to avoid a conventional defeat, there was also a reduced concern regarding the possibility of a Soviet preemptive strike against the U.S. homeland. As a result, it was possible for the United States to consider altering the size and posture of American nuclear forces. The first Bush administration began the process -- in cooperation with President Boris Yeltsin's regime in Russia -- by detargeting U. …

Journal Article
TL;DR: The second half of the Promethean myth offers a further warning: Prometheus's defiant act led Zeus to dispatch a woman, Pandora, to unleash her box of evils on the human race -- an apt metaphor, the author argues, for the new generation of human reproductive technologies.
Abstract: TO INVOKE PROMETHEUS, the figure of Greek myth who was punished by Zeus for stealing fire from Hephaestus and giving it to humans, has become a popular warning against scientific hubris in our new age of biotechnology and genetic engineering. But the second half of the Promethean myth offers a further warning: Prometheus's defiant act led Zeus to dispatch a woman, Pandora, to unleash her box of evils on the human race -- and thus eliminate the power differential that access to fire briefly had given mankind. Pandora's box of dark arts is an apt metaphor for human reproductive technologies. Despite being hailed as important scientific advances and having succeeded in allowing many infertile couples to have children, the next generation of these technologies offers us a power that could prove harmful to our understanding of what motherhood is. This new generation of reproductive technologies allows us to control not merely the timing and quantity of the children we bear, but their quality as well. Techniques of human genetic engineering tempt us to alter our genes not merely for therapy, but for enhancement. In this, these technologies pose moral challenges that are fundamentally different from any we have faced before. Contemporary human reproductive technologies range from the now widely accepted practice of in-vitro fertilization (IVF), where physicians unite egg and sperm outside the woman's body and then implant the fertilized egg into the womb, to sophisticated sex selection techniques and preimplantation genetic diagnosis of disease and disability in embryos. Today, for-profit clinics, such as Conceptual Options in California, offer a cafeteria-like approach to human reproduction with services such as IVF, sex selection screening, and even "social surrogacy" arrangements where women who prefer not to endure the physical challenges of pregnancy rent other women's wombs. New techniques such as cytoplasmic cell transfer threaten to upend our conceptions of genetic parenthood; the procedure, which involves the introduction of cytoplasm from a donor egg into another woman's egg to encourage fertilization, could result in a child born of three genetic parents -- the father, the mother, and the cytoplasm donor -- since trace amounts of genetic material reside in the donor cytoplasm. Doctors in China recently performed the first successful ovary and fallopian tube transplant, from one sister to another, which will allow the transplant recipient to conceive children -- but from eggs that are genetically her sister's, not her own. The near future will bring uterus transplants and artificial wombs. Scientists at Cornell University are perfecting the former, while researchers at Juntendou University in Tokyo, who have already had success keeping goat fetuses alive in artificial wombs for short spans of time, predict the creation of a fully functional artificial womb for human beings in just six years. Cloning technologies eventually could fulfill even the most utopian of feminist yearnings: procreation without men via parthenogenesis, something that excited the passions of Simone de Beauvoir in 1953. "Perhaps in time," she mused in The Second Sex, "the cooperation of the male will become unnecessary in procreation -- the answer, it would seem, to many a woman's prayer." De Beauvoir was correct to identify women's hopes as a powerful force in modern challenges to old-fashioned procreation, but these hopes also pose serious ethical challenges. Contemporary feminism's valorization of "choice" in reproductive matters and its exaltation of individualism -- powerful arguments for access to contraceptives and first-generation reproductive techniques -- offer few ethical moorings as we confront these fundamentally new technologies. In fact, the extreme individualism of the feminist position is encouraging women to take these technologies to their logical, if morally dubious, conclusion: a consumer-driven form of eugenics. …

Journal Article
TL;DR: Nineteen eighty-eight was the last time a Republican won either a presidential or a Senate election in the Golden State, as mentioned in this paper, and California is fast becoming a graveyard for Republican fortunes.
Abstract: NINETEEN EIGHTY-EIGHT is the answer to two California trivia questions: It's the last time the Dodgers won in the post-season and also the last time a Republican won either a presidential or Senate election in the Golden State. The baseball metaphor is appropriate: If the big leagues ran the state parties, the California GOP, with few wins, a fractious roster, and a market that seemingly cares little for the Republicans' product, would seem an inviting target for either relocation or consolidation. It's the new reality of the land that gave birth to the Reagan Revolution. Republican folklore has long honored California as a kingmaker and a wellspring of Republican ambition. In eight of the 10 presidential elections from 1948 to 1984, at least one California Republican -- Earl Warren, Richard Nixon, Ronald Reagan -- was on the Republican ticket. California's Orange County, home of John Wayne Airport, remains the spiritual homeland of paleoconservatives, a place where you can occasionally still find an "AuH₂O" bumper sticker. But California is fast becoming a graveyard for Republican fortunes. Dating back to 1996, California has gone Democratic in each and every presidential, gubernatorial, and U.S. Senate election -- while Texas has done precisely the opposite. One of those Republicans in whom Texans placed their trust, George W. Bush, sank approximately $15 million into his California operation during the course of the 2000 election yet managed to lose the state by more votes than Bob Dole did four years earlier. In that same election, California Republicans dropped four congressional seats, four assembly seats, and a state senate seat. Republicans are now outnumbered 32-20 in California's U.S. House delegation. Democrats enjoy nearly two-thirds majorities in both houses of the state legislature. And there's more. Only one of California's six state constitutional offices is held by a Republican -- secretary of state -- and it's not much of a partisan office at that; California's secretary of state traditionally champions "good government" issues like voter turnout and registration. Look on the state party's website and you'll see pictures of the president, the vice president, Colin Powell, and Donald Rumsfeld. But not one Californian, not even Condoleezza Rice, President Bush's national security advisor. The fading of California Republicanism might spell disaster for the party nationally. Conventional wisdom holds that American political trends flow like the jet stream -- west to east. In theory, that means voting trends that emerge in California eventually find their way to Washington. Exhibit A in this argument is Proposition 13, the California tax revolt of 1978. Two years after that vote, Reagan was swept into the White House running on a similar theme of lower taxes and frustration with government. Since Proposition 13, the press has actually oversold California's importance by assuming that almost every initiative that stirs up controversy in California has national implications. That's not always the case, yet California still deserves a fair bit of the attention of national trend-spotters. On the other hand, should Republicans reemerge as a major force there, California would virtually clinch electoral success for the party.
If, in the 2004 election, President Bush were to win his native Texas (now 34 electoral votes) and his brother Jeb's Florida (27 more), California's 55 electoral votes alone would push the president more than 40 percent of the way toward reelection -- with only three states. A Democratic challenger would need to win nearly two-thirds of the remaining electoral votes, 270 of 422, to win the election. That's nearly impossible, given Republican advantages across the "red state" Deep South and Great Plains. California is a necessity for Democrats. If Bush somehow could carry the state, California becomes Republican insurance. But in the meantime, the 2002 election represents an uphill climb for Republicans both as a party out of power and as a party in decline. …

Journal Article
TL;DR: The U.S. WAR in Afghanistan drives home this point: We can no longer afford to analyze U.S. security policy in Asia pursuant to paradigms developed to fit the realities of the Cold War as discussed by the authors.
Abstract: THE U.S. WAR in Afghanistan drives home this point: We can no longer afford to analyze U.S. security policy in Asia pursuant to paradigms developed to fit the realities of the Cold War. Many of these realities have changed. For example, in the 1970s, when the Soviet Union was still the principal threat to the U.S., we played the China card. The Chinese were happy to oblige, confronting the Soviet threat as they did along their common border in Central Asia. For almost two decades, that reality -- the threat posed to China by the Soviets -- ensured a degree of alignment in U.S.-China strategic interests. Through this experience we came to see our relationship with China as valuable in its own right, not simply as a foil to Soviet power. The strategic reality in Asia changed with the collapse of the Soviet Union in 1991. But over a decade later, that same Cold War paradigm still makes us tend to analyze our relationship with China as though it were the centerpiece of U.S. foreign policy in Asia. By contrast, by the time we played the China card in 1971, India had been relegated to a lesser role in our strategic thinking. That was not always the case. In the first two decades of the Cold War, India and Pakistan both had been viewed as frontline states, critical to containing the expansion of Soviet and (after 1949) Chinese communism in South Asia. By the late '60s, however, India had proved to be a feckless partner -- a would-be great power, with neither the military nor the economic strength to enforce its utopian foreign policy. Worse, India in 1971 abandoned its preachy neutrality to become a full-fledged member of the Soviet camp. Pakistan, for its part, had been a more loyal ally in the Cold War, but was fractious in its relations with India. By the late '60s, both countries had come to be considered in Washington as "too difficult" to deal with. This development coincided with doctrinal changes that had begun to downplay the strategic importance of South Asia generally. This is where the paradigm got stuck. What has evolved since is a pattern in which we ignore South Asia, including India, as irrelevant to U.S. interests -- until crisis strikes. When the Soviets invaded Afghanistan in December of 1979, South Asia suddenly became important to us again, but at that point U.S. attention was focused primarily on Pakistan as a conduit for military aid to the Afghan mujahideen. Once the Soviets withdrew from Afghanistan in 1989, South Asia returned to the back burner. Nuclear testing by both Pakistan and India in May 1998 provoked renewed U.S. concern with that now-nuclear rivalry, and nonproliferation economic and military sanctions followed. As a result, in the last two years of the Clinton administration, the India relationship enjoyed an unusual high-level focus, culminating in President Clinton's May 2000 trip to India, the first presidential visit in 22 years (perhaps fittingly, the last visit having been made by President Carter on his nonproliferation crusade). The September 11 attacks on the United States have kept South Asia in the limelight, as we have recruited both India and Pakistan to the war on terrorism. That very war on terrorism, however, has exacerbated tensions between Pakistan and India over continuing political violence in Kashmir. The result? Another flurry of high-level diplomatic activity by the United States, seeking to defuse these tensions between our two allies.
But this most recent round of activity -- successful as it was -- still fits the pattern of crisis management with India that evolved during the Cold War. What is clearly needed is a more sustained level of engagement with India. This will only happen if we begin to appreciate India's long-term strategic value to the United States. For this purpose, Kashmir, Pakistan, and even the war on terrorism are distractions. In the long term, our strategic interest in the region is plain: India is a major Asian democratic power with the potential economic and military strength to counter the adverse effects of China's rise as a regional and world power. …

Journal Article
TL;DR: The Media Marketing Accountability Act (MMAA) as mentioned in this paper was introduced by Sen. Joseph Lieberman to restrict the marketing of "adult-rated media" -- movies, music, and computer games containing violent or sexual material -- to minors under the age of 17.
Abstract: MOST AMERICAN PARENTS want to restrict children's access to entertainment glamorizing violence, sex, drug use, or vulgar language. Fashioning public policies toward that end is not, however, a simple task. Ideally, purveyors of "mature" entertainment (like retailers of other legal but morally dubious products enjoyed by many adults, such as alcoholic beverages, tobacco, and gambling) would voluntarily adhere to a code of advertising ethics. Self-regulation would obviate the need for burdensome government regulation. In practice, threats of legal restriction have always played an important role in persuading "morally hazardous" industries to observe codes of conduct and to avoid aggressive marketing to young people. Specifically, self-regulation on the part of makers of entertainment products (for example, movies and comic books) has allowed Americans to shield children and adolescents from "mature" content with minimal recourse to government censorship. This tradition may, however, be about to change. In April 2001, Sen. Joseph Lieberman introduced the Media Marketing Accountability Act (MMAA) -- a bill to prohibit the marketing of "adult-rated media," i.e., movies, music, and computer games containing violent or sexual material, to young people under the age of 17. The MMAA would empower the Federal Trade Commission to regulate the advertising of entertainment products to young people. The proposed legislation, if enacted, would inject a federal agency into decisions about the marketing of movies, music, and electronic games -- and thereby potentially into decisions about what sorts of movies, music, and games are produced. Lieberman's hearings, well publicized at the time, provided a valuable forum for exposing entertainment industry practices to public scrutiny. Even so, the expansion of the federal government's regulatory powers in the area of entertainment and culture is undesirable compared to the traditional, and still workable, system of industry self-censorship. Calling in the FTC THE MOST RECENT ROUND of public controversy over mass entertainment began in the mid-1980s, when, at a Senate hearing, Elizabeth "Tipper" Gore (wife of the then-freshman senator) voiced alarm about sexually explicit and violent lyrics in popular teenage music. For this she was subsequently ridiculed by many self-styled civil libertarians and defenders of the music industry. Nonetheless, public concern over violence and vulgarity in entertainment revived mightily following the April 1999 shooting murders at Columbine High School in Littleton, Colorado -- murders committed by teenaged boys steeped in various forms of violent entertainment. After that event, President Clinton asked the FTC to investigate the marketing of such entertainment to young people. In the fall of 2000, Sen. John McCain, chairman of the Senate Commerce Committee, presided over hearings on the resulting FTC report, "Marketing Violent Entertainment to Children." After extensive study of the marketing plans of the movie, music-recording, and electronic-game industries, the FTC concluded that media companies do aggressively market products with "mature" content to children, and that these practices "frustrate parents' attempt to protect children from inappropriate material." The irresponsibility of the entertainment industry came up again as an issue during the 2000 presidential campaign of candidates Al Gore and Joseph Lieberman.
At the time, many commentators dismissed their references to the issue as empty campaign rhetoric (the Washington Post reported that Gore had first telephoned industry executives to reassure them). But these skeptics proved wrong. In April 2001, Gore's former running mate introduced the MMAA. The bill (co-sponsored by Sen. Hillary Rodham Clinton) defined "targeted marketing" to minors of such material as "an unfair or deceptive" practice. The text of Lieberman's bill cites the findings of the September 2000 FTC report. …

Journal Article
TL;DR: The problem of conflicts of interest in the context of funded research and opinions that are disseminated to the public by journalists, academics, and individuals affiliated with think tanks is discussed in this paper.
Abstract: THE ENRON "SCANDAL" has raised several fascinating issues related to disclosing information and potential conflicts of interest. For example, an accounting firm that receives consulting fees from a company it is paid to audit may be less likely to report financial problems with that company. But contrary to conventional wisdom, simply not allowing that firm to consult will not necessarily solve the problem. The accounting firm will still have potential conflicts so long as it is getting paid by other firms to monitor their performance. In the past, accounting firms have used professional standards as a way of helping to ensure their reputation. In addition, they have tried to have a diverse client base, so the costs of losing one client would not be overwhelming. Similarly, academics and pundits receiving monetary compensation from a company may be more likely to give that company's policy positions a favorable review. One solution to this problem is to have these folks come clean by identifying their conflicts of interest. Unfortunately, as we shall see, this is easier said than done and is likely to have unintended consequences. Consider, for example, the problem of conflicts of interest in the context of funded research and opinions that are disseminated to the public by journalists, academics, and individuals affiliated with think tanks. This is a broad topic and one with which I have some personal experience as a scholar and a consultant to business and government. (1) My purpose is to evaluate the pros and cons of disclosing potential "conflicts of interest." "Full disclosure" may be a laudable goal, but is difficult to define and, therefore, not very useful. In addition, some disclosure norms imposed by the media are not likely to be very helpful in promoting useful information for their audiences, and will likely have unintended adverse consequences. The problem is not that disclosure is necessarily bad, though it may lead to bad outcomes in certain situations. Instead, the problem is that too often, the media and the public use partial disclosure as a substitute for critical thinking. The nature of the problem PAUL KRUGMAN -- ONE of the best known economists in academia -- received $50,000 for serving on an advisory board to Enron. Krugman, of course, was not alone. For example, Larry Lindsey, President Bush's chief economic advisor, was reported to have received the same amount. Defending himself in his New York Times column (January 25, 2002), Krugman noted that he complied with the Times' conflict of interest policy. When he agreed to write for the paper, he resigned from the Enron board. In addition, Krugman noted the potential conflict posed by his Enron advisory role in a Fortune piece he published three years ago. Krugman's level of disclosure, however, did not seem to satisfy Andrew Sullivan -- an excellent journalist. On his website, andrewsullivan.com (January 25, 2002), Sullivan took Krugman to task for not noting the amount of money he received. Sullivan noted, "You'll notice one detail missing from Krugman's apologia -- the amount of money he got. Why won't he mention it? Because it's the most damning evidence against him." He thinks "the reading public has a right to know" such information. Sullivan raises an important question: What does the public have a right to know about a person's opinion or findings? That is, what should academics and pundits be required to say about their remarks or research when presenting it in public? 
Sullivan thinks that full disclosure is the key. He made the point specifically with respect to talking heads: "What this is about is the enmeshment of some of the pundit class in major corporate money. It seems to me that an integral part of a journalist's vocation is independence -- independence from any monetary interests that could even be perceived as clouding his or her judgment. …

Journal Article
TL;DR: In the wake of school choice's recent victory in the U.S. Supreme Court, opponents charge that a serious crack has opened in the constitutional wall between church and state; as discussed by the authors, that charge is overwrought, yet school choice remains far from inevitable.
Abstract: GIVEN THE EXTRAORDINARY hullabaloo surrounding school choice's recent victory in the Supreme Court, it's surprising to realize how few choices are actually being made. Cleveland offers educational vouchers to just over 3,700 of the city's 75,000 students. In Milwaukee, 10,739 students -- about 10 percent of the city's schoolchildren -- attend a school of their choice with public support. And in Florida, which maintains the country's first and only statewide school choice program, only 50 students currently receive vouchers. All told, the nation's three publicly funded voucher programs offer educational options to about .00003 percent of American students. Fears concerning the advent of an American theocracy, it seems, have been vastly overstated. "This decision represents a serious crack in the constitutional wall between church and state, and it's especially troubling when part of that wall comes crumbling down on Cleveland's public school children," laments Ralph Neas, president of People for the American Way. Neas is hysterical. Allowing parents using public aid to choose a religious education for their children does not constitute a state establishment of religion, and the Supreme Court has finally unmasked Neas's position as hostility to religion masquerading as a constitutional argument. Behind the justices' decision in Zelman v. Simmons-Harris is the court's growing respect for religious institutions and refusal to single them out for exclusion from the public square. Yet voucher proponents appear equally blind to political reality. "The Court's decision," announced House Education and the Workforce Committee Chairman John Boehner (R-OH), "moves us decisively forward in the drive for equal educational opportunity." President Bush insisted, "the Court declared that our nation will not accept one education system for those who can afford to send their children to a school of their choice and for those who can't." In the wake of the court's ruling, says Joseph Overton of the pro-voucher Mackinac Center for Public Policy, "School choice is an inevitability." The court, however, declared no such thing. And school choice is far from inevitable. The task remains to actually craft and implement school choice programs in states and municipalities nationwide, and this will be no easy affair. "We are prepared to fight these efforts, state by state," Neas affirms. Having failed in the high court, anti-choice activists are refocusing their energies on the legislative process. "We will continue to fight for public schools and against vouchers -- or related schemes to provide public funds to private and religious schools -- at the ballot box, in state legislatures, and in state courts," threatens Bob Chase, president of the National Education Association. "If this decision brings new efforts to enact voucher legislation, we will fight these efforts," warns American Federation of Teachers President Sandra Feldman. "But we will also work with local, state and national policymakers to ensure that private schools that receive public funds are held accountable, just like public schools are." The Progressive Policy Institute is already promoting what it calls "accountable choice," which would impose statewide standards on private and parochial schools. If voucher laws saddle private schools with the same regulatory regime that now hampers the public education system, school choice will prove an iatrogenic aggravation of the educational crisis. The prospect of increased regulation is especially threatening to religious schools -- the principal focus of school-choice opponents -- who fear that their religious missions could be undermined. Even the modest choice programs now extant impose restrictions on parochial schools that accept voucher children. Both the Cleveland and Milwaukee programs force participating schools to relinquish control of their admissions policies; admissions decisions must be made by lottery, ensuring nondiscriminatory access. …

Journal Article
TL;DR: Gibraltar, as mentioned in this paper, is a British territory in the Mediterranean that belonged to Spain from the Reconquista until Britain conquered it during the War of the Spanish Succession in 1704; Spain has sought its return ever since, including through a failed siege from 1779 to 1783.
Abstract: PROTESTS, JEERS, AND CURSES greeted British Foreign Secretary Jack Straw on a May 3, 2002, visit to Gibraltar. In view of Britain's role as administering power and de facto protector of the small but strategic Mediterranean territory -- and in view of the overwhelming pro-British sentiment of the local government and inhabitants -- such tumult seized media attention in Europe. It hardly garnered a column-inch, however, in America. One of the last remnants of the far-flung British Empire, Gibraltar and its fate nonetheless possess considerable significance for the United States. How Prime Minister Tony Blair and his Labour cabinet are addressing Gibraltar should therefore receive attention within the Beltway appropriate to the ranking foreign and security policy issues of the day. The Gibraltar problem Quintessential symbol of tenacity and independence, the Rock -- the moniker derives from Gibraltar's unmistakable geological profile -- belonged to Spain from the Reconquista through the War of the Spanish Succession. Britain in 1704 conquered the territory, guardian of the straits that link the Atlantic to the Mediterranean, and Spain confirmed the shift of fortune with a treaty signed at Utrecht in 1713, granting Britain sovereignty in perpetuity. Over time, Gibraltar would serve the British as an indispensable naval (later, air) base, and its population, a mix of peoples from the British Isles and Mediterranean -- including Sephardic Jews expelled from Morocco -- took on a distinctive character. English-speaking and governed by quintessentially British institutions, the Gibraltarians now thrive as a civilian economy centered around shipping and financial services. (1) Regrettably, however, their territory has become an object of European horse-trading. Talks over its future have stopped and started in halting fashion for decades but received new impetus this year from a desire at 10 Downing Street to cut a deal with Spain. Tony Blair apparently believes that handing over the territory to Spain -- and thus placating a nationalist hue and cry among certain constituents there -- could purchase an alliance with Madrid, in turn useful perhaps in the internal struggles of the European Union. The problem confronting the prime minister in his hoped-for exercise in Brussels realpolitik lies in the Gibraltarians themselves. They have made plain their devotion to sovereignty through referenda, protests, and government policy -- and just as plain that they want nothing to do with Spain. The Treaty of Utrecht is an instrument on which, ironically enough, Spain in some part bases its claim to Gibraltar. Though the treaty ceded Gibraltar to Britain in perpetuity, its tenth article provided that, in the event that Britain relinquished Gibraltar, Spain would have the right to reclaim possession. On a number of occasions, Spain attempted to take Gibraltar back by force, carrying out a siege from 1779 to 1783. All Spanish exertions in that direction failed, as Gibraltar developed its reputation as the British Empire's impregnable redoubt. Yet Spain continued to pursue what grew into a national ambition to retrieve the lost territory, albeit more recently by diplomatic means and sanctions rather than military aggression. Resolution 2429, adopted by the United Nations General Assembly in 1968, determined that the United Kingdom should relinquish sovereignty over Gibraltar and allow Article X of the Treaty of Utrecht to effect retrocession to Spain.
The United Kingdom expressed the view that Resolution 2429 violates Article 73 of the United Nations Charter, which provides for the primacy of the wishes of the inhabitants of a non-self-governing territory in determining the disposition of the territory. A referendum in September 1967, in which over 95 percent of the electorate voted, showed 44 Gibraltarians preferring incorporation into Spain and 12,138 favoring continued association with Britain. …

Journal Article
TL;DR: Medicine's effort to separate spirituality from the main body of religion, or to forge an alliance with religion in general, finds support across the ideological spectrum.
Abstract: UNTIL RECENTLY IN human experience, religion and medical science occupied distinct and separate spheres. Religion dealt with problems of the inner life, including spiritual and emotional trouble, while medical science managed the outer life of the body. Lately, however, and by contrast, the relationship between religion and medical science has fluctuated, creating a dizzying problem of identities. Alternative medicine, to take one example, borrows from both religion and medicine, making it a confusing hybrid. At other times, religion and medical science swap roles altogether - as when religion stands guard over stem cells, for instance, or when medical science uses drugs like Prozac and Zoloft to rescue people suffering from everyday sadness. Another new phenomenon only adds to the confusion: Based on evidence that religious belief is good for one's health, some medical doctors are trying to siphon off spirituality from religion itself, or at least to make religion a junior partner in their enterprises. Thus, in varying ways, have religion and medical science gone from being strangers to competitors and, most recently, even helpmates. This newest connection between medicine and religion takes two general forms. In the first, doctors emphasize the health benefit that comes from active involvement in organized religion. A well-known study published in the Journal of Chronic Diseases describes an association between weekly church attendance and lower rates of coronary artery disease, emphysema, and cirrhosis. Further research has linked religious commitment to lower blood pressure, reduced levels of pain among cancer patients, improved post-operative functioning in heart transplant patients, and even reduced mortality in general. Mindful of such evidence, some doctors active in this branch of the pro-religion movement have come to embrace religion in full, as it is historically understood. Other doctors, however, have sought to amputate that same phenomenon. They believe that spirituality is the active, beneficial ingredient in religion -- that the rest is fluff. In the forms of biofeedback, transcendental meditation, and mind-body medicine, these doctors foster spirituality outside of religion's institutional and moral framework. They admit that physical health can never be totally divorced from moral behavior (for example, monogamy decreases the chance of AIDS as well as a host of other infections), but they do believe that spirituality is a natural phenomenon in itself, the rigors of orthodoxy quite aside. Even atheists, they insist, can fight disease through greater spiritual awareness. An emerging "science of the spirit" supports this claim. Meditation, for example, has been shown to cause a "relaxation response" that leads to reduced muscle tension and a change in the body's neuroendocrine system. Brain scanning reveals a characteristic change among those who meditate, especially in the area of the temporal lobe. The new science of the spirit ignores the impact of religious commitment on health, concentrating instead on the physical manifestations of spiritual awareness. Still, it shares with the epidemiology of church attendance a common purpose: harnessing religion for health purposes. Medicine's effort to separate spirituality from the main body of religion, or to forge an alliance with religion in general, finds support across the ideological spectrum. 
Atheists hope that research into the physical underpinnings of religious belief will prove that God is just a phantom of the mind. Yet equally supportive of exploring that same connection is the John Templeton Foundation -- a conservative, pro-religion organization that actively funds research into the medical benefits of spirituality. By publicizing these medical benefits, the Templeton Foundation believes it is helping to promote religion. Organized religion, for its part, is ambivalent about the new alliance. …

Journal Article
TL;DR: The concept of a right side of history is derived from Marxism, and it is founded on the belief that there is a forward advance toward a socialist future that can be resisted, but not ultimately defeated.
Abstract: A SPECTER HAUNTS the world, and that specter is America. This is not the America discoverable in the pages of a world atlas, but a mythical America that is the target of the new form of anti-Americanism that Salman Rushdie, writing in the Guardian (February 6, 2002), says "is presently taking the world by storm" and that forms the subject of a Washington Post essay by Martin Kettle significantly entitled "U.S. Bashing: It's All The Rage In Europe" (January 7, 2002). It is an America that Anatol Lieven assures us, in a recent article in the London Review of Books, is nothing less than "a menace to itself and to mankind" and that Noam Chomsky has repeatedly characterized as the world's major terrorist state. But above all it is the America that is responsible for the evils of the rest of the world. As Dario Fo, the winner of the 1997 Nobel Prize for literature, put it in a notorious post-September 11 email subsequently quoted in the New York Times (September 22, 2001): "The great speculators [of American capitalism] wallow in an economy that every year kills tens of millions of people with poverty [in the Third World] -- so what is 20,000 dead in New York? Regardless of who carried out the massacre [of 9-11], this violence is the legitimate daughter of the culture of violence, hunger and inhumane exploitation." It is this sort of America that is at the hub of Antonio Negri and Michael Hardt's revision of Marxism in their intellectually influential book Empire (Harvard University Press, 2000) -- a reinterpretation of historical materialism in which the global capitalist system will be overthrown not by those who have helped to create it, namely, the working class, but rather by a polyglot global social force vaguely referred to as "the multitude" -- the alleged victims of this system. America-bashing is anti-Americanism at its most radical and totalizing. Its goal is not to advise, but to condemn; not to fix, but to destroy. It repudiates every thought of reform in any normal sense; it sees no difference between American liberals and American conservatives; it views every American action, both present and past, as an act of deliberate oppression and systemic exploitation. It is not that America went wrong here or there; it is that it is wrong root and branch. The conviction at the heart of those who engage in it is really quite simple: that America is an unmitigated evil, an irredeemable enormity. This is the specter that is haunting the world today. Indeed, one may even go so far as to argue that this America is the fundamental organizing principle of the left as it exists today: To be against America is to be on the right side of history; to be for it is to be on the wrong side. But let's pause to ask a question whose answer the America-bashers appear to assume they know: What is the right side of history at this point in history? The concept of a right side of history is derived from Marxism, and it is founded on the belief that there is a forward advance toward a socialist future that can be resisted, but not ultimately defeated. But does anyone believe this anymore? Does anyone take seriously the claim that the present state of affairs will be set aside and a wholly new order of things implemented in its place, and that such a transformation of the world will happen as a matter of course? And, finally, if in fact there are those who believe such a thing, what is the status of this belief?
Is it a realistic assessment of the objective conditions of the present world order, or is it merely wishful thinking? Marx's political realism THE IMPORTANCE OF these questions should be obvious to anyone familiar with the thought of Marx. Marx's uniqueness as a thinker of the left is his absolute commitment to the principles of political realism. This is the view that any political energy that is put into what is clearly a hopeless cause is a waste. …


Journal Article
TL;DR: The use of military tribunals has been studied extensively in the context of the current war on terrorism as mentioned in this paper, with a focus on the legal and political context for the decision to use them.
Abstract: ON NOVEMBER 13, 2001, President Bush issued a Military Order authorizing the Department of Defense to create military commissions to try non-citizens who are members of al Qaeda or who have attempted or carried out acts of international terrorism. The promulgation of the order was met with overwhelming public support, but with a stream of criticism from civil libertarians and others concerned with the possible dilution of due process standards. The Military Order has also sparked a lively debate among lawyers and pundits in the op-ed columns of America's newspapers focusing on the legality of the commissions under international law and their actual utility in fighting terrorism. What has unfortunately been missing from this debate is its proper political context. The question is not whether a military commission is a good or bad thing, but whether any adequate mechanism currently exists for prosecuting prisoners who end up in U.S. custody during the new terror war facing America and its allies. The narrow legalistic debate has failed so far to do justice to the magnitude and nature of the threat of terror war and the policy context for the decision to use military commissions. In this broader context, it becomes clear that current domestic and international mechanisms cannot respond effectively to the needs encountered in the current terror war, but that military commissions, properly used, can do so at least for now. In the longer run, the existing Yugoslav tribunal offers substantial promise as an international terrorism court for particular types of cases. But in the meantime, the need for an effective mechanism is acute, and the military commissions provide one. Criminals v. enemies THE CURRENT DEBATE over military commissions is so intense and widespread that it gives inordinate importance to the question of the forum in which terrorists should be tried. In reality, courts, in whatever form, have only a small role in the terror war currently underway. The campaign of terror war directed against the United States can be described as "unconventional warfare conducted by unprivileged combatants with the assistance of criminal co-conspirators designed primarily to terrorize and kill civilians." This campaign has been underway for nearly a decade and will likely continue well into the foreseeable future. The potential use of military tribunals was not intended and should not be seen as an effort to shortcut court procedures ordinarily applicable to individuals charged with crimes. Rather, it was intended as a major shift in policy away from the criminal law model as a means for deterring and preventing terrorism. Until September 11, 2001, whenever al Qaeda struck American targets, including the World Trade Center (in 1993), President Clinton promised to hunt down those responsible and "bring them to justice." Unfortunately, he meant this literally: He called in the FBI as lead agency, and turned to federal prosecutors as the means for fulfilling his pledge. Naturally, no issue of where to prosecute terrorists arose, because in those few instances when the U.S. was able to arrest a terrorist, criminal trials were the principal means intended to "bring them to justice." President Bush put all that behind him after the attacks of September 11. He called the attacks "acts of war," and demanded that the Taliban surrender Osama bin Laden and other al Qaeda leaders on pain of being treated the same as they, as "enemies" of the United States.
When the Taliban refused, hailing bin Laden as a Muslim "hero," Bush (with Congress's support) attacked Afghanistan with military force and turned to the Department of Defense to lead the campaign. The terror war, long pursued by al Qaeda, was finally confronted as an issue of national security, rather than one of criminal law enforcement. Taking his cue from this major shift in policy, Attorney General John Ashcroft, along with FBI Director Robert S. …

Journal Article
TL;DR: For Fukuyama, as discussed by the authors, the mainspring of history is the liberal yearning for equal recognition; Huntington, by contrast, singles out the Muslim propensity toward violent conflict as the most important coming challenge to world peace and American power.
Abstract: This is Samuel P. Huntington's moment. The world of cultural and religious strife anticipated by Huntington in his much-discussed (and widely excoriated) book, The Clash of Civilizations, has unquestionably arrived. Yet whether we might also someday see an alternative world -- the global triumph of democracy envisioned in Francis Fukuyama's brilliant work, The End of History and the Last Man -- is also a question that seems very much before us as we contemplate what it would mean to "win" the war in which we are engaged. The question of our time may now be whether Huntington's culture clash or Fukuyama's pax democratica is the world's most plausible future. This is a question with policy implications, of course, and both The Clash of Civilizations and The End of History are, in part, books about policy -- what the United States government should do. Ultimately, however, to choose between Fukuyama and Huntington is to articulate a vision of human social life. What are the mainsprings of human action? How salient is religion as a cultural force? Is democracy the most civilized and natural way of life? Questions like these are at the heart of the contest between Huntington and Fukuyama, and we can take the measure of the concomitant policy disputes only by moving through these larger problems, not around them. Philosophically and spiritually, The End of History and The Clash of Civilizations could hardly be more different (although each book can fairly be called "conservative"). Read closely, unexpected areas of convergence emerge. Nonetheless, ultimately, neither Huntington nor Fukuyama tells us what we need to know in order to synthesize their perspectives -- or to finally decide between them. The books are at once complementary and irreconcilable. Taken together, they frame our current perplexity. So let us explore the dilemma that is the state of the world at the moment by considering each book in turn. (1) Templates of conflict Anyone who has followed the war and the debates surrounding it will find something familiar and thought-provoking on nearly every page of The Clash of Civilizations. The book often reads as though it had been written this year. It's easy to forget how controversial it was when it appeared. For all the respectful (if often skeptical) attention Huntington's views have garnered from the policy community, his reception within the academy, among liberal opinion makers, and in many overseas capitals has been, and remains, overwhelmingly hostile. The reason why is that The Clash of Civilizations sticks a thumb in the eye of liberalism and multiculturalism alike. For Fukuyama, the mainspring of history is the liberal yearning for equal recognition. Huntington gruffly retorts, "It is human to hate." Humans require identity, and they acquire it, says Huntington, through the enemies they choose. With the collapse of Cold War enmities, new forms of identity will inevitably be constructed upon new patterns of hostility. Differences of religion and culture, says Huntington, will provide the needed template for the clashes to come. This vision of a civilizational state of nature in which hatred is rife and trust and friendship rare was bound to get a rocky reception in a nation where every other ad or song is about harmony. But Huntington pushes his argument further. 
Not content to affirm the inevitability of human hatred, or to predict the rise of cultural antagonisms in general, Huntington singles out "the Muslim propensity toward violent conflict" as the most important coming challenge to world peace and American power. "Muslim bellicosity and violence," says Huntington, "are late-twentieth-century facts which neither Muslims nor non-Muslims can deny." Yet until last September, deny they did. For the post-September 11 reader, watching Huntington demolish President Clinton's contention (so like the current president's) that the West has no problem with Islam, but only with violent Islamic extremists, is a fascinating bit of reverse deja vu: The underlying problem for the West is not Islamic fundamentalism. …