
Showing papers in "Policy Review in 2003"


Journal Article
TL;DR: Over the past decade, a number of policy scholars have examined parallel bedrock constituencies in America's political parties; this article examines the less well understood but equally important role of liberal secularists in shaping the policies of the American left.
Abstract: There is a time when a Christian must sell all and give to the poor, as they did in the Apostles times. There is a time allsoe when Christians (though they give not all yet) must give beyond their abillity, as they of Macedonia, Cor. 2, 6. Likewise community of perills calls for extraordinary liberality, and soe doth community in some speciall service for the Churche. Lastly, when there is no other means whereby our Christian brother may be relieved in his distress, we must help him beyond our ability rather than tempt God in putting him upon help by miraculous or extraordinary meanes. --John Winthrop, "A Modell of Christian Charity" (1630) OVER THE PAST decade, a number of policy scholars have examined parallel bedrock constituencies in America's political parties. On one side, the Republicans rely on the near-monolithic support of Christian conservatives, a fact that has been documented ad nauseam by political commentators and the mainstream press for more than 20 years. Less well understood, but equally important, is the role of liberal secularists in shaping the policies of the American left. These people are the religious and political inverse of Christian conservatives: They vote for liberal political candidates and hold left-wing views on issues like school prayer and the death penalty. But most saliently, religion does not play a significant role in their lives. As political scientists Louis Bolce and Gerald De Maio recently demonstrated in the Public Interest ("Our Secularist Democratic Party," Fall 2002), liberal secularists are at least as influential in molding the platform of the Democratic Party as are Christian conservatives for the Republicans. Secularism is historically anomalous in the American cultural mainstream. The links between civic and religious life were persistent across the American political spectrum for hundreds of years. 
Indeed, John Winthrop's seventeenth-century statement quoted above would probably not have sounded particularly zealous throughout most of the twentieth century. As many public opinion scholars have documented, however, a dramatic philosophical shift occurred in the 1960s, leaving us to this day with a pervasive secular rhetoric on the political left. Consider how retrograde the words of John F. Kennedy's 1961 inaugural address would sound today: "With a good conscience our only sure reward, with history the final judge of our deeds, let us go forth to lead the land we love, asking His blessing and His help, but knowing that here on earth God's work must truly be our own." An unanswered question is one of causality: Do secularists tend toward the political left, or do political liberals tend to be secular? On the one hand, secularism might be the only hard-headed option for those who see, as Karl Marx did, that "Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions." On the other hand, secularists might find sanctuary in liberalism's tolerance for their somewhat unpopular views. Alexis de Tocqueville noted that being a secularist in America was no easy life (at least in 1835): "In the United States, if a politician attacks a sect, this may not prevent the partisans of that very sect from supporting him; but if he attacks all the sects together, everyone abandons him, and he remains alone." Nor is secularism a popular stance among the public at large today: According to a March 2002 survey by the Pew Research Center for the People and the Press, more than half of Americans have an unfavorable view of nonbelievers. Reasonable people will disagree as to whether this animus owes to simple religious intolerance, or rather to behavioral differences between the groups. 
Perceived differences leading to hostility might include disproportionately high rates of behaviors discouraged by religious norms (for example, adultery) or low rates of virtuous actions encouraged by them (for example, charity). …

91 citations


Journal Article
TL;DR: The U.S. has been far less successful in winning the peace than it was in winning the war: its method of warfare can produce stunning military victories but does not necessarily accomplish the political goals for which the wars were fought, multiplying the obstacles to establishing stable polities in Afghanistan and Iraq.
Abstract: THE UNITED STATES HAS just fought two wars against enemies thought to be difficult to defeat and has won decisively, rapidly, and with minimal loss of life. The military performance in both cases was impressive. With virtually no American troops on the ground in Afghanistan, U.S. forces aided by local Afghan militias destroyed the Taliban government and shattered the al Qaeda bases and infrastructure that had been used to plan and prepare the September 11 attacks. In Iraq one British, one U.S. Marine, and two U.S. Army divisions, supported by advanced precision-guided munitions, sufficed to crush both the Iraqi army and Saddam Hussein's regime in a matter of weeks. In both cases, the U.S. has been far less successful in winning the peace than it was in winning the war. In Iraq, the widespread looting and rioting that followed the collapse of the Baathist regime and the disorder that continued for weeks to rage in many parts of the country, including Baghdad, badly tarnished the image of the American occupying forces. It hindered U.S. efforts to establish a new, stable Iraqi regime that commands the loyalty of the Iraqi people. The situation in Afghanistan was much worse. For more than a year after the fall of the Taliban government, the new government of Hamid Karzai did not command the respect of the majority of the Afghan people and could not make its writ run outside of Kabul. Warlords established themselves in almost all of the other key cities and regions of the country, the roads became unsafe, and violence, both directed and random, became the order of the day. It remains unclear at present whether it will be possible actually to establish a stable and legitimate government in Kabul--and at what cost. Why has the United States been so successful in recent wars and encountered so much difficulty in securing its political aims after the shooting stopped? 
The obstacles in the way of establishing stable polities in Kabul and Baghdad were always considerable. It was never likely that the road to peace and stability in postwar Iraq and Afghanistan would be short or smooth. The nature of the American military operations in both countries, however, multiplied those obstacles instead of reducing them and greatly increased the chance of failing to achieve the political objectives that motivated both wars. The reason for this fact lies partly in the vision of war that President Bush and his administration brought into office and have implemented in the past two wars. This vision focuses on destroying the enemy's armed forces and his ability to command and control them. It does not focus on the problem of achieving political objectives. The advocates of a "new American way of war," Secretary of Defense Donald Rumsfeld and Bush chief among them, have attempted to simplify war into a targeting drill. They see the enemy as a target set and believe that when all or most of the targets have been hit, he will inevitably surrender and American goals will be achieved. War is not that simple, however. From the standpoint of establishing a good peace it matters a great deal how, exactly, one defeats the enemy and what the enemy's country looks like at the moment the bullets stop flying. The U.S. has developed and implemented a method of warfare that can produce stunning military victories but does not necessarily accomplish the political goals for which the war was fought. If these two wars represented merely isolated cases or aberrations from the mainstream of military and political developments in the U.S., then the study of this problem would be of primarily academic interest. That is not the case. The entire thrust of the current program of military transformation of the U.S. armed forces, on the contrary, aims at the implementation and perfection of this sort of target-set mentality.
Unless the direction and nature of military transformation change dramatically, the American public should expect to see in the future many more wars in which U. …

70 citations


Journal Article
TL;DR: In the aftermath of the Iraq war, many have argued that only a democratic transformation of Iraq, and eventually of the larger Arab world, can provide long-term security against terrorism and nuclear attack.
Abstract: ALTHOUGH THE UNITED STATES is the preeminent power in the world, we are not yet an empire. Notwithstanding periodic foreign interventions and our considerable international influence, we have not used our military to secure direct and continuous control over the domestic affairs of foreign lands. If anything, the United States has avoided empire. We have abolished the draft, reduced taxes, cut defense spending, and eschewed nation-building. Only recently, we were accused of "abandoning" Afghanistan in the wake of the Soviet departure from that country. Today, Afghanistan may be the germ of a new American imperium. Iraq forces the imperial question. In the aftermath of an Iraqi war, it may suffice to install a friendly autocracy, withdraw the bulk of our forces, and exert our influence from afar. Yet some have called for more. From voices within the administration like Deputy Secretary of Defense Paul Wolfowitz, to policy intellectuals like Richard Perle, to esteemed scholars like Bernard Lewis, many have argued that only a democratic transformation of Iraq, and eventually of the larger Arab world, can provide long-term security against terrorism and nuclear attack. In an important address in February, George W. Bush lent his voice to this chorus. In no uncertain terms, the president affirmed that "the world has a clear interest in the spread of democratic values," not least because "free nations do not breed the ideologies of murder." The president invoked the examples of American-led democratization in post-World War II Germany and Japan, and he pointedly rejected the claim that Arab nations are incapable of sustaining democracy. What the president did not say, yet gently and ambiguously implied, was that so deep a cultural change would require America to occupy Iraq in force and manage its affairs for years to come. Could such a venture in democratic imperialism be harmonized with our liberal principles? Even if so, would it work? 
Is it possible to bring liberalism to a society so long at odds with the values of the West? All of these questions were posed and answered, both in theory and in practice, during Britain's imperial rule of India. Three great British thinkers, Edmund Burke, James Mill, and John Stuart Mill, not only philosophized about liberal imperialism; they lived it. Burke helped force a major reform of Britain's early imperial system, while John Stuart Mill succeeded his father James as the "chief examiner" in the London headquarters of the British East India Company. Burke on one hand and the Mills on the other founded the two competing moral and administrative schools of thought on the British Empire. Burke's colonialism was conservative, respectful of indigenous practices and elites, and insistent on the highest standards of stewardship. The Mills were skeptical, even contemptuous, of traditional practices and elites; they were determined to force a democratic social transformation. Neither approach, it turns out, was able to operate independently of the other. If we find ourselves shouldering an imperial burden in Iraq or beyond, we shall want to study the wisdom -- and the folly -- of Burke, the Mills, and their respective disciples. Far more than America's post-World War II occupation of Japan, the British experience in India may be the key precedent for bringing democracy to an undemocratic and non-Western land like Iraq. From India to Iraq BRITISH IMPERIAL India might seem an unlikely model for an American occupation of Iraq. American rule in Iraq would ideally be a successful and time-limited experiment in democratization. Yet the British governed sections of the Indian subcontinent for nearly 200 years. The earliest period of British colonial rule was marked by extreme exploitation and neglect.
Once colonial government was placed on a sounder footing, even the best-intentioned policies of dedicated and sympathetic administrators frequently went awry, leading to serious social disruption. …

25 citations


Journal Article
TL;DR: Unilateralism has emerged as the most contentious issue in U.S.-European relations, with European criticisms fueled by the U.S. rejection of the Kyoto Protocol and the threat to withdraw from the ABM treaty; yet the transatlantic debate has been intellectually superficial, with no consensus on how multilateralism should be defined.
Abstract: IN RECENT YEARS unilateralism has emerged as the most contentious issue in U.S.-European relations. Europeans have complained about what they see as a U.S. go-it-alone approach to foreign policy, one said to contrast with a European tendency to emphasize "negotiation, compromise, and the virtues of agreed constraints." (1) These criticisms, already common during the Clinton years, intensified during the early months of the Bush administration when they were fueled by U.S. rejection of the Kyoto Protocol and the threat to withdraw from the ABM treaty. After a lull following the September 11 terrorist attacks, they rose to new heights as Europeans protested the U.S. failure to use NATO in the war in Afghanistan, the treatment of captured al Qaeda suspects in Cuba, Washington's sidetracking of the Biological Weapons Convention (BWC) review conference, and the U.S. campaign to exempt itself from jurisdiction of the International Criminal Court (ICC). For all its fervor, however, the transatlantic debate over unilateralism has been intellectually rather superficial, with little serious discussion of the political, economic, and legal questions that arise in connection with the practice of multilateralism in a world of sovereign states. Political rhetoric has tended to obscure the fact that there is no consensus in either the academic or policymaking communities about how multilateralism should be defined; about when, if ever, unilateral action is acceptable; or about such issues as the relationship between regional multilateral cooperation, such as has unfolded in Europe for the past 50 years, and the kind of global multilateralism embodied in the United Nations. Multilateralism is easiest to define in economic affairs, where it remains the bedrock on which the international financial and trading systems are built. 
As codified in the IMF Articles of Agreement, monetary multilateralism traditionally has meant the convertibility of national currencies on a non-discriminatory basis and rejection of the currency blocs and competitive devaluations that characterized the interwar period. As enshrined in the General Agreement on Tariffs and Trade (GATT), trade multilateralism has meant application of the most-favored-nation principle on a non-discriminatory basis. In both the IMF and GATT/World Trade Organization (WTO) systems, offenders against agreed multilateralist principles have been countries within these organizations that have failed to observe these principles. Nonmember countries do not take on the obligations or receive the benefits of membership, but the stigma of unilateralism has not attached to nonmembership. In the political sphere, multilateralism is embodied in the universally accepted obligations contained in the U.N. Charter, the provisions of international treaties, and customary international law. Given the somewhat schizophrenic character of the charter's attitude toward the state, unilateralism is both absolutely prohibited (Article 25, obligation to carry out the decisions of the Security Council) and absolutely protected (Article 51, inherent right to self-defense; Article 2, sovereign equality and sanctity of domestic jurisdiction). A third and increasingly contentious area in which multilateralism applies is in regard to global issues such as arms control, the environment, and human rights. In these areas, unilateralism tends to be associated with nonparticipation in or withdrawal from agreements, as in the U.S. rejection of Kyoto and the ICC. It is not clear, however, when a nonuniversal agreement to cooperate in a particular issue area should acquire the same multilateral status as, for example, the U.N. Charter, or whether a state exercising its traditional sovereign right not to sign such a treaty should be branded as unilateralist.
Against this background, it is worth identifying some of the key conceptual issues that might be used to frame a more productive transatlantic discussion of multilateralism. …

17 citations


Journal Article
TL;DR: A "EUROPEAN CONVENTION" chaired by former French President Valery Giscard d'Estaing recently finished drafting a new constitution for the European Union, but the parallels with the Philadelphia Convention of 1787 that this inevitably conjures up for American observers are extremely misleading.
Abstract: A "EUROPEAN CONVENTION" chaired by former French President Valery Giscard d'Estaing recently finished drafting a new constitution for the European Union, but the parallels with the Philadelphia Convention of 1787 that this inevitably conjures up for American observers are extremely misleading. Anyone who expects the current debate over European unification to mirror the historic contest in the United States between Federalists and Anti-Federalists is quickly disabused. That was an argument about the proper locus of sovereignty and the appropriate scale of the state. Politicians can sometimes be heard voicing such concerns in Europe today, but in scholarly and intellectual circles the predominant tendency is not to argue about where sovereignty should be lodged, but to call into question the concept of sovereignty; not to argue about how big the state should be, but to wonder about whether the era of the modern state is coming to an end. This may seem odd at a time when the modern state seems to be enjoying the hour of its greatest triumph. Virtually the entire world now consists of independent states, their number greater than ever before. And the most important global institutions, beginning with the United Nations itself, are intergovernmental organizations whose members are states, represented by the delegates of their governments. Yet there is no denying the fact that in many quarters, especially in some of the advanced democracies, there is a widespread feeling that the modern state is becoming obsolete, that it is increasingly incapable of responding to the problems of the contemporary world, and above all to the challenges posed by globalization. It is this feeling that shapes the moral and political context in which European unification is unfolding. In one sense, of course, the EU is merely a regional organization, but the debate over its future is intimately bound up with the issue of globalization. 
Globalization is a subject on everyone's lips today, not just in Europe but around the world. I am inclined to believe that recent advances in telecommunications technology and in the internationalization of markets have created a greater degree of mutual interpenetration among societies worldwide than ever existed before. But the trends that are summed up by the term "globalization" are not new. Following the rise of multinational corporations and the oil price shocks of the 1970s, many observers called attention to the idea of international "interdependence." And some scholars have plausibly argued that there was greater international openness and mobility during the period prior to World War I than there is today. In my view, what is distinctive about the current discourse on globalization is the jaundiced view that it takes of the modern state. After having long been regarded as the culmination of political evolution and the indispensable framework for freedom and democracy, the state is now often seen as a historically contingent institution built on shaky moral foundations. Deconstructing the state ONE OF THE scholars who appears to have been especially influential in shaping current thinking about the modern state is John Ruggie. Fittingly enough, Ruggie not only is a distinguished professor of international relations, but has recently served as assistant secretary-general of the United Nations. His writings, and especially his International Organization article "Territoriality and Beyond: Problematizing Modernity in International Relations" (Winter 1993), are widely cited not only in the academic literature but also in more policy-oriented discussions regarding the future of the European Union. What Ruggie "problematizes" in his essay is not just modernity, but the modern state and the concept of sovereignty. The discipline of international relations tends to take for granted the "modern system of states," Ruggie argues. 
Thus, while it is adept at understanding changes in the balance of power among states, it is poorly equipped to understand the more momentous kind of transformation that may result in "fundamental institutional discontinuity in the system of states. …

15 citations


Journal Article
TL;DR: A decade ago, New Zealand was at the forefront of cutting-edge liberalizing economic reforms, an agenda pursued by both of the country's main political parties. But now the reformers are to be found at the margins of New Zealand politics.
Abstract: A DECADE AGO, New Zealand was at the forefront of cutting-edge liberalizing economic reforms, an agenda that was pursued by both the country's main political parties. First, the left-of-center Labour party, elected in 1984, deregulated, privatized, cut industry and agriculture welfare, and pushed through a major switch from an income to a consumption tax. It ended up proposing -- but not enacting -- a 21 percent flat rate tax. Then, in 1990, the right-of-center National government took up the reform baton with steep cuts in welfare programs and the most radical shake-up of labor law outside Margaret Thatcher's Britain. The economics worked: 4 percent growth in the mid-1990s and the fastest growth of employment in any OECD country. But now the reformers are to be found at the margins of New Zealand politics. The current Labour government led by Prime Minister Helen Clark -- containing some of the politicians most opposed to the Kiwi reforms -- was reelected in a 2-1 landslide in July 2002 after three years in power. What happened? In fact, the New Zealand case is part of a pattern. Bill Clinton and Tony Blair reaped what Ronald Reagan and Margaret Thatcher sowed. In central Europe, reformers have been replaced by their opponents -- Vaclav Klaus as prime minister in the Czech Republic by Milos Zeman, and the author of Poland's Balcerowicz plan by former communist Aleksander Kwasniewski. New Zealand's experience is instructive for free-market reformers around the world. On these South Pacific islands of a little under 4 million people has been distilled the political dynamics of market-based reform and the countervailing forces opposed to it. Does reform create or consume political capital? To what extent is reform reversible, or does it create new constituencies and permanently change the terms of political debate? Has reform reached a natural frontier at the fortress gates of big-government social welfare programs?
The economics of reform NEW ZEALAND HAS a palpable sense of economic underperformance -- of a country that has come nowhere near testing the envelope of what is possible. Its stock market has yet to recover its pre-October 1987 highs. In Auckland and the capital city, Wellington, it has urban areas that, given their natural setting, should be two of the most magnificent waterfront cities in the world. But neither compares to Sydney across the Tasman Sea, to San Francisco, Hong Kong, or even Seattle. Their architecture seems trapped in the 1970s, more like Santiago de Chile or Warsaw. There aren't enough people in New Zealand. Indeed, it is one of the most underpopulated countries in the world. Take away Australia, with its vast interior deserts, and countries such as Canada, which extends into the Arctic, and New Zealand has the lowest population density of any country in the OECD. In terms of land area, New Zealand is 50 percent larger than Washington State but has just over half the population. New Zealand should be one of the fastest-growing countries in the world, a magnet for ambitious people looking for a better life. The change in migration flows with Britain illustrates the problem. From 1974 to 1980, both New Zealand and Australia were net importers of people from the UK. While Australia has continued to be a net importer, New Zealand started to lose people to Britain. Since the election of the current Labour government in 1999, the country's talent drain has worsened. New Zealand entered the twentieth century as one of the wealthiest countries in the world in terms of income per head. On the eve of World War I, per capita income was higher than in the UK and within 3 percent of the United States. It maintained its ranking through the Depression years, and in 1950 per capita income was still 88 percent of that of the U.S. But the tightening grip of protectionism and corporatist policies remorselessly pushed New Zealand down the international rankings. …

14 citations


Journal Article
TL;DR: Americans are remarkably catholic in this fat problem of theirs--and, the nod to egalitarianism aside, their universalism here works to just about everyone's detriment.
Abstract: JUST THREE MONTHS ago, a major study published in the Journal of the American Medical Association confirmed what any American with eyes even half-open could already have reported--that not only our adults, but also our children, are fat and getting fatter all the time. As the Department of Health and Human Services put it in a summary of this latest study's evidence, "Among children and teens ages 6 to 19, 15 percent (almost 9 million) are overweight according to the 1999-2000 data, or triple what the proportion was in 1980." The widespread media attention given to this bad-news story would appear to be justified, for the JAMA study followed at least two other blue-chip examinations during the past year or so of the underage fat explosion. One of these, a report on the whopping economic costs of child and adolescent obesity, was published in Pediatrics magazine by researchers for the Centers for Disease Control (CDC). The other publication was a less prominent but also intriguing report in the May issue of the Journal of Nutrition written by researchers at the University of North Carolina. This one emphasized one of the lesser-known aspects of the fat problem--that "adolescent obesity increases significantly among second- and third-generation immigrants to the United States," in the words of a UNC press release. This is not to say that underage corpulence is unique to Americans and their offspring--far from it. Misconceptions and undeserved reputations to the contrary, most other advanced countries (and for that matter, a number of not-so-advanced ones) do indeed share in the child-fat-and-obesity problem, for the most part differing from us in degree rather than kind. In England, reported the Guardian earlier this year, "Adult obesity rates have tripled and those in children have doubled since 1982."
In Canada, says the Globe and Mail, also in 2002, "More than a third of Canadian children aged 2 to 11 are overweight, and half that number are obese, according to newly published Statistics Canada data." Moreover, "Canada now has more fat children than fat adults." As for Australia, a 2000 study there found that children of either sex were twice as likely to be defined as overweight in 2000 as in 1985. Nor is the Anglo-speaking world the only one with a child-fat problem. Its svelte reputation quite aside, for example, continental Europe and its children are ballooning as well. In Italy, report researchers for the Bollettino Epidemiologico Nazionale, "Neapolitan children were more at risk of obesity than were children from France, Holland, the United States, and also than children living in Milan in northern Italy," while in the province of Benevento, "The prevalence of overweight and obesity was greater... than in England, Scotland, and the United States." In Germany, according to researchers in the International Journal of Obesity, a "large study on all children entering school in Bavaria in 1997 shows patterns of overweight and obesity which are comparable with other European data" (though still "lower than US and Australian data"). Even vaunted France, if the French National Institute of Health and Medical Research is to be believed, admits an obesity problem among 10-year-olds of "epidemic proportions." "The number of seriously overweight children in France," reports the same institute, "has more than doubled since the 1980s." Yet even granting the ambiguous relief of knowing that we are not the only giants pounding the earth, there is still no denying that the world's sole remaining superpower is also its most supersized. We Americans are remarkably catholic in this fat problem of ours--and, the nod to egalitarianism aside, our universalism here works to just about everyone's detriment.
The fact that we fatten up even our immigrants, several subsets of whom are measurably bigger here than are counterparts in their home countries, means that we are virtually guaranteed a steadily growing quotient of bigger and bigger people, with an unhealthy degree of heaviness becoming ever more the unremarkable norm. …

12 citations


Journal Article
TL;DR: For some people, prisons are a substitute for parents. This apparent overstatement is shorthand for two more precise points. First, without parents--two of them, married to each other, working together as a team--a child is more likely to end up in the criminal justice system at some point in his life; prison becomes a greater probability in the child's life.
Abstract: FOR SOME PEOPLE, prisons are a substitute for parents. This apparent overstatement is shorthand for two more precise points. First, without parents--two of them, married to each other, working together as a team--a child is more likely to end up in the criminal justice system at some point in his life. Without parents, prison becomes a greater probability in the child's life. Second, if a child finds himself in the criminal justice system, either in his youth or adulthood, the prison will perform the parental function of supervising and controlling that person's behavior. Of course, prison is a pathetic substitute for genuine parents. Incarceration provides extreme, tightly controlled supervision that children typically outgrow in their toddler years, and does so with none of the love and affection that characterize normal parental care of small children. But that is what is happening: The person has failed to internalize the self-command necessary for living in a reasonably free and open society at the age most people do. Since he cannot control himself, someone else must control him. If he becomes too much for his parents, the criminal justice system takes over. These necessary societal interventions do not repair the loss the child has sustained by the loss of a relationship with his parents. By the time the penal system steps in, the state is engaged in damage control. A child without a conscience, a child without self-control, is a lifelong problem for the rest of society. A free society needs people with consciences. The vast majority of people must obey the law voluntarily. If people don't conform themselves to the law, someone will either have to compel them to do so or protect the public when they do not. It costs a great deal of money to catch, convict, and incarcerate lawbreakers--not to mention that the surveillance and monitoring of potential criminals tax everybody's freedom if habitual lawbreakers comprise too large a percentage of the population.
The basic self-control and reciprocity that a free society takes for granted do not develop automatically. Conscience development takes place in childhood. Children need to develop empathy so they will care whether they hurt someone or whether they treat others fairly. They need to develop self-control so they can follow through on these impulses and do the right thing even if it might benefit them to do otherwise. All this development takes place inside the family. Children attach to the rest of the human race through their first relationships with their parents. They learn reciprocity, trust, and empathy from these primal relationships. Disrupting those foundational relations has a major negative impact on children as well as on the people around them. In particular, children of single parents--or completely absent parents--are more likely to commit crimes. Without two parents, working together as a team, the child has more difficulty learning the combination of empathy, reciprocity, fairness, and self-command that people ordinarily take for granted. If the child does not learn this at home, society will have to manage his behavior in some other way. He may have to be rehabilitated, incarcerated, or otherwise restrained. In this case, prisons will substitute for parents. The observation that there are problems for children growing up in a disrupted family may seem to be old news. Ever since Barbara Dafoe Whitehead famously pronounced "Dan Quayle Was Right" (Atlantic Monthly, April 1993), the public has become more aware that single motherhood is not generally glamorous in the way it is sometimes portrayed on television. David Blankenhorn's Fatherless America (Basic Books, 1995) depicted a country that is fragmenting along family lines. Blankenhorn argued, and continues to argue in his work at the Institute for American Values, that the primary determinant of a person's life chances is whether he grew up in a household with his own father. …

10 citations


Journal Article
TL;DR: The idea of the "information edge" was introduced by Nye and Owens, who argued that the one country that can best lead the information revolution will be more powerful than any other.
Abstract: Where is the knowledge we have lost in information? --T. S. Eliot MULTIFACETED DISCUSSIONS about "America's information edge" transpired throughout the 1990s. In a significant article by that name in Foreign Affairs (March-April 1996), Joseph S. Nye and William A. Owens captured the essence and underlying importance of the idea: The one country that can best lead the information revolution will be more powerful than any other. For the foreseeable future, that country is the United States. America has apparent strength in military power and economic production. Yet its more subtle comparative advantage is its ability to collect, process, act upon, and disseminate information, an edge that will almost certainly grow over the next decade. In defense policy, the 1990s information-edge thesis appeared in different guises. Concepts such as information superiority, dominant battlespace knowledge, and decision superiority emerged as key elements of joint doctrine. National security strategy discussions focused on national information highways and critical infrastructure protection--key components of sustaining information-edge capabilities. In the most significant intelligence organizational reform of the decade, the National Imagery and Mapping Agency (NIMA) was founded with the mission of "guaranteeing the information edge." By the end of the decade, the Department of Defense, the Central Intelligence Agency, and other national security agencies presented strategic plans aiming to sustain and expand America's information edge while planning for increased volumes of information gathered from an increasingly diverse range of sources. Our adversaries, meanwhile, moved to create and exploit their own information advantages. 
Al Qaeda, for example, developed a global intelligence capability, adapted the latest commercial information technology for their purposes, and exploited seams in our security defenses (witness the group's sophisticated use of steganography within the World Wide Web to communicate with operatives). Discussion of these seams now dominates national security reform debates. For us, the post-September 11 talk about intelligence transformation raised the question, Quo vadis, America's information edge? Despite advances in information technology and knowledge management within the most visible area of national security--the military--America's overall commitment to preserving its information edge across the larger security bureaucracy, pace Nye and Owens, foundered during the 1990s. To be sure, the situation is improving. Great strides in information sharing are being made. Pockets of innovation do exist. Additional funds are now available to correct nearly a decade of resource shortfalls. Yet we contend that despite significant initiatives to transform, government-wide information-sharing innovations and intelligence-integration initiatives are evolving too slowly. Framing the coming intelligence debate WE BELIEVE THAT the coming year will witness an unparalleled national debate over the future of American intelligence. Attention at the official level will be necessary to effect change, but by itself it is insufficient. What will also be needed is a reasoned public debate about the purposes and dynamics of U.S. intelligence. The debate will address a number of issues, though intelligence sharing and cross-agency integration will remain at the forefront. The intelligence-sharing issue has been a favorite topic within American security planning since the summer of 2002.
So too have subsequent efforts to discover who had information and intelligence about the September 11 terrorist attacks and what, if anything, would have been different had all information been shared among U.S. intelligence and law enforcement agencies. The joint congressional inquiry into the intelligence context of 9-11 provides a number of points for consideration. …

9 citations


Journal Article
TL;DR: In this article, the author points out what he considered to be serious methodological flaws in the evaluations of these programs, especially with regard to the recruitment and retention of participants and/or the use of questionable analytical techniques such as one-tailed significance tests and post hoc subgroup analysis.
Abstract: IN NOVEMBER 2001, a bizarre incident occurred at a conference sponsored by the Center for Substance Abuse Prevention (CSAP) in Washington, D.C. As reported by Stanley Kurtz on National Review Online (December 5 and 11, 2001) and Sally Satel on Tech Central Station (December 7, 2001), Christina Hoff Sommers of the American Enterprise Institute was invited to address an assembled audience of CSAP staff, grantees, and consultants concerning the agency's intentions to fund "Boy Talk," a prevention program for young men designed to influence such behaviors as drug use and violence. Sommers is a critic of this type of gender-specific exercise in health education, and suggested to the CSAP audience that the effectiveness of such programs and the commitment of dollars to them should be informed by scientific evidence regarding their effectiveness. She observed that "Girl Power!"--the program on which "Boy Talk" was based--remained unsupported by empirical evidence despite having been the recipient of federal funds for the past six years. At this point, a CSAP official informed Sommers that she should end her presentation, apparently because the "Girl Power!" program was off limits. Whether Sommers had been informed of this taboo before she began her talk is not clear, but given the association between the two programs it seems reasonable that the experiences concerning one be used to inform implementation of the other. Anyway, Sommers appears to have soldiered on with her presentation, for within minutes she was apparently instructed to "shut the f--- up, bitch" by a member of the audience, causing much merriment among the assembled group of professionals. Over the summer, I had a similar, although not as overtly censorial or hostile, experience while presenting at the Tenth Annual Meeting of the Society for Prevention Research in Seattle, Washington.
While nobody dropped the F-bomb on me or called me "bitch," the response to my attempt to examine the scientific base of some widely advocated prevention programs was an ad hominem attack coupled with defensive arguments justifying the violation of basic tenets of evaluation practice in prevention research. Science and the learned society MY PRESENTATION AT the Society for Prevention Research conference focused on a list of drug and violence prevention programs deemed by an expert panel convened by the U.S. Department of Education's Safe, Disciplined, and Drug-Free Schools program to be "exemplary." According to the criteria used by this panel to define exemplary, there must be at least one evaluation that has demonstrated an effect on some violent or drug-related behavior, and this evidence must come from a "methodologically sound evaluation." In my presentation, I critiqued the Education Department's criteria through an examination of three of the programs that were conferred exemplary status--Project ALERT, the Second Step curriculum, and the Adolescent Training and Learning to Avoid Steroids (ATLAS) program. I pointed out what I considered to be serious methodological flaws in the evaluations of these programs, especially with regard to the recruitment and retention of participants and/or the use of questionable analytical techniques such as one-tailed significance tests and post hoc subgroup analysis. In addition, I noted that while there were indeed isolated effects on behavioral outcomes to be found in the evaluations of these programs, the preponderance of evidence showed that they had little or no effect on drug use or violence. For example, the evaluation of the violence prevention program Second Step produced only one statistically significant result out of the 20 outcomes that were assessed at final follow-up, while two evaluations of the ATLAS steroid prevention program failed to demonstrate any effects on steroid use.
The annual conference of the Society for Prevention Research seemed an appropriate venue to discuss such issues. …

6 citations


Journal Article
TL;DR: The first "person" in the new Bush "trinitarian" doctrine is military predominance--or, if you like, dominance. The doctrine stresses American military predominance, military preemption, and political transformation, and it represents the practical (not necessarily successful) integration of international relations with non-Western political development in the form of an American foreign policy based on the ideological concept and political-military pursuit of democratic regime change.
Abstract: THROUGHOUT THE 1990s, intellectuals and journalists, partly in response to the proliferation of prefixes -- post-Cold War, post-communist, even postmodern -- engaged in a competitive and seemingly imperative quest to name an era. The results of this effort at authoritative naming were phrases like the "end of history," the "clash of civilizations," an "age of anarchy," and, of course, "globalization" -- none of which, to the authors' undoubted frustration, swept the field. I saw the 1990s as a "Genesis age," a period of history when the world was not biblically "void" but was most assuredly beginning to see its "form," i.e., its shaping institutions (the nuclear family, nuclear deterrence, the nation-state), begin to lose their unchallenged status. Lacking the parsimonious elegance and dogmatism of many others, I also saw the '90s as a "Toga period," a decade when the world responded to the unique reality of American global dominance by imitating -- not assimilating -- everything from legislative democracy to golf (the American "toga"). (1) With al Qaeda's attack on the U.S. in September 2001, the competitive scholastic exercise over naming was replaced by a more momentous political effort by the Bush administration to identify the threatening, and to author the defining, features of our age. The result is novel to the point of being radical and, unlike academic exercises, consequential. According to the administration, the essential element of our era is the threat emanating from a combination of tyrannical states and what I have called "movements of rage," a malignant political coalition that relentlessly pursues and may succeed in possessing and using weapons of mass destruction (WMD) against the United States and its allies.
(2) The Bush national security doctrine is a response to the likely proliferation of horrendous "wildcat violence" when state disintegration and/or the covert actions of tyrannical regimes offer movements of rage access to insidious weapons whose advanced technology demands only global reach, not global power. Largely in response to this possibility, the Bush doctrine stresses American military predominance, military preemption, and political transformation. From an historical point of view, these are extraordinary ambitions. More, they represent the practical (not necessarily successful) integration of international relations with non-Western political development in the form of an American foreign policy based on the ideological concept, and political-military pursuit, of democratic regime change. Dominance THE FIRST "PERSON" in the new Bush "trinitarian" doctrine is military predominance -- or, if you like, dominance. In the administration's words, "our [military] forces will be strong enough to dissuade potential adversaries from pursuing a military build up in hopes of surpassing or equaling the power of the United States." (3) This tenet has no immediate bearing on the international issues facing the United States because it will most likely take at least a decade for any imaginable nation to be taken seriously as a military competitor (unless, of course, Japan undergoes radical regime change on its own nationalistic terms). But if the administration is looking at the long term, so will I. Suppose, for example, the European Union becomes a stable, effective, legitimate political entity in world affairs. As such, its expanding population would be greater than ours, its economic power nearly equal, and its military potential the same. 
Is it at all reasonable to assume that the United States would politically veto, economically prevent, or militarily challenge the further and future integration and military development of a Western, democratic, capitalist Europe? Or, if an economically successful China and an increasingly stable Russia form a political-military alliance, would the United States attack these two nuclear powers to prevent them from "equaling" the United States? …

Journal Article
TL;DR: This article examines the argument that if one's primary loyalty may be given to one's own country rather than to humanity as a whole, then why not to a still narrower group, such as a particular class or race or region or trade association?
Abstract: IS IT WRONG to teach our children to be patriotic? Or may we teach them to be a little patriotic, provided that we also teach them to value and respect the cultures of others? Should they be encouraged to be loyal to their own nation, or should they be taught that they are citizens of the world before all else? These are the questions that were addressed in the justly celebrated essay "Patriotism and Cosmopolitanism" by Martha Nussbaum, and her answer was that education should actively encourage "the very old ideal of the cosmopolitan, the person whose primary allegiance is to the community of human beings in the entire world." (1) The significance of the cosmopolitan ideal in contemporary American education is obvious at every level. At its most topical, it is illustrated by the course of Islamic studies that John Walker was encouraged to pursue in his Marin County high school, almost entirely to the exclusion of the once mandatory courses in American history and the subject quaintly known as civics. Indeed, a look at the curriculum of modern public education demonstrates a core emphasis on precisely those subjects that do not require an undue prejudice toward, or interest in, one's own culture, if it happens to be American. But what is the basic justification for the notion that American education should promote the cosmopolitan ideal? And, indeed, that it should promote this ideal even if it is at the expense of less inclusive ideals such as patriotism? At the beginning of her essay, Nussbaum sketches an answer. If we are going to stop short of full-fledged cosmopolitanism and choose to promote patriotic values instead, then what is to keep us from descending to increasingly narrower and ever more exclusive categories of allegiance, such as our particular class or race or region or trade association? 
Or, as Nussbaum puts it, "nationalism and ethnocentric particularism are not alien to one another, but akin" -- so that "to give support to nationalist sentiments subverts, ultimately, even the values that hold a nation together." But there is an immediate problem with this argument. For if Nussbaum is making a historical claim, the record runs counter to it. The rise of both German and Italian nationalism in the nineteenth century was accompanied by a struggle against all those less inclusive forms of particularism -- regional, linguistic, cultural, religious, and ethnic -- that had kept both of these nations so politically backward. In fact, it could be argued that the point of nationalism was precisely to dissolve the hold that lesser forms of group loyalty had traditionally imposed on the human mind by subsuming these lesser loyalties under an allegiance to the larger community. It is possible, however, to recast Nussbaum's argument as one that is not historical, but logical, in which case it can be stated roughly as follows: If your first loyalty should be to your own group, then why arbitrarily make this group the one represented by your country? Why not direct your primary allegiance to your own particular tribe or kinship group? After all, if the mere accident of belonging to such and such a group is to be the basis of your moral allegiance, then what makes your country a more logical choice than your sect or tribe or even your family? Where do you draw the line? The point of this argument should be obvious: There is only one nonarbitrary point at which such a line may be drawn, and that is at the community of all the human beings on the planet. This is the only group that cannot be challenged as being merely accidental in the sense that you might have been born into one nation rather than another, or one sect rather than another, or one tribe rather than another. 
In other words, while we are accidentally white or black, Christian or Muslim, Anglophone or Francophone, Indian or Swiss, we are all essentially and necessarily human beings, and this is something that remains no matter how much we abstract from the contingencies of our incidentality. …

Journal Article
TL;DR: In the United States, the term "liberal" has become synonymous with progressive politics, and "progressive" is, the author suggests, a more apt label for the outlook of many on the left of the political spectrum.
Abstract: I. Our liberalism NEVER HAS A people enjoyed a greater range of individual rights, or been more jealous of their freedoms, or been more convinced that the liberty they prize is good not only for themselves but also for other peoples than we in the United States today. The freest society in most respects that the world has ever seen has produced the world's most diverse society; the world's best army; the world's most organized, industrious, and productive economy; and a political order that to a remarkable degree contains the factions and divisions that have prevented so many other countries from innovating and solving collective problems. This represents the triumph in America of liberalism, a tradition of thought and politics stretching back at least to seventeenth-century England, whose fundamental moral premise is the natural freedom and equality of all and whose governing theme has been the securing of equal freedom in political life. Yet cause for anxiety comes from many quarters. Freedom in America has produced or permitted massive income inequalities. It has given rise to a popular culture that frequently descends into the cheap and salacious. It maintains a public school system that fails to teach many students the basics of reading and writing and arithmetic; and at higher levels of education, it breeds an academic culture that preaches the relativity of values and that cannot reach agreement on what a well-educated person ought to have learned by the time he or she graduates from college. It has contributed to a destabilizing erosion of the old rules, written and unwritten, that govern dating, sex, love, marriage, and family. It has fostered among opinion makers and intellectual elites a distrust that borders on contempt for religious belief. And it has fortified among the highly educated an uncritical faith in the coincidence of scientific progress and moral progress. 
To understand the challenge whole, it is first necessary to correct an unfortunate confusion of terms. In the United States, "liberal" commonly denotes the left wing of the Democratic Party. To be sure, as a result of bruising post-1960s political battles, many on the left have disavowed the term liberal, choosing instead the label "progressive," in fact a more apt designation for their outlook. Nevertheless, the term liberal retains a distinctive meaning, indeed a progressive one, in our political lexicon. It was not foreordained that "liberal" would become synonymous with progressive politics as it has in the United States. Witness the career of the term in Europe, where it has come to designate something much closer to libertarianism. Yet neither is the equation of liberalism with progressivism an accident, for there is a powerful progressive thrust inhering in the liberal tradition. When it arose in the seventeenth century, before it acquired its name, liberalism, particularly that of Locke, sought to limit the claims of religious authorities in politics and the claims of political authorities in religious matters. As these ideas took root, as religion receded from the center of politics (and as science and industry developed and markets spread), individual freedom acquired more space, more individuals began to enjoy its blessings, and power shifted to those who had long been denied it. When it came into its own in the nineteenth century, liberalism, particularly that of Mill, sought to limit the role in politics of status, wealth, and sex by assuring through the state formal equality. The result was to accelerate the pace at which power shifted to the people and to spread the blessings of freedom more equally. 
And when, in the United States in the last third of the twentieth century, it became synonymous with the left wing of the Democratic Party, liberalism aggressively sought to limit the role in politics of poverty, race, sex, old age, illness, and disability by guaranteeing to all individuals a certain minimum level of material goods and moral standing. …

Journal Article
TL;DR: The U.S.-Russian relationship has not yet recovered from the war in Iraq as mentioned in this paper, despite the fact that the two leaders seem likely to try to overcome their differences at their first meeting since the war on June x in St. Petersburg.
Abstract: THOUGH PRESIDENTS GEORGE W. BUSH and Vladimir V. Putin continue to express their desire to work together after sharp differences over Iraq, their governments have not yet managed to do so in a meaningful way. The two leaders seem likely to try to overcome their differences at their first meeting since the war on June x in St. Petersburg. Yet, even after that meeting, the future of the U.S.-Russian relationship will be somewhat uncertain. Before the flare-up over Iraq, the United States and Russia enjoyed what some have described as their best relationship since Russian independence. Despite disagreements over the American withdrawal from the Anti-Ballistic Missile Treaty and the second round of NATO enlargement, the strong personal connection between the two presidents and new cooperation in the war on terrorism had contributed to a sense of optimism that Washington and Moscow were finally on track to becoming real partners. As a result, Russia's assertive opposition to the U.S.-British war against Saddam Hussein came as a particular shock to many in the United States (and confirmed the suspicions of those who were not shocked) -- and the impact has only been worsened by Moscow's thus-far obstructionist postwar conduct. Yet the relationship remains one of considerable importance to American national interests. The Kremlin's cooperation in the war in Afghanistan -- in sharing intelligence, stepping up its preexisting effort to arm the Northern Alliance, and setting aside earlier objections to a major U.S. military presence in the region -- significantly aided U.S. forces in the field. And a strong and sustainable relationship with Moscow can serve important and even vital American interests in many other areas, ranging from the war on terrorism to non-proliferation and international trade and investment.
Conversely, a weak relationship with Russia could embolden "rogue states" hostile to the United States, return the United Nations Security Council to its Cold War uselessness, and expose Americans to additional danger from terrorism and weapons of mass destruction. This raises two questions. What can be done to strengthen the U.S.-Russian relationship and put it on a more solid foundation? And, taking into account obvious and substantial differences with Moscow on some major international issues, how far can the relationship really go? What went wrong? ANY DISCUSSION OF improving the U.S.-Russian relationship should begin with an understanding of the status of the relationship today and analysis of "what went wrong" in American efforts to win Russian support for, or at least acquiescence to, the war in Iraq. Unfortunately, even before Iraq, neither Washington nor Moscow was satisfied with the progress in the relationship. American officials frequently complained that in the absence of Kremlin involvement, Russian government departments routinely obstructed effective collaboration. Russian officials similarly grumbled that only the White House could force action from Cold War-era bureaucrats in the Pentagon and the Central Intelligence Agency. Analysts and commentators in both countries lamented excessive reliance on the personal relationship between the two presidents. Nevertheless, American and Russian officials continued to declare their commitment to building a strong U.S.-Russian relationship, and -- despite reservations about the American use of force and concerns that Russia could face more terrorism after a war -- Moscow initially seemed inclined to accommodate the United States on Iraq, where Russian economic and other interests were significant but not first-order concerns. After their meeting in St. Petersburg in November, Presidents Bush and Putin issued a joint statement essentially reiterating the message of U.N.
Security Council Resolution 1441 by calling on Iraq "to completely and immediately comply" with all relevant U.N. Security Council resolutions or "face serious consequences. …

Journal Article
Abstract: CHINA IS WIDELY recognized as the next superpower. Over the past decade, it experienced tremendous economic growth and embarked on a major defense buildup. China now has the second largest defense budget in the world, with expenditures to boost its intercontinental ballistic missile arsenal and acquire nuclear submarines and destroyers. Yet the Chinese air force remains very weak, with capabilities dramatically inferior to the U.S.'s. The arsenals of the People's Liberation Army Air Force (PLAAF) and Naval Air Force (PLANAF) consist mostly of fighter planes (used primarily for defensive purposes) imported from Russia. The scarcity of bombers (used for offensive purposes) and China's continued reliance on foreign planes pose a puzzle to U.S. defense planners. Apparently content to rely on missiles to project power, China's doctrine contrasts sharply with American ideas about the importance of air superiority. Following the successful air campaigns of the 1991 Gulf War, Chinese defense analysts tuned in to the American debate over the possibility of relying on air power alone and the connection between the use of air power and avoiding friendly military and foreign civilian casualties. In recent publications, generals from China's military academies have treated air power-related themes -- including what America's strategic air advantage consists in and how it might be mitigated or neutralized -- at length. (1) Chinese observers also noted how the wars in Kosovo and Afghanistan confirmed American faith in the increasing efficacy of air power in light of dramatic technological advances. A lengthy May 2000 article in the PLA's daily newspaper emphasizes the importance of improving China's air power capabilities. "In the 1990s, air warfare... reached a zenith," states author Dong Wenxian. "And no sooner had the Gulf War ended than the U.S. President declared, 'The most important lesson learned from the Gulf War is the value of air power.'" Characterizing U.S.
air power in Bosnia and Kosovo as the new standard mode of warfare, the writer concludes, "The primary threat faced by peace-loving third-world countries comes from the air, and regardless of whether they are willing or have the capabilities ... the contest in the air will be the decisive one." (2) Similarly, in a 1998 article called "The Military Revolution and Air Power," Major General Zheng Shenxia, president of China's Air Force Command College, and Colonel Zhang Changzhi argue: "Future information [high-tech] war will rely more and more on air superiority. The air force will no longer be an important independent strategic force but an effective conventional campaign force that all services will depend upon." (3) Their discussions indicate that China's defense establishment recognizes the centrality of air power to U.S. operations. But the commentary, considered together with China's capabilities, suggests that in a military confrontation, China would adopt a different approach from the United States. China's leaders from Mao and Deng to Jiang Zemin have scanted domestic aircraft production, preferring to invest in ballistic missiles, antiaircraft artillery (AAA), and surveillance and reconnaissance equipment. China now has about 200 conventional ballistic missiles pointed at Taiwan and is adding 50 more each year. In total, the Chinese arsenal boasts 800-1,000 fighter aircraft, many of which are in some state of disrepair. With air power proving ever more important in the post-Cold War world, China's relative neglect of this force branch constitutes a conundrum. The divergence of Chinese air power doctrine from the U.S.'s presents another conundrum. Where the American literature stresses the goal of gaining and maintaining air superiority, Chinese doctrine emphasizes preemption and deception. Against a technologically superior foe like the U.S., China's strategists recommend disabling the enemy to prevent his exercising command of the air. …

Journal Article
Abstract: WITH AMERICA'S departure from the Anti-Ballistic Missile Treaty late in 2002, Bush officials have claimed that America has begun to lead the world away from security policies based on mutual assured destruction (MAD). The administration's decision to deploy a national missile defense system in Alaska certainly is a clear refutation of MAD-based opposition to such protection. What's less clear, however, is how America's rejection of MAD might affect U.S. nuclear weapons policies beyond missile defense--specifically, America's plans to stem the spread of nuclear weapons or to use nuclear weapons in certain circumstances. MAD and NPT TO AN EXTENT not generally appreciated, U.S. and international nonproliferation policies have had a fairly tight relation to MAD. During the Cold War, the most popular view concerning nuclear weapons reflected the MAD view that having a nuclear force capable of killing large numbers of civilians afforded nations basic security against attack. There also was a MAD fear that any attempt by nations to go beyond the finite force levels needed to attack undefended cities would lead to war-prone arms races. The thinking here was that if the superpowers targeted more than their opponents' vulnerable cities, they would be forced to develop ever-quicker, more accurate nuclear delivery systems (necessary to evade or destroy opposing weapons). They also would have to place their weapons on hair-trigger alert and risk deploying them tactically to an ever-growing number of military commanders. All of this, it was argued, would only increase the chances of nuclear war. (1) These views certainly were common during the mid-1960s and were quite prevalent among those negotiating the Nuclear Nonproliferation Treaty (NPT). Thus, by the late 1960s, most of those crafting the NPT argued that the real proliferation danger emanated not so much from the spread of nuclear weapons to more nations as from the superpowers' own never-ending arms race.
This rivalry, these diplomats argued, was even more likely to result in worldwide destruction than smaller states' "independent manufacture" of nuclear weapons. (2) They agreed that all nations had a right to acquire nuclear weapons to defend themselves (not only against possible nuclear neighbors, but as a hedge against the superpowers if they refused to curb their own nuclear arming). But if "because of higher considerations of the interests of mankind" non-weapons states decided not to exercise this right, they were equally convinced that these states deserved to be compensated. (3) Under the NPT, this compensation consisted of: (1) non-weapons states having an "inalienable right" to acquire all forms of nuclear energy technology (Article IV); (2) the demand that the superpowers engage in good faith negotiations on "effective measures relating to the cessation of the nuclear arms race" (Article VI); and (3) the right of non-weapons states to withdraw from the NPT and develop nuclear weapons "if extraordinary events ... have jeopardized the[ir] supreme interests" (Article X). For nearly 30 years, this "grand bargain" was interpreted in a manner that focused greatest attention on the need for the superpowers to end the arms race--i.e., to stop nuclear innovation through nuclear testing and to reduce the size of their arsenals to levels (a few hundred weapons) no larger than needed to absorb an attack and yet be able to target other countries' undefended cities. Thus, the NPT's preamble calls for "the cessation of the arms race" and of further nuclear weapons production and testing. The treaty's negotiating record, meanwhile, speaks approvingly of restraints on national missile defenses (later to become the ABM Treaty) and on nuclear missile delivery systems (later to become SALT and START). 
As such, the various NPT review conferences that have been held on an almost annual basis since the NPT came into force have focused on these issues almost exclusively. …

Journal Article
TL;DR: A Future Perfect: The Challenge and Hidden Promise of Globalization (Random House, 2000) as mentioned in this paper addressed the backlash against the current age of globalization and argued that, for all its other merits, globalization had made the West much more vulnerable to attack.
Abstract: ON THE MORNING of June 28, 1914, the world could rejoice in 60 years of extraordinary peace and progress. The first great age of globalization had made the world seem an infinitely smaller place. So great were the twin powers of technology (in the shape of the telephone, the telegraph, the train, the car, electricity, the camera) and ideology (the gospel of free trade, guaranteed by the world's hegemonic power, Britain) that Edwardian intellectuals prophesied the end of all wars. Yet on that summer's day, one act of terrorism in Sarajevo--the assassination of the Archduke Ferdinand and his wife by a Serbian fanatic called Gavrilo Princip--set off a sickening train of events. The world plunged into the most horrific war in history, and even after the killing had stopped, countries everywhere renounced their previous openness, fortifying their borders to limit the movement of goods, people, and even ideas. It would be absurd to blame all the miseries of the first half of the twentieth century on a single act of terrorism: The causes of world war and protectionist folly had been germinating for years. But those causes became clear only in retrospect. John Maynard Keynes nicely describes the typical middle-class Londoner in 1914, "sipping his morning tea in bed" while ordering goods from around the world and planning his global investments. For such a man, "the projects and politics of militarism and imperialism, of racial and cultural rivalries, of monopolies, restriction and exclusions, which were to play the serpent to this paradise, were little more than the amusements in his daily newspaper." For such a man, and millions of others, Gavrilo Princip's two shots marked a turning point. In the first edition of our book, A Future Perfect: The Challenge and Hidden Promise of Globalization (Random House, 2000), one of the figures we chose to illustrate the backlash against the current age of globalization was another terrorist: Osama bin Laden. 
Drawing on research by journalist Peter L. Bergen, who had interviewed bin Laden in Afghanistan, we quoted his fury against the "New World Order" that "haughty" America was imposing on the world. We noted that bin Laden, like so many other opponents of globalization, had been remarkably sophisticated about exploiting the process he professed to hate, using the latest technology to promote his medieval message. And we argued that, for all its other merits, lowering borders had made the West much more vulnerable to attack. We raised the possibility of al-Qaeda using a primitive nuclear bomb to blow up the World Trade Center. At the time, this seemed a little far-fetched. We debated shortening the section on bin Laden, lest people think that we were paying too much attention to the ramblings of a marginal crank. People were far more interested in the promise of open markets and technological innovation than they were in terrorism. It was a heady time for supporters of globalization, particularly in the United States. The stock market continued to defy gravity. The standard indicators of globalization pointed skyward: The volume of global trade increased steadily in the decade through 2000, and foreign direct investment flows topped $1.3 trillion. Economists speculated about the birth of a "new economy" that had banished recessions for good. Gurus speculated about "Dow 36,000"--or whatever other number they plucked out of their hats. This was a time when Silicon Valley and Wall Street minted new millionaires every single day, when business-school graduates confidently expected to retire at 40 and begin a new career in philanthropy, and when the first signs of liberalization seemed to be awakening parts of the Third World. India and China were booming after decades of state-imposed stagnation. Of course, there were serpents in this paradise, no less than there were in the world of Keynes's Londoner. Too many people in the developing world lived in grinding poverty. 
…

Journal Article
TL;DR: The highly innovative Norwalk vaccine, which is grown in tomatoes and administered in pills that contain powder derived from freeze-dried tomatoes, is the latest development in a mushrooming new field known by the punny name "biopharming."
Abstract: TRY AS WE MIGHT, we cannot seem to control a highly infectious bug called Norwalk virus that annually causes millions of cases of gastric distress in North America alone. Cruise ships and nursing homes endlessly scrub and sanitize and disinfect; staffs, passengers, and patients wear gloves and avoid person-to-person contact; but the epidemics of gastroenteritis continue. In recent months, cruises have been canceled or ruined, and hospital emergency rooms, schools, and nursing homes have closed down. University of Arizona scientists have come up with a highly unconventional remedy: the first vaccine against Norwalk virus, now awaiting approval for clinical trials in humans. Why unconventional? After all, anti-viral vaccines have been around since British physician Edward Jenner introduced smallpox vaccine in the eighteenth century. The highly innovative Norwalk vaccine, which is grown in tomatoes and administered in pills that contain powder derived from freeze-dried tomatoes, is just the latest development in a mushrooming new field known by the punny name "biopharming." Scientists can even use this approach to make vaccines that will prevent certain kinds of cancers that appear to be caused by viral infections. German researchers have engineered tobacco and potato plants to synthesize a protein component from the human papillomavirus, which is thought to cause cervical cancer, the third most common cancer among women worldwide. When the protein is extracted from the tobacco and potato plants, then purified and administered to mice, it induces an immune response. When the potatoes are fed directly to mice, they also trigger an immune response, though not as strong. If bioengineers one day develop a vaccine against cervical cancer, there will be a huge demand, especially if the vaccine can be stored in edible doses without refrigeration. 
Biotech companies are using gene-splicing techniques to reprogram crops--mainly corn, initially--to produce significant concentrations of high-value pharmaceuticals. The concept is not new. Many common medicines, such as morphine, codeine, the laxative Metamucil, and the anti-cancer drug Taxol, are all purified from plants. But biopharming's great promise lies in using gene-splicing techniques to make old plants do radically new things. There is also great potential for cost-cutting in the process: The energy for product synthesis comes from the sun, and the primary raw materials are water and carbon dioxide. In addition, biopharming offers tremendous flexibility and economy when adjustments in production are necessary. Doubling the acreage of a crop requires far less capital than doubling the capacity of a bricks-and-mortar factory, making biopharmed drugs potentially much less expensive to produce than those made in conventional ways. As little as 2,000 acres can provide the substrate for a year's supply of some products. Grain from a biopharmed crop can be stored safely for at least a year with no loss of activity. The quality of the final drug can meet the same standards as current fermentation technology using microorganisms. But storm clouds are gathering on the horizon. The food industry fears that gene transfer or "volunteer" biopharmed plants in the field could cause vaccines, drugs, and other products to contaminate the food supply, triggering costly recalls and presenting thorny liability issues. Therefore, in comments filed with the Food and Drug Administration in early 2003, food producers called for excessively stringent federal regulation. 
Some of the demands of the Grocery Manufacturers of America and other food trade associations were reasonable, but not their recommendations to the FDA that food plants should be off limits for biopharming "unless the company developing the drug product clearly demonstrates that it is not feasible to use non-food crops" and that "land, labor and equipment [be] dedicated solely to growing" biopharmed products. …

Journal Article
TL;DR: The main possibility for getting rid of the North's nuclear weapons is an agreed strategy between China and the United States as discussed by the authors, but there is no good evidence that this will happen.
Abstract: NORTH KOREA'S nuclear-weapons programs confront us with hard choices. They create a sense of urgency to make another deal with the North, but experience tells us that any new agreement will not stem the flow of crises. However we handle the immediate crisis, we will do better if we do so while having in mind an end position--something we have not done since the end of the Korean War 50 years ago. The argument here is that there should be different leadership in Pyongyang as a step towards the political unification of the peninsula. Short of that goal, the main possibility for getting rid of the North's weapons is an agreed strategy between China and the United States. Unfortunately, there is no good evidence that this will happen. The North's weapons pose three immediate challenges. Combined with its long-range missiles, North Korea's nuclear weapons could inflict devastation at long distances, including the United States. The threat to Japan is already rousing Tokyo to rearm. Worse still, the regime threatens to sell bombs to all comers, including terrorist organizations. Kim Jong Il's game THIS CRISIS WAS set off by the North admitting that it had a secret nuclear-weapons program in violation of the 1994 Agreed Framework. Negotiated by the Clinton administration, the framework promised economic benefits in return for North Korea's "freezing" its nuclear program. Since breaking the agreement, the Kim Jong Il regime has loudly proclaimed that the U.S. is planning to attack and has demanded a guarantee of security from us. Perhaps seeing our campaign against Iraq has persuaded Kim that he's next. But it seems more likely that he has a different and overriding perspective. It is to gain enough resources to stay in power. The system his father, Kim Il Sung, perfected combines extreme nationalism, severe internal repression, and a Stalinist economy. The economy's dysfunctions have led to the deaths of upwards of a million people in the past decade. 
Kim Jong Il's margin of survival comes from extortion. At its core are nuclear weapons--along with an implicit threat of collapse and resulting social chaos that would be costly to North Korea's neighbors. The weapons program apparently started in the late 1970s and has continued despite several international commitments to stop it, each violated. An obvious reason for starting the program was to change the military balance on the peninsula. Although the North's conventional forces were then relatively stronger than they are now, the U.S. had both troops and nuclear weapons in the South. In 1992 we removed our weapons as part of a denuclearization agreement between North and South--one of several agreements violated by the North. The U.S. estimated that the North could soon make enough plutonium for some nuclear weapons--and might have done so already. The resulting confrontation led to the Agreed Framework in 1994, in which the North agreed to shut down its reactor and store the spent fuel (containing plutonium) under international inspection. We and others agreed to provide food and fuel, to normalize relations, and to build two large nuclear electric power reactors. (The American negotiators seemed to have assumed, not unreasonably in 1994, that the North's regime would be gone by the time the reactors were finished.) If nuclear weapons were so important in the North's strategy, why did it agree to this freeze? Its principal source of aid, the Soviet Union, had disappeared in 1991. This, plus endemic mismanagement, threw the economy into a slump. Apparently the urgent need for food and fuel, the U.S. threat to attack North Korea's nuclear plants, and perhaps arm-twisting from China did it. (The Chinese did not sweeten the deal with food; they cut their supply in 1994-95.) The North also presumably knew something we have come to believe only since: that it had enough plutonium for a few weapons. 
And we now know that at some point in the 1990s it started work on a separate, enriched uranium-based weapons program, evidently with Pakistani help. …

Journal Article
TL;DR: Abroad, there is a broad disaffection with pro-market reformism, which runs from disappointment at privatization and marketization in Russia to the return of leftist populism in Latin America, as discussed by the authors.
Abstract: HISTORICAL WATERSHEDS ARE clearly perceived only in retrospect, but we can at least speculate that we are crossing one now. After two decades in which Anglo-Saxon political economy has been dominated by deregulation, privatization, and faith in the magic of the market, we may be entering a period when free-market wisdom is no longer conventional. If we are lucky, a new consensus may form around a slightly different principle: We will celebrate free competition rather than free markets, and we will recognize that promoting competition may frequently require departure from the principle of laissez-faire. If we are less lucky, we may face a more sweeping backlash against enterprise, with high costs to our prosperity. The signs of this watershed are scattered, but they are too numerous to ignore. Abroad, we face a broad disaffection with pro-market reformism, which runs from disappointment at privatization and marketization in Russia to the return of leftist populism in Latin America. Within the United States, market advocates have been on the defensive, arguably, since the summer of 2000, when California's first electricity blackouts called into question the deregulation of the energy sector. Around the same time, the telecommunications sector began its dramatic meltdown, culminating two years later in the bankruptcies of WorldCom, Global Crossing, and more than 150 less famous participants in the experiment unleashed by the 1996 telecoms deregulation law. The only healthy telephone companies--the heirs to the local Bell firms--face their own set of questions: Despite deregulation, they have managed to retain a stranglehold on the local residential market, with the result that the lower prices promised by deregulation have not materialized. Meanwhile the airline industry, deregulated with huge success in 1978, has faced periodic questions. 
Customer frustration with delays fueled calls for a passengers' "bill of rights" in the late 1990s; mergers among airlines have caused some economists to fear for the price cuts that deregulation has delivered; recently the bankruptcy of U.S. Airways and United, along with the precarious finances of other airlines, has raised the question of whether further consolidation is inevitable. The health sector has faced calls for tougher regulation too, with state legislatures tying red tape around the health maintenance organizations that seek to restrain medical costs. All that's before you consider the twin shocks of terrorism and the outcry over corporate governance following the bankruptcy of Enron. In the aftermath of September 11, Americans turned instinctively to government for solutions--not just to hunt down al Qaeda but also to create the economic adjustments apparently required by the new circumstances. The airline industry, weak even before September 11 and now positively reeling, was immediately promised $5 billion in cash and $10 billion in loan guarantees by Congress. The responsibility for security at airports was transferred from the private sector to the government. The Bush administration advocated the creation of a government insurer to underwrite terrorism risks. There were efforts to strengthen government surveillance of all kinds of communication and mobility--emails, money transfers, commercial shipments across borders--to the point that some commentators wondered whether globalization might be threatened. The aftermath of Enron brought a similarly broad response. After considering the string of failures that allowed the firm's chief financial officer to steal $45 million from shareholders, Congress passed soup-to-nuts legislation that President Bush signed in July 2002. The new law strengthens government oversight of auditors. It ends seven decades during which auditors have been allowed to write their own professional standards. 
It regulates the kinds of consulting services that accounting firms can peddle. It obliges Wall Street's stock analysts to declare the conflicts of interest created by their firms' investment-banking activities. …