
Showing papers in "Policy Review in 2006"


Journal Article
TL;DR: FEMA is like a patient in triage, as mentioned in this paper: the agency has been criticized for a lack of clear measurable objectives, adequate resources, public concern, or official commitments, and poor coordination among federal, state, and local governments has been identified as a major obstacle to disaster response.
Abstract: FOLLOWING A DEVASTATING hurricane, the Federal Emergency Management Agency is in crisis. Should it be abolished? Should emergency management responsibilities be given to the military? Returned to the states? Consider the descriptions from a post-disaster report: Prior to the hurricane, "relations between the independent cities ... and the county government were poor, as were those between the county and the state.... After the disaster these relations did not improve, which impeded response and recovery efforts." When the hurricane first made landfall, the country initially reacted with a "sense of relief" because the "most populated areas had been spared the full brunt of the storm." After a few days, however, it became apparent that flood waters would swamp both urban and rural areas, leaving thousands without power, food, water, or the possibility of evacuation. Compounding disasters--wind, floods, communications and power failures--led to catastrophe. While a large state might have had the resources to respond quickly, small states were overwhelmed. "They [small states] usually cannot hold up their end of the [federal-state] partnership needed for effective response and recovery." The severity of the disaster called into question the entire enterprise of federal involvement in natural hazard protection. "Emergency management suffers from ... a lack of clear measurable objectives, adequate resources, public concern or official commitments.... Currently, FEMA is like a patient in triage. The president and Congress must decide whether to treat it or to let it die." (1) The above criticisms pertain not, as one might expect, to FEMA's recent woes in New Orleans following Hurricane Katrina but to the agency's dilatory response to Hurricane Andrew, which devastated South Florida in 1992. After Andrew, Congress gave the agency an ultimatum: Make major improvements or be abolished. With the advice of the emergency management profession and an enterprising director, James Lee Witt, the agency underwent one of the most remarkable turnarounds in administrative history. In 1997, Bill Clinton called it one of the "most popular agencies in government." FEMA was well regarded by experts, disaster victims and its own employees. By 2005, however, the agency had once again fallen into ignominy. Before issuing more cries for radical change at FEMA, reformers should look to the lessons of the agency's reorganization in the 1990s, which focused on natural disasters rather than national security. Its turbulent history shows that while the agency can marshal resources for natural disasters and build relationships with states and localities, it lacks sufficient resources to take on too many tasks. Today, FEMA faces a protean terrorist threat and an increasing array of technological hazards. To address contemporary threats, the agency must hone its natural disaster expertise and delegate authority for disaster response to states and localities. True, delegation runs the risk of returning to the days of ad hoc disaster preparedness, when government poured money into recovery without reducing vulnerability to disasters. Nevertheless, decentralizing response functions is the best way to prepare for an increasingly complex array of disasters, as the risks and strategies for recovery for different kinds of disasters vary so dramatically from region to region. 
The birth of emergency management WHEN VIEWED AGAINST the history of emergency management, the success FEMA enjoyed in the 1990s was the exception, not the rule. For most of the century, states and localities rushed to the aid of disaster-stricken citizens, and the federal government helped overwhelmed communities with recovery. As a result, the federal government sent supplies, surveyors, and money to help rebuild the same regions over and over again. Federal involvement dates from at least the San Francisco earthquake of 1906, which registered an estimated 8. …

29 citations


Journal Article
TL;DR: The authors argue that it is not nationalism or even religious animosities that explain the current violence in Iraq, but rather oil, its geographical distribution, and the loss of its political control by the Sunni minority.
Abstract: HAILED AS THE key to the solution of poverty, corruption, bad governance and, last but not least, terrorism, spreading democracy around the globe has become the centerpiece of U.S. foreign policy since 9/11. Unfortunately, however, this enterprise is at risk because most of our policymakers have a poor understanding of the economic and institutional landscape that is most favorable to the extension of political liberties and free elections. The experience of the past two years in Iraq shows that simply removing the old governing elite and holding elections is unlikely to suffice to establish a peaceful democracy. On the other hand, realizing that political freedom will not happen at a snap of our fingers should not draw us into holding a fatalistic, and equally mistaken, view that democracy is impossible in that region of the world. There does not seem to be anything inherent in Islam or even Arab culture that blocks the introduction of free elections in the Middle East. It is worth remembering that not many years ago Catholicism was (wrongly) believed to thwart liberal institutions in Southern Europe and Latin America. To make sense of the plight of Iraq, the Middle East and, for that matter, broad swaths of the developing world, we must understand first the nature of the democratic game. Democracy is a mechanism of decision in which, to a large extent, everything is up for grabs at each electoral contest. The majority of the day may choose to redraw property rights or alter the institutional and taxation landscape, thus dramatically reorganizing the entire social and economic fabric of the country. Hence, democracies survive only when all sides show restraint in their demands and accept the possibility of losing the election. In the Middle East, where inequality is rampant and wealth often derives from well-defined and easy-to-expropriate assets such as oil wells and other mineral resources, democracy poses an undeniable threat to those who profit from the authoritarian status quo. Not surprisingly, the minority in control of the state will be relentless in opposing the introduction of free elections. Thus, it is not nationalism or even religious animosities that explain the current violence in Iraq--but rather oil, its geographical distribution, and the loss of its political control by the Sunni minority that monopolized the state until two years ago. This diagnosis has very straightforward implications for any democratization strategy. Since the absence of democracy in the Arab world and, for that matter, in regions such as Africa and Central Asia derives from a particular distribution of wealth and power, this distribution must first change (or be changed) for democracy to flourish. This in turn means that democratization is possible everywhere--that is, there are no inherent cultural, psychological, or national-character reasons that block the attainment of political freedom anywhere. But it also means that its success is much harder than many wish to believe. Idealists versus realists BROADLY SPEAKING, THERE are today two competing schools of thought on the underlying forces that have pushed for and delivered democracy around the world--and which, for the sake of brevity, we may choose to label as "idealist" and "realist." Both are, however, mistaken. Idealists, who currently seem to enjoy the media's ear, explain today's democratic momentum in a way that is strikingly similar to how past democratization waves were portrayed by their contemporary publicists. 
The triumph of democracy, the argument goes, is rooted in a universal yearning, intimately connected with the best part of human nature, that should lead to blossoming liberal institutions once we knock down the stifling cliques and institutions of the past. To support their claim, they point to the impressive strides made by democracy in recent decades. By 2000 there were around 100 democracies--almost twice the number in 1989 and about three times as many as there were just after World War II (see Figure 1). …

19 citations


Journal Article
TL;DR: In this paper, the authors examine how China is attempting to secure the fuel supplies it needs and to reflect on the implications this may have for the international community as a whole, including the United States.
Abstract: CHINA'S JOURNEY IN just 25 years from the periphery to the center of the world economy is truly phenomenal. It took both Britain and the United States far longer to achieve the share of global output and trade that China has today. China's huge appetite for energy and resources--it is the world's No. 1 consumer of coal, steel, and copper and as an oil and electricity consumer is second only to the U.S.--has contributed to the soaring of prices on global oil and commodity markets. At the same time, cheap Chinese goods of high and ever higher quality are flooding world markets and endangering jobs both in industrialized countries in the North and developing countries in the South. China features increasingly often on the agenda of U.S. congressional hearings and is an issue for politicians in European capitals as well as those of many developing countries. The Chinese challenge is of global dimensions. When Japan aspired in the late 1980s to take over from the U.S. as the world's leading economic player, Washington and Brussels lost no time in responding. The fact is, however, that China's growing economic and political clout poses a far greater challenge to the international status quo than Japan's ambitions, which by the early nineties had suffered a severe setback. The rise of China has a global impact in a host of different ways and requires in-depth analysis. The purpose of this article is to examine how China is attempting--notably through an "energy-driven foreign policy"--to secure the fuel supplies it needs and to reflect on the implications this may have for the international community as a whole. Growing demand for energy IN TERMS OF energy consumption, the People's Republic of China is now second only to the United States. Its growing appetite for energy is the product of the country's 25-year-long economic boom, which has seen expanding external trade, rising incomes, a growing population, and increasing urbanization. Demand for energy has soared across the whole spectrum--coal, oil, gas, electricity, hydropower, and other renewables, as well as nuclear power. Thanks to its own vast reserves, coal is currently China's No. 1 fuel and supplies two-thirds of its energy needs. The rapid pace of economic growth has led to spiraling demand, however, for oil in particular. Following the government's decision to expand natural gas production, gas is likely in future to play a larger role in meeting the country's energy needs. With dependence on fuel imports clearly set to increase, the government is making strenuous efforts to enhance security of supply. Beijing's biggest worry at present is the security of oil supplies. Until recently China was self-sufficient in oil and into the early nineties even exported limited quantities. The country first started importing oil in 1993, and since then imports have risen steeply. Since the mid-sixties China has been Asia's largest oil producer, producing in recent years some 3.5 million barrels per day (BPD). Despite the expansion of production, demand has continually outstripped supply. From 1984 to 1995 demand leaped from 1.7 million to 3.4 million BPD, and by 2005 it had doubled again to 6.8 million BPD. China overtook Japan as the world's second largest oil consumer in 2003 and is now the third largest oil importer after the United States and Japan. Today China imports over 40 percent of its oil supplies.
In response to this situation, China's leaders have launched an all-out program of domestic reform as well as a global import security strategy. Their aim is to keep production going on the traditional oilfields in the northeastern part of the country while at the same time expanding production in western China (a "stabilize-the-East, develop-the-West policy"). High priority is being given to developing offshore oil fields in both the South China and East China Seas, although results so far have been unimpressive. The country's own oil industry has been repeatedly restructured to make it more competitive and efficient as well as to allow a greater measure of price regulation through the market. …

19 citations


Journal Article
TL;DR: The relationship between the United States and India has never been stronger and will grow even closer in the days and years to come, in the words of President George W. Bush on the morning of July 18, 2005, spoken while standing next to Indian Prime Minister Manmohan Singh, as mentioned in this paper.
Abstract: FOR THOSE WHO missed the symbolism of Indian flags draped from the White House's Old Executive Office Building, President George Bush's words on the morning of July 18, 2005, while standing next to Indian Prime Minister Manmohan Singh, drove home an emerging reality with trademark pithiness: "The relationship between our two nations has never been stronger, and it will grow even closer in the days and years to come." Combined with the Bush administration's visible push to strengthen Japan's hand in managing Asian security, the Indian prime minister's visit to Washington cemented a growing de facto strategic partnership between the United States and India. Numerous American officials have already used the term "irreversible" to describe the course of Indo-U.S. relations. No U.S. president visited India between January 1978 and March 2000, when President Clinton made a historic trip to the Subcontinent. Cabinet-level exchanges have since become routine, and President Bush's planned visit in early spring 2006 will reflect an agenda that has come to encompass shared global interests and concerns ranging from Iran and China to nuclear cooperation and biotechnology. Some have begun to see Bush's visit to India as similar, in both intent and consequence, to that of Richard Nixon to China in 1972--which transformed Sino-U.S. relations and the global balance of power for the next three decades. Given the bilateral tensions over nuclear proliferation in the 1990s, such strong relations are in themselves remarkable. When viewed through the prism of geopolitical shifts, however, Indo-U.S. alignment is if anything long overdue. American military and diplomatic movements from the Middle East through Central Asia to the Pacific Rim are in a state of flux for reasons ranging from the Iraqi insurgency to the Iranian nuclear crisis to the rise of vocal new regional institutions such as the Shanghai Cooperation Organization and East Asian Community. Asia, where two-thirds of the world's population resides, is the new geopolitical stage. It is the principal source of the global power shift and will also face most of the political consequences. Yet the constantly shifting loyalties and alliance patterns in Asia confound both historians and experts in geometry. There is the patron-client dyad from Beijing to Islamabad, routine Russian-Chinese-Indian summitry with declarations affirming the need for multipolarity, joint Russo-Japanese and Sino-Russian military maneuvers, talk of a three-cornered nuclear calculus in the U.S.-China-India triangle, and America's attempt to transcend its historical "tilting" between India and Pakistan. The only clear inference from these asymmetrical configurations is that most Asian states continue to subscribe to an adage common to their cultures: to be polite especially to one's enemies. While all Asian powers are wary of American preponderance, they have also sought good relations with Washington. None of them was at the forefront of the worldwide criticism (led by Europe) of the American occupation in Iraq. Historically, the U.S. has viewed the Middle East and Pacific Rim theaters as separate policy realms, with India falling in between and viewed through the exclusive prism of South Asian politics. But India lies at the crossroads of Asia, a factor which was at the heart of British policy towards the East.
Only after the Second World War and the partition of the Subcontinent was India's position weakened, a shift accentuated by India's socialist and inward-looking policies. Yet as India's weight grows in the international system, it can become a strong anchor in support of America's ambition to pursue a liberal order across Eurasia. Indeed, if the U.S. should welcome the emergence of any one Asian power, it should be India, which shares America's concern over the spread of Islamic fundamentalism, sub-state nuclear proliferation, and China's ambitions. Furthermore, each Indian election entrenches its status and credibility as the world's largest democracy, and its growing economic clout and diaspora presence in the U. …

19 citations


Journal Article
TL;DR: In the wake of the 2003 arrest of Mikhail Khodorkovsky, and the subsequent seizure of the Yukos oil company, democracy in Russia has entered a free fall as mentioned in this paper, and it is obvious that the U.S. administration badly misread the implications of the nationalization of "strategic industries" and the concomitant return of state-directed show trials to Moscow.
Abstract: SINCE THE ARREST of Mikhail Khodorkovsky on October 25, 2003, and the subsequent seizure of the Yukos oil company, democracy in Russia has entered into free fall. It is obvious in hindsight that the Bush administration badly misread the implications of the nationalization of "strategic industries" and the concomitant return of state-directed show trials to Moscow. In the shorthand of Washington, President Bush and Secretary of State Condoleezza Rice continued to "see into the soul" of President Vladimir Putin even as the erstwhile KGB colonel was rounding up the remnants of civic society in Russia and laying the groundwork for economic and political aggression against Russia's neighbors. Despite statements at the time by President Putin and Minister of Defense Sergei Ivanov that the government intended to recover the great power status of which Russia had been unfairly deprived by the regrettable collapse of the Soviet Union, both Washington and European capitals refused to see any geopolitical calculation in the methodical suppression of political and press freedoms and the takeover of the economy by FSB officers and Kremlin cronies. Russia apologists on the National Security Council staff and in American academe argued that what appeared to be, if not exactly the return of the Stalinist state, at least a lurch in the direction of czarist authoritarianism, was perfectly susceptible to reasonable explanation. These transitory phenomena were attributable to the affinity of the Russian people for "order," they said, or to the pent-up resentment of the Russian narod against their predominantly Jewish oligarchic oppressors, or to the residual humiliation the voters felt as a consequence of our ill-considered decision to "win" the Cold War. Nothing could convince Western decision makers that something more sinister and more consequential might be underway. This was the counsel of "realism." Opinion differs as to precisely when the propensity to see Putin as a necessary partner gave way to the realization that a dangerously revanchist state was on the rise in Europe's East. For Russian civic society activists it was clear that the relatively brief period of liberalism, Moscow's fleeting Prague Spring, had ended in the course of 2004 with the suppression of independent journalists, the wholesale replacement of elected regional governors with Kremlin appointees, and the initial attacks on nongovernmental organizations. For the democracy community, the Orange Revolution in Ukraine during the winter of 2004-05 was the watershed event. The repeated assassination attempts on Victor Yushchenko, the blatant involvement of Russian security services in Ukrainian politics, and the provision of massive financing and a troop of political advisors to Moscow-controlled parties presaged the reappearance of a new Comintern in the former Soviet Union. For Europe as a whole, it was the Ukrainian gas crisis in January 2006 that convinced its leaders that a resurgent and menacing Russia had stolen a march on the West and now threatened the independence of Europe's East with its newly developed energy weapon. With its influence and financial power, the Russian state could reach the German chancellor and into the inner circle around President Bush. It is not clear when, in the course of the destruction of Russian democracy and the rise of post-Soviet authoritarianism, the U.S. administration realized that its Russia policy had failed.
For most of 2005, those senior officials who would discuss the subject pointed to the constructive role that Russia was likely to play in negotiations with North Korea and subsequently in the crisis over Iran's nuclear pretensions. When these partnerships proved ephemeral, America's Russian hands fell back on the vaguer argument that Putin was likely to moderate his anti-democratic behavior at home and his growing imperial tendencies abroad as the all-important G-8 summit on July 15-17 in St. …

17 citations


Journal Article
TL;DR: This essay evaluates some of the studies routinely put forth as evidence of harmful discrimination and indicates that race-related variables, such as geography and socioeconomic status, shine important explanatory light into the recesses of the treatment gap.
Abstract: Two 50-YEAR-OLD men arrive at an emergency room with acute chest pain. One is white and the other black. Will they get the same quality of treatment and have the same chance of recovery? We would hope so, but many experts today insist that their race will profoundly affect how the medical-care system deals with them and that the black patient will get much inferior care. Is this really true? And if so, why? Are differences in treatment due to deliberate discrimination or other (less invidious) factors? Interest in the determinants of minority health has grown considerably since the publication of the Report of the Secretary's Task Force on Black and Minority Health by the U.S. Department of Health, Education, and Welfare in 1985. The academic literature falls into two categories. One line of inquiry emphasizes overt or subtle racial discrimination by physicians. Research reports in this category assert that many physicians treat their white patients better than their minority patients on the basis of race alone. We call this the "biased-doctor model" of treatment disparities. The other line of research focuses on the influence of so-called third factors that are correlated with race. These factors can influence care at the level of the health system, the physician, or both. They include, for example, variations in insurance coverage (insured versus uninsured versus underinsured; public versus private health plans; profit versus not-for-profit health plans), quality of physicians, regional variations in medical practices, and patient characteristics (such as clinical features of disease or health literacy). Of course, it is possible that both of these mechanisms--"biased doctors" and "third factors"--could operate simultaneously. Yet it is the biased-doctor model that has acquired considerable weight in both academic literature and the popular press. It enjoyed a great boost in visibility from a 2002 report from the Institute of Medicine (IOM), part of the National Academy of Sciences. The IOM provides lawmakers with advice on matters of biomedical science, medicine, and health and issues high-profile reports written by panels of outside experts. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care was widely hailed as the authoritative study on health disparities. It concluded that the dynamics of the doctor-patient relationship--"bias," "prejudice," and "discrimination"--were a significant cause of the treatment differential and, by extension, of the poorer health of minorities. Much media fanfare greeted the IOM report, and virtually every story ran the triumphant remark of Dr. Lucille Perez, then president of the National Medical Association, which represents black physicians: "It validates what many of us have been saying for so long--that racism is a major culprit in the mix of health disparities and has had a devastating impact on African-Americans." In this essay, we evaluate some of the studies routinely put forth as evidence of harmful discrimination. We report evidence not considered by the IOM panel. These additional findings indicate that race-related variables, such as geography and socioeconomic status, shine important explanatory light into the recesses of the treatment gap. Without adequate controls for just such variables, it is simply not possible to distinguish care patterns that correlate with race from those that are due directly to race.
Indeed, as we will see, when researchers employ designs that control for more third factors, the magnitude of any race effect shrinks considerably, if it does not disappear altogether. Public health as civil rights FIRST, A BRIEF SKETCH of how we got here. Just before Christmas 2003, the Agency for Health Care Research and Quality of the U.S. Department of Health and Human Services (HHS) released the National Healthcare Disparities Report. It documented an all-too-familiar problem: the poorer health status of individuals on the lower rungs of the socioeconomic ladder and the fact that they often receive inadequate treatment compared to people with more resources and education. …

9 citations


Journal Article
TL;DR: In the OECD countries, the proportion of what is customarily called the "retirement age population" (65 years of age or older) will steadily swell, with the most rapid expansion occurring among those aged 80 or more as mentioned in this paper.
Abstract: OVER THE PAST decade and a half, the phenomenon of population aging in the "traditional" or already affluent OECD societies has become a topic of sustained policy analysis and concern. (1) The reasons for this growing attention--and apprehension--are clear enough. By such metrics as median age or proportion of total population above the age of 65, virtually every developed society today is more elderly than practically any human society ever surveyed before the year 1950--and every single currently developed society is slated to experience considerable further population aging in the decades immediately ahead. In all of the affluent OECD societies, the proportion of what is customarily called the "retirement age population" (65 years of age or older) will steadily swell, with the most rapid expansion occurring among those aged 80 or more. Simultaneously, the ratio of people of "working age" (the cohort, by arbitrary though not entirely unreasonable custom, designated at 15-64 years) to those of retirement age will relentlessly shrink--and within the working age grouping, more youthful adults will account for a steadily dwindling share of overall manpower. Whether these impending revolutionary transformations of national population structure actually constitute a crisis for the economies and societies in the industrialized world is--let us emphasize--still a matter of dispute. To be sure: This literal upending of the familiar population pyramid (characteristic of all humanity until just yesterday) will surely have direct consequences for economic institutions and structures in the world's more affluent societies--and could have major reverberations on their macroeconomic performance. Left unaddressed, the mounting pressures that population aging would pose on pension outlays, health care expenditures, fiscal discipline, savings levels, manpower availability, and workforce attainment could only have adverse domestic implications for productivity and economic growth in today's affluent OECD societies (to say nothing of their impact on the global outlook for innovation, entrepreneurship, and competitiveness). But a host of possible policy interventions and orderly changes in existing institutional arrangements offer the now-rich countries the possibility (to borrow a phrase from the OECD) of "maintaining prosperity in an aging society"--and in fact of steadily enhancing prosperity for graying populations. Today's rich countries may succeed in meeting the coming challenges (and grasping the potential opportunities) inherent in population aging--or then again, they may fail to do so. The point is that an aging crisis is theirs to avert--and they have considerable scope for so doing. In contrast to the intense interest currently accorded the issue of population aging in developed countries, the topic of aging in low-income regions has as yet attracted relatively little examination. This neglect is somewhat surprising, for over the coming decades a parallel, dramatic "graying" of much of the Third World also lies in store. The burdens of pronounced population aging, furthermore, are unlikely to be borne as easily by countries still poor as by countries already rich. Simply stated, societies and governments have fewer options for dealing with the problems imposed by population aging when income levels are low--and the options available are distinctly less attractive than they would be if income levels were higher.
Over the next generation, it seems entirely likely--indeed, all but inevitable--that a large fraction of humanity, peopling countries within the grouping often termed emerging market economies, will find themselves coping with the phenomenon of population aging on income levels far lower than those yet witnessed in any society with comparable degrees of graying. For such countries, the social and economic consequences of aging could be harsh--and the options for mitigating the adverse effects of population aging may be fairly limited. …

9 citations


Journal Article
TL;DR: The official poverty rate (OPR), as discussed in this paper, is the most commonly used measure of poverty in the United States and has served for the past four decades as the government's principal indicator of material deprivation and need.
Abstract: FOR WELL OVER a century, with ever-expanding scale and scope, the United States government has been generating statistics that might illuminate the plight of society's poorest and most vulnerable elements. From the beginning, the express objective of such efforts has always been to abet purposeful action to protect the weak, better the condition of the needy, and progressively enhance the general weal. America's official quest to describe the circumstances of the disadvantaged in quantitative terms began in the 1870s and the 1880s, with the Massachusetts Bureau of Statistics of Labor and the U.S. Bureau of Labor Statistics, and the initial efforts to compile systematic information on cost-of-living, wages, and employment conditions for urban working households in the United States. (1) U.S. statistical capabilities for describing the material well-being of the nation's population through numbers have developed greatly since then. Today the United States government regularly compiles hundreds upon hundreds of social and economic indicators that bear on poverty or progress on the domestic scene. Within that now-vast compendium, however, one number on deprivation and need in modern America is unquestionably more important than any of the others--and has been so regarded for the past four decades. This is what is commonly known as the "poverty rate" (the informal locution for the much more technical mouthful "the incidence of poverty as estimated against the federal poverty measure.") First unveiled in early 1965, shortly after the launch of the Johnson administration's "War on Poverty," the poverty rate is a measure identifying households with incomes falling below an official "poverty threshold" (levels based on that household's size and composition, devised to be fixed and unchanging over time). Almost immediately, this calculated federal poverty measure was accorded a special significance in the national conversation on the U.S. poverty situation and in policymakers' responses to the problem. Just months after its debut--in May 1965--the War on Poverty's new Office of Economic Opportunity (OEO) designated the measure as its unofficial "working definition" of poverty. By August 1969, the Bureau of the Budget had stipulated that the poverty thresholds used in calculating American poverty rates would constitute the federal government's official statistical definition for poverty. It has remained so ever since. (2) The authority and credibility that the official poverty rate (OPR) enjoys as a specially telling indicator of American domestic want is revealed in its unique official treatment. The OPR is regularly calculated not only for the country as a whole, but for every locality down to the county level and beyond--on to the level of the school district. (It is even available at the level of the census tract: enumerative designations that demarcate the nation into subdivisions of as few as one thousand residents.) Furthermore, U.S. government antipoverty spending has come to be calibrated against, and made contingent upon, this particular measure. Everywhere in America today, eligibility for means-tested public benefits depends on the relationship between a household's income and the apposite poverty threshold. In Fiscal Year 2002 (the latest period for which such figures are readily available), perhaps $300 billion in public funds were allocated directly against the criterion of the "poverty guideline" (the Department of Health and Human Services' version of poverty thresholds). 
(3) The poverty rate currently also conditions many billions of dollars of additional public spending not directly earmarked for anti-poverty programs: for example, as a component in the complex formulae through which community grants (what used to be called "revenue sharing") dispense funds to local communities. Given its unparalleled importance--both as a touchstone for informed public discussion and as a direct instrument for public policy--the reliability of the official poverty rate as an indicator of material deprivation is a critical question. …
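The mechanics described above reduce to comparing a household's income with a threshold keyed to its size and composition, and then counting the share of households that fall below. The following is a minimal, hypothetical sketch of that calculation; the threshold values and the helper names (is_poor, poverty_rate) are illustrative placeholders, not the official HHS guidelines or any figure from the article.

```python
# Hypothetical illustration of the poverty-rate mechanics described above:
# a household counts as poor when its income falls below the threshold for
# its size; the rate is the share of households so counted.
# Threshold values here are made up for the example, not official figures.
HYPOTHETICAL_THRESHOLDS = {1: 10_000, 2: 13_000, 3: 16_000, 4: 20_000}

def is_poor(income: float, household_size: int) -> bool:
    """True if the household's income is below its size-specific threshold."""
    return income < HYPOTHETICAL_THRESHOLDS[household_size]

def poverty_rate(households: list[tuple[float, int]]) -> float:
    """Share of (income, size) households counted as poor."""
    poor = sum(is_poor(income, size) for income, size in households)
    return poor / len(households)

# Two of these four hypothetical households fall below their thresholds -> 0.5
print(poverty_rate([(9_500, 1), (25_000, 2), (15_000, 3), (40_000, 4)]))
```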

8 citations


Journal Article
TL;DR: The U.S. and the European Union can be seen as two different species of the same fundamental genus, as mentioned in this paper: federative systems, which are attempts to create a basis for cooperation among independent republics.
Abstract: THE HISTORIAN OF the early American federal union recognizes an immediate affinity between America's founding and the contemporary European project. Both are instances of an attempt to create a federative system, to ensure and perpetuate a basis for cooperation among independent republics in a political milieu in which multiple loyalties, identities, and interests and the centrifugal forces they produce are the commanding political fact. Such a union must guard against the rival dangers of international anarchy and despotic centralization both within itself and within the larger society of states. This objective constitutes, as it were, its reason of state, the narrow path between Scylla and Charybdis it must follow. There are signal differences, of course, between the old U.S. and the new EU. But the comparison I wish to draw does not rest upon any kind of essential identity in institutions--these, of course, have been very different--but rather upon the fundamental similarity in the problem they confronted and the aspirations they entertained. The problem was how to find a basis for common action in a system of states prone to unilateral action, the aspiration to instantiate a mode of resolving disputes among themselves that would bind them into a system of perpetual peace. In their various discourses about power--of the danger of falling into anarchy among themselves or, conversely, of sacrificing state independence to unrepresentative and unaccountable government from afar--we can recognize a set of problematics that identify the old U.S. and the new EU as experiments of the same type, or rather as different species of the same fundamental genus. Though this image of America and Europe clasping hands across the centuries may be a pleasing one, suggesting a common civilizational project in which Europe has managed to pick up a baton long held by the United States, this thought occurs against the backdrop of an existential crisis within the West. Having considered itself a friend of the European experiment for most of the post-World War II period, and having advised, really from its infancy, that Europe should emulate America in adopting a federal constitution, the American government now looks upon the European project with hostile and suspicious eyes. We are a long way from the time, in 1962, when John Kennedy could proclaim that "the basic objective" of America's post-World War II foreign policy was to aid the progress of a "strong, united Europe." Perhaps the most important reason for this hostility is that the United States has undergone a movement very nearly the opposite of Europe's. If Europe has ended up where America began, the "world's only superpower" has by contrast assumed the mantle of the now deceased European monarchies that "felt power and forgot right," in Jefferson's phrase. Its role as the world's most powerful state has dramatically changed its conception of itself and marks an unmistakable advance in the long and bloody route from federal union to universal empire. The militarism that America's founders once railed against and that was symbolized for them by Europe's "nations of eternal war" seems ominously familiar to those who witness the warlike tendencies of the contemporary American state. The modern world prides itself on its novelty and is always looking for the "new, new thing"; this issue is very old. 
Europe objects to American policies in precisely the same terms with which Americans once denounced Europe's war system, and it embraces what Jefferson called "the opposite system" of "cultivating general friendship, and of bringing collisions of interest to the umpirage of reason rather than of force." America and Europe, then, have "switched places," undergone a great reversal, with each now standing for what the other once stood. As Robert Kagan has suggested, the foreign policy vision articulated by Europe's leaders today sounds much like that of America's eighteenth and nineteenth century statesmen, who extolled "the virtues of commerce as the soothing balm of international strife" and appealed "to international law and international opinion over brute force. …

7 citations


Journal Article
TL;DR: For instance, if the facts show that school choice improves test scores overall, it is unlikely that vague moral claims, like unfairness to teachers, will stop advocates of school choice from making substantial political gains as discussed by the authors.
Abstract: PUNDITS AND POLITICIANS alike complain that virulent partisanship and the excessive power of special interests distort modern democracy. As a result, it is difficult to elicit the consensus for policies that will promote the public interest. These are not new problems. In the early 1800s, for instance, Federalists and Democratic Republicans clashed sharply and vituperatively, disagreeing on such fundamental issues as whether creating a Bank of the United States was wise. Consensus periods of politics in American history have been few and far between. But our future politics is more likely to forge consensus than that of the past, because we are on the cusp of a golden age of social science empiricism that will help bring a greater measure of agreement on the consequences of public policy. The richer stream of information generated by empirical discoveries will provide an anchor for good public policy against partisan storms and special-interest disturbances, making it harder for the political process to be manipulated by narrow interests. Many great social scientists have understood that people are ultimately persuaded more by facts than by abstract theories. That is the reason that Adam Smith filled The Wealth of Nations with a wealth of factual observations to demonstrate the power of his ideas. It remains the case that if supporters of a policy can demonstrate that it leads to greater prosperity, the political battle is often half-won. It is true that facts alone cannot generate values, and thus no empirical evidence by itself can logically mandate support for a specific social policy. Smith's contemporary, the philosopher David Hume, himself made this clear with his famous "is-ought distinction." But politically, most people within modern industrial society adhere to a rather narrow range of values, at least in the economic realm. They favor more prosperity, better education and health care, and other such goods that make for a flourishing life. As to these issues, what is debated is which political program will in fact broadly deliver these goods. Empiricism has particular power in the United States, where a spirit of pragmatism limits the plausible boundaries of political debate. Republicans try to show that tax cuts will stimulate economic growth, while Democrats argue that the resulting deficits will impede it. Republicans argue that, in the long run, such tax cuts will raise the incomes of all. Democrats tend to disagree. Such consequential arguments are key to persuading the vast middle of American politics. For instance, if the facts show that school choice improves test scores overall, it is unlikely that vague moral claims, like unfairness to teachers, will stop advocates of school choice from making substantial political gains. Fortunately, we are at the dawn of the greatest age of empiricism the world has ever known. The driving force in the rise of empiricism is the accelerating power of information technology, often referred to as Moore's law. Moore's law--originated by Gordon Moore, one of the founders of Intel--is the now well-established rule that the number of transistors packed onto an integrated circuit doubles every 18 months. As a result, computer speed and memory have been doubling at approximately the same rates. Such exponential growth will persist for at least another 15 years. Many observers believe that new paradigms will continue the acceleration of computer power in the decades after silicon chip technology is exhausted.
The fruits of Moore's law will be not only ever fancier gadgets, but also an ever more informed policy calculus. The accelerating power of computers addresses what has always been the Achilles' heel of empiricism--its need for enormous amounts of data and huge calculating capacity. Pythagoras famously said "the world is built on the power of numbers." That is the slogan of empiricists as well, but processing these numbers requires huge computer power. …
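Since the argument leans on this doubling arithmetic, here is a minimal back-of-envelope sketch of what an 18-month doubling period implies over the 15-year horizon the author cites; it is an illustration under that stated assumption, not a figure from the article.

```python
# Back-of-envelope check (illustrative, not from the article): doubling every
# 18 months over 15 years gives 15 / 1.5 = 10 doublings, i.e. 2**10 = 1024x.
doubling_period_years = 1.5
horizon_years = 15
growth_factor = 2 ** (horizon_years / doubling_period_years)
print(growth_factor)  # 1024.0 -- roughly a thousandfold increase in capacity
```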

4 citations


Journal Article
TL;DR: This paper argues that while American higher education is the envy of the world, producing and maintaining research scientists of the highest caliber, administrators and faculty lack a coherent idea of what constitutes an educated human being, and it points out the disconnect between the requirements of liberal education and the express interests of parents, donors, professors, and students.
Abstract: I. Our university AN AUTO REPAIR shop in which mechanics and owners could not distinguish a wreck from a finely tuned car would soon go out of business. A hospital where doctors, nurses, and administrators were unable to recognize a healthy human being would present a grave menace to the public health. A ship whose captain and crew lacked navigation skills and were ignorant of their destination would spell doom for the cargo and passengers entrusted to their care. Yet at universities and colleges throughout the land, parents and students pay large sums of money for--and federal and state governments contribute sizeable tax exemptions to support--liberal education, despite administrators and faculty lacking a coherent idea about what constitutes an educated human being. To be sure, American higher education, or rather a part of it, is today the envy of the world, producing and maintaining research scientists of the highest caliber. But liberal education is another matter. Indeed, many professors in the humanities and social sciences proudly promulgate in their scholarship and courses doctrines that mock the very idea of a standard or measure defining an educated person and so legitimate the compassless curriculum over which they preside. In these circumstances, why should we not conclude that universities are betraying their mission? To be sure, universities and colleges put out plenty of glossy pamphlets containing high-minded statements touting the benefits of higher education. Aimed at prospective students, parents, and wealthy alumni, these publications celebrate a commitment to fostering diversity, developing an ethic of community service, and enhancing appreciation of cultures around the world. University publications also proclaim that graduates will have gained skills for success in an increasingly complex and globalized marketplace. Seldom, however, do institutions of higher education boast about how the curriculum cultivates the mind and refines judgment. This is not because universities are shy about the hard work they have put into curriculum design or because they have made a calculated decision to lure students and alumni dollars by focusing on the sexier side of the benefits conferred by higher education. It's because university curricula explicitly and effectively aimed at producing an educated person rarely exist. (1) Universities do provide a sort of structure for undergraduate education. Indeed, it can take years for advisors to master the intricacies of general curriculum requirements on the one hand and specific criteria established by individual departments and proliferating special majors and concentrations on the other. The Byzantine welter of required courses, bypass options, and substitutions that students confront may seem like an arbitrary and ramshackle construction. In large measure it is. At the same time, our compassless curriculum gives expression to a dominant intellectual opinion. And it reflects the gulf between the requirements of liberal education and the express interests of parents, donors, professors, and students. The dominant opinion proclaims that no shared set of ideas, no common body of knowledge, and no baseline set of values or virtues marking an educated human being exist. To be sure, the overwhelming majority of all American colleges adopt a general distribution requirement. 
(2) Usually this means that students must take a course or two of their choosing in the natural sciences, social sciences, and the humanities, with perhaps a dollop of fine arts thrown in for good measure. And all students must choose a major. Although departments of mathematics, engineering, and the natural sciences maintain a sense of sequence and rigor, students in the social sciences and humanities typically are required to take a smattering of courses in their major, which usually involves a choice of introductory classes and a potpourri of more specialized classes, topped off perhaps with a thesis on a topic of the student's choice. …

Journal Article
TL;DR: Despite the end-of-term, politically charged pre-election legislative bustle, there is no indication that Congress has any appetite to undertake systematic, comprehensive legislation with respect to counterterrorism policy, as discussed by the authors.
Abstract: TWO BRANCHES OF government have been hard at work in the war on terror these past years, even if they have not infrequently worked at cross-purposes. Executive agencies devise a warrantless surveillance program--and a federal judge declares it unconstitutional. Administration officials and federal bureaucrats devise rules for trying accused terrorists in military tribunals--and the Hamdan decision sends the tribunal drafters back to the drawing board. But where are the people's elected representatives in all of this? The Supreme Court has said that Congress has an indispensable role to play in establishing democratically legitimate policy in counterterrorism. Democratic theory tells us, moreover, that whatever actions the executive was able to undertake on its own authority in the immediate emergency of 9/11, and indeed whatever inherent powers it might permanently possess, in fact democracy is better off when the political branches work in concert to create a long term policy. So where is the legislation, passed by Congress and signed into law by the president, on the multiple topics that make up a full and complete counterterrorism policy for the country? It is true that, at this writing, Congress is finally coming to grapple with legislation proposed to regulate trials of enemy combatants and interrogation procedures. It is legislation of grave national importance. It is likewise true that Congress involved itself in counterterrorism policy last year in the Detainee Treatment Act of 2005 (the McCain amendment). But notwithstanding the importance of those issues, they are in fact narrow ones. They are merely compelled by the Hamdan decision, and far from the range or depth of issues necessary to establish what the country urgently needs prior to the end of the Bush second term, and what the Bush administration ought to have been working toward from the day its second term began--a long term, systematic, comprehensive, institutionalized counterterrorism policy for the United States. Unfortunately, despite the end-of-term, politically charged pre-election legislative bustle, there is no indication that Congress has any appetite to undertake systematic, comprehensive legislation with respect to counterterrorism policy. Nor is there any indication that the Bush administration has any desire to seek it. No one should mistake the energetic debate of this moment--debate in which, in any case, Democrats are not really taking part--for more than what it is. The legislative proposals of both the administration and its interlocutors hew narrowly to the Hamdan minimum--another skirmish, in other words, not over concrete policy in the war on terror, but instead in the never-ending, abstract, and finally metaphysical battle over the constitutional extent of presidential discretion. But what the administration especially fails to appreciate is that no matter who wins the 2008 presidential election or the 2006 midterms, there is not likely to be any coherent national counterterrorism policy past the end of the second Bush administration unless Congress takes steps to legislate it, institutionalize it, and make it long-term and indeed permanent.
The fissured Hamdan decision, while leaving open many momentous questions, at least makes one thing reasonably clear: Responsibility for democratically establishing the war on terror--its overall contours, its long-term legitimacy, its institutional form, its trade-offs of values of national security and civil liberties--today falls to the legislative branch. The most desirable policy result of Hamdan would be the acknowledgment that the judiciary has only a limited role to play in foreign policy and national security, and that it will discipline itself to stay out of such disputes. At the same time, the judiciary will stay out of these arenas because it knows that the separation of powers will remain in the two political branches of government operating actively as checks and balances upon each other. …

Journal Article
TL;DR: The authors discuss the claim that the United States is "addicted to oil" and argue that energy independence, understood as energy isolation, is a fatuous notion, since it suggests that vital parts of the nation's economy can be segregated from the rest of its trade patterns.
Abstract: WHEN HE TOLD Congress last winter that the United States is "addicted to oil," George W. Bush joined every president since Richard Nixon in calling for some measure of energy independence. But Bush's goals were more modest than his predecessors'. The centerpiece of his energy statement was a proposal to eliminate 75 percent of our Middle East oil imports by 2020. Since even today our imports from that region are less than a fifth of the total--our three largest suppliers are Canada, Mexico, and Venezuela--that would require minor adjustments. However, Bush's modesty may be warranted. In 1973, when Richard Nixon first called for energy independence, the United States imported about 25 percent of its annual oil needs; now it imports over 60 percent. Oil and gas imports now account for over $200 billion of the current trade deficit. Nevertheless, it's questionable whether energy independence has ever really been a national goal; it isn't even clear what the term is supposed to mean. Former CIA director James Woolsey, whose work with organizations like the Energy Future Coalition places him among the staunchest advocates of reducing our reliance on foreign oil, prefers to avoid the term "energy independence" altogether. If the phrase is meant to suggest that the United States can somehow isolate its energy future from everyone else's, then of course Woolsey is right--it is a fatuous notion. It counters the entire trend of globalization and suggests that vital parts of the nation's economy can be segregated from the rest of its trade patterns. But energy independence need not imply energy isolation. In fact, the United States had almost complete control of its own energy destiny until the 1960s even as its energy trade with the rest of the world was growing. America's excess capacity could be tapped when the country chose to alleviate oil shortages elsewhere in the world, and its enormous domestic market assured that periodic oil gluts were no menace either. The stubborn facts of geology and economics rule out the possibility of the United States' achieving energy independence based upon oil--or upon natural gas, since gas deposits are mainly found in the same areas. But can other energy sources replace oil in key economic sectors just as oil, in the past century, frequently replaced coal?
The iron age didn't end because the world stopped using iron, but because iron lost its relative importance. If the "oil age" can be brought to a similar conclusion, the country's economic power might again achieve parity with its geopolitical power. Of course, the reverse would also be true: If we don't find oil substitutes, a growing thirst for petroleum may sap our military, economic, and diplomatic strength. Energy-independence hawks claim that this is already happening. The vision of energy abundance is appealing, but scenarios for achieving that goal haven't captured the popular imagination. Since few Americans want to contemplate the opposite future--gasoline lines, brownouts, and reduced living standards--the politics of energy remain in a chronic state of denial. The ceremonial exhortations found in State of the Union speeches and election campaigns have lost their credibility as statements of policy. The past 35 years have made the public skeptical that government will ever address energy problems successfully--or will work effectively with markets toward that end. When President Jimmy Carter said, in 1977, that energy independence was "the moral equivalent of war," he had broad support for programs that both spurred an increase in fuel efficiency and jump-started research into alternative fuels. But he also confounded markets. Stuart Eizenstat, Carter's chief domestic policy advisor, later told an energy seminar that "one of the worst things I ever recommended to the president was in the midst of the Iranian crisis suggesting that we maintain price controls on gasoline at the pump. It was an absolute disaster …

Journal Article
TL;DR: "Denial," conventionally defined as the refusal to acknowledge painful realities, thoughts, or feelings, has migrated from psychology into political discourse, as discussed in this paper, and has been invoked with particular frequency in the years since the 9/11 attacks.
Abstract: IF THERE'S ONE all-purpose concept bestriding the Zeitgeist these days, "denial" has to be it. Conventionally defined as "the refusal to acknowledge painful realities, thoughts, or feelings," the term has long since migrated from psychology into politics, where its explanatory power in one case after another appears practically talismanic. To cite examples from fall 2006 alone, Karl Rove and other Republican strategists were repeatedly charged with "denial" for their rosy assessments of mid-term election prospects; Senator George Allen stood accused of being in "denial" about his (Jewish) ethnic background; conservatives in the wake of the Mark Foley scandal were said to be in "denial" about the influence of gays in the Republican party, even as Foley himself was said to be in "denial" about his sexuality; and numerous books and articles marking the fifth anniversary of 9/11 argued for hanging the scarlet D variously on former President Clinton (for failing to kill Bin Laden), Secretary of State Condoleezza Rice (for failing to take the threat of Bin Laden seriously enough), and various other members of the Bush administration, including the president himself. Finally, as if to confirm denial's standing as the it-word of our time, came Bob Woodward's bestselling book about the Iraq war, State of Denial (Simon and Schuster). Its thesis--that Bush and his Cabinet had purposefully, if not perhaps always consciously, misled Americans and even themselves by refusing to acknowledge the severity of the problems in Iraq--resonated not only in the reflexive anti-Bush media venues that repeatedly showcased it, but also with other Americans increasingly skeptical about the war. Political particulars aside, the ubiquity of that word "denial" is worth pausing over. It connotes that we live in an era of unreality, perhaps even sur-reality, in which what is said in public is at odds with what is true--a shortfall invoked now more or less constantly as a feature of political discussion. And so to the obvious question: Why do so many Americans apparently share the sense that we are all being misled, one way and another, about political reality--and not only about reality in Iraq, but about politics more generally? I believe the answer to that question is the obvious one: because in some deep sense, it is true. This is not meant to affirm that every current charge of "denial" now circulating is a valid one. It's rather to suggest that the sheer volume of such charges reflects a deeper, underlying truth about the untethering of some current political ideas from firm reality. This is the deeper territory that the ubiquity of that term "denial" invites us to plumb. One way to begin is to survey the main intellectual and political currents since 9/11, which investigation yields a fact both unexpected and significant. As it turns out, a flight from political reality has indeed been underway on both the left and the right in America in the years since that event, as well as accelerating into more advanced forms in much of Europe. To switch metaphors, in the wake of the 9/11 attack--and later, related Islamist attacks on civilians, most notably in Spain and Britain--many Western observers have responded not by absorbing what we now know to be true about our world, but rather by transposing those brute facts into other, safer, more familiar keys. 
One result of that transposition, the record shows, has been the creation of a world of political scapegoats for the unease and anxiety that are the unwanted companions now of Westerners everywhere. These scapegoats, perverse non-explanations for what really ails us, can be identified by features common to the breed everywhere: The passion invested in them by their antagonists is disproportionate to any real problem the scapegoat represents; they are invoked to explain more about the world than they do; they capture some part of the truth, i.e., have a degree of verisimilitude without which a scapegoat cannot exist; and--also like scapegoats everywhere--they pose no threat of retaliation for their overburdening. …

Journal Article
TL;DR: The author argues, as discussed in this paper, that although going to war in Iraq was a grave mistake of policy, an immediate pullout--or one by a declared date moderated by conditions on the ground--would not today serve US interests.
Abstract: TO CONCEDE THAT going to war in Iraq was a grave mistake of policy is not to embrace the conclusion that an immediate pull-out--or one by a declared date moderated by conditions on the ground--would today serve US interests. The country may have entered the war with erroneous notions of the state of Saddam's WMD programs. It may have underestimated the resilience of former Baathists and regime loyalists, their access to weapons and the help they would get from foreign jihadists. It may have failed to anticipate that a society divided and oppressed by an authoritarian ruler might erupt into ethnic and religious conflict when that leader departs. It may have been naive in thinking that an externally modeled Iraqi democratic government would opt for secular rather than sectarian parliamentary representation and that its near perfect transition would transform the region into a galaxy of democratic states. And it may have underestimated the number of troops needed to occupy a country of 25 million. Yet the answer is not to compound those mistakes by leaving in a way that makes large-scale civil war nearly inevitable, pushes the country into the lap of its Iranian neighbor, or advertises the US as an unreliable friend, a hesitant hegemon, and a rewarder of those terrorists with the tenacity to outlast the behemoth. No, when a Great Power puts its leg in a snare, there must be some cure other than amputation. I spent most of August in Iraq assessing the situation on the ground with the help of US military, intelligence, and diplomatic personnel, Iraqi journalists, translators, contract workers, and, in one case, a senior government official. In addition to day trips in and around Baghdad, I spent two days in the disputed Kirkuk area, a place where the character of the new Iraq may well be defined. I had the feeling during much of my visit that the time of illusion had passed. No longer could failure be disguised as the invention of lazy journalists confined to the secure Green Zone or their biased editors back home. With Baghdad's sixty-mile circumference defining one of the world's largest killing zones and Anbar Province in the Sunni heartland locked in a bloody stalemate, no one was likely to be impressed by fallback reiterations that fourteen of the country's eighteen provinces were secure, averaging fewer than two violent incidents per day, or that three-fourths of the country receives more hours of electricity per day than it had before the war. All eyes were (justifiably) on Baghdad, and the capital's slide into chaos showed no signs of reprieve. In an act of arrogant complacency worthy of Marie Antoinette, the Council of Representatives went on more than a month's holiday, ignoring US pleas to remain in emergency session at least until the situation in the capital improved. But with US public opinion turning sharply against the war, soon to be reflected in a landscape-changing election for control of Congress, pressure to produce visible progress on the ground weighed heavily on the minds of Coalition military and diplomatic leaders. "The center of gravity for both Americans and Iraqis right now is something hard to measure: Time," observed Colonel David Gray, commander of the 1st Brigade, 101st Airborne Division, operating primarily in the Kurdish North. "How much time and perseverance do we have?" 
The consensus among those I interviewed was that if a dramatic reversal in the country's fortunes could not be demonstrated within the next 6-12 months, the call to establish a deadline for withdrawal would become a cacophony of irresistible demands. The Coalition, however, neither bears the burden nor enjoys the luxury of singular command. The Iraqis have both sovereignty and international legitimacy, and with them all the trappings, if not the capability, of a full-fledged government. More and more, the big decisions that will make or break the country fall into Iraqi hands--reconciliation, federalism, the divvying up of oil revenues, the status of Kirkuk, and the future of sectarian militias. …

Journal Article
TL;DR: As discussed in this paper, projections from the Congressional Budget Office and independent budget analysts show federal spending, which stood at about 20.3 percent of gross domestic product (GDP) at the end of George W. Bush's first term, roughly doubling to about 40 percent of GDP by the middle of the twenty-first century.
Abstract: FEDERAL SPENDING ROSE from about 18.5 percent of gross domestic product (GDP) at the end of the Clinton administration to 20.3 percent by the end of George W. Bush's first term--during the watch, that is, of a Republican president and a Republican Congress. Of course, much of this increase is in defense spending and homeland security. But President Bush has not chosen guns at the expense of butter: He has opted for both. He did not veto even a single spending bill. And real (that is, inflation-adjusted) domestic discretionary spending, not counting homeland security, rose by an annual average of 4.8 percent over his first four years in office. (1) Someone who favors relatively small government could get awfully depressed looking at these numbers. One does not get less depressed contemplating the spending increases that are projected over the next 45 years. Credible estimates from the Congressional Budget Office and from independent budget analysts show federal spending doubling as a percent of GDP by the middle of the twenty-first century, reaching about 40 percent of GDP (see Figure 1). Yet some historical constants and some facts about Americans' views on taxes incline this observer to believe that federal spending will come nowhere close to 40 percent of GDP by mid-century. [FIGURE 1 OMITTED] Before we turn to the good news, let us consider the bad news: the news on spending. Defense spending rose by $161 billion between fiscal years 2001 and 2005, an increase of $117 billion in 2001 dollars. This was large, and yet it took defense spending from a postwar low of 3.0 percent of GDP in 2001 (tied with 1999 and 2000) to 3.8 percent of GDP. Even if the U.S. government maintains a strong military presence in the world, which seems likely, it can do so with less than 4 percent of GDP. To see where spending is projected to grow substantially as a percentage of GDP, therefore, we must look elsewhere. The three programs accounting for most of this increase are projected to be Medicare, Medicaid, and Social Security. Medicare and Medicaid spending together are credibly predicted to be about 21 percent of GDP by 2050, and Social Security spending is expected to equal about 6 percent of GDP by 2050. All three are driven by demographics--the aging of the U.S. population--and the first two are also driven, ironically, by improvements in health care. Consider Social Security first. Social Security spending, which is now about 4.2 percent of GDP, is likely to be 6.2 percent of GDP by the middle of the twenty-first century unless changes are made in the program. This is due to two main factors: 1) the retirement of the baby-boom generation and 2) the increasing life expectancy of the elderly. The two factors together mean that the fraction of people aged 65 or older will rise from 12 percent of the population in 2000 to 19 percent in 2030. The working-age population, by contrast, is projected to fall from 59 percent to 56 percent. (2) Based on this, the Social Security trustees project that the number of workers per Social Security recipient will decline from about 3.3 in the early 2000s to 2.2 in 2030. (3) Of course, substantially increased immigration of younger people or a significant decline in life expectancy of the elderly could slow this trend, but absent that, these population numbers are fairly firm. 
And absent a change in that ratio, absent policy changes in Social Security (more on that later), and absent a substantial increase in the growth of productivity, the increase in Social Security to about 6 percent of GDP is also fairly firm. The scarier numbers are in Medicare, the federal government's socialized medicine program for the elderly, and Medicaid, the program for the poor and near-poor: Not only is the number of people enrolled in these programs increasing, but spending per person has also increased and will likely continue to do so. Since 1967, the first full year of Medicare spending, spending has risen from 0. …
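The Social Security arithmetic quoted in the abstract above can be reproduced with a minimal sketch. The Python snippet below assumes (a simplification, not a claim from the article) that outlays as a share of GDP scale with the ratio of beneficiaries to workers, i.e., that average benefits roughly track average wages; under that assumption, the trustees' worker-per-recipient figures alone account for the projected rise from about 4.2 to about 6 percent of GDP.

```python
# Rough consistency check of the Social Security projection, using the
# worker-per-recipient figures cited in the abstract. Assumes (simplification,
# not stated in the article) that outlays/GDP scale with beneficiaries per worker.

current_spending_share = 0.042    # Social Security outlays, ~4.2% of GDP today
workers_per_recipient_now = 3.3   # early 2000s, per the trustees
workers_per_recipient_2030 = 2.2  # projected for 2030

scaling = workers_per_recipient_now / workers_per_recipient_2030
implied_share = current_spending_share * scaling

print(f"Implied spending share around 2030: {implied_share:.1%}")  # ~6.3% of GDP
```

The result, about 6.3 percent of GDP, is close to the 6.2 percent figure the abstract attributes to the projections, which is the sense in which those numbers are described as fairly firm.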