
Showing papers in "Policy Review in 2009"


Journal Article
TL;DR: In the run-up to the 2000 presidential election, candidate George W. Bush and his advisors made a strategic decision to appropriate educational rhetoric generally associated with Democrats and the left.
Abstract: IN THE RUN-UP to the 2000 presidential election, candidate George W. Bush and his advisors made a strategic decision to appropriate educational rhetoric generally associated with Democrats and the left. This decision helped Bush present himself as a "different kind of Republican" and a "compassionate conservative" and to dramatically narrow the Democrats' traditional advantage on education, particularly among suburban women. This was critical in helping to win the election. But Bush didn't stop there. During his eight years in office, he would ultimately upend decades of conservative thinking on education, open the door to new spending and federal activity, and swap conventional conservative themes for language borrowed from the civil rights community. The administration's assault on the racial achievement gap--the huge disparity in test scores between white and Asian students and their black and Hispanic peers--through the No Child Left Behind Act (NCLB) earned plaudits from many on the left and right. Over time, however, this approach has alienated suburban parents, who worry that NCLB's emphasis on low-achievers and low-level skills is harming their children and schools. In this way and others, partnering with the left on education reform has imposed real costs even as it has paid substantive dividends. First, the dividends: Simply put, the education-reform movement has seen progress during the Bush years that was once unimaginable. The number of students attending charter schools has nearly tripled, from 430,000 in 2000 (1) to 1.2 million in 2008. (2) The number of teachers coming into the classroom through alternative certification grew from 20,000 in 2000 to 59,000 in 2005. (3) The number of districts experimenting with merit pay has climbed into the hundreds--from essentially zero in 2000. (4) And a bona fide school-reform constituency has been born. Ten years ago, Teach For America (TFA) was sui generis and little-known. Today, it's a powerful international brand complemented by new ventures like The New Teacher Project, New Leaders for New Schools, and High Tech High Ed School, all of which work to recruit and train talented people as teachers. High-achieving charter-school providers--such as KIPP, Uncommon Schools, Achievement First, Aspire Public Schools, and the Green Dot Public Schools--are many and multiplying. TFA alumna Michelle Rhee is breaking china as schools chancellor in Washington, D.C., and throw-out-the-rulebook superintendents such as Joel Klein in New York City and Arne Duncan in Chicago (Duncan is, as of this writing, U.S. Secretary of Education-designate) owe copious credit to the political cover, reform muscle, and abundant data that characterize the NCLB era. Most significantly, Bush's efforts helped midwife a group of reform-minded Democrats now willing to do battle with their traditional allies, the teachers' unions, over issues like charter schooling and teacher accountability. The day before the 2008 Democratic National Convention in Denver, at an education-related event sponsored by the group Democrats for Education Reform, Democratic leaders such as Rhee and Klein lashed out at teachers' unions in what Newark Mayor Cory Booker that day called "a battle for the heart of the Democratic Party." (5) "The Democratic Party is supposed to look out for poor and minority kids," said Rhee. "That's not the dynamic today." Such talk, coming from popular and influential Democratic politicians, is unprecedented.
A representative of the National Education Association, the nation's largest teachers' union, told the Rocky Mountain News, "I was absolutely stunned at the level of union-bashing." And the reformers' aggressive tactics have continued apace, working the media and speaking forthrightly about their interest, and seeming success, in seeing one of their own at the helm of the U.S. Department of Education in the Obama administration. And yet, the positive results of Bush's education program are only half the tale. …

19 citations


Journal Article
TL;DR: In the European Union, women have 1.5 children on average, a number much too small to sustain a stable population as mentioned in this paper, and this problem will increase as world population moves up to 9.4 billion in 2050.
Abstract: PAGING THROUGH AMERICAN books or media reports on demographic change in Europe, one could conclude that the old continent is on its way to doom: Graying and shrinking, the Europeans sacrifice their occidental culture in favor of hordes of migrants from the south and the east. Walter Laqueur's book, The Last Days of Europe, pictures Europe as too decadent to reproduce--a place where only the uncultured masses, notably Muslim immigrants, have a considerable number of children. In other words, Europe is at demographic war with the rest of the world, and it can only lose. Of course, there is some truth in the basic figures (not necessarily in the conclusions) of these scenarios: Travel between Lisbon and Chisinau, the capital of tiny and poor Moldova, and you'll travel through countries where birthrates are low. In the European Union, women have 1.5 children on average, a number much too small to sustain a stable population. In 1957, when the European Union was founded, each of today's 27 EU members had fertility rates above 2.1 children per woman. What's more, the baby boomers are going to be replaced by a generation that is barely two thirds as large. How can these shrinking post-baby boomers generate the wealth needed to care for their aging parents and at the same time invest so much into innovative industries such that they can compete with the young and fertile populations on other continents? As all European nations age, some, such as Germany and Romania, have already started to shrink. Others will soon follow. Wherever fertility rates are appreciably below 2.1 there is no way, in the medium term, to reverse the trend toward natural population contraction. In 2008, eight out of 27 EU countries reported more deaths than births. Before mid-century, when the baby boomers will die out, a great number of nations will see dwindling population numbers. According to common population forecasts, the EU will lose some 50 million of its current inhabitants by the year 2050 through an excess of deaths over births--that is ten percent of today's inhabitants or roughly the populations of Poland and Greece combined. Most European countries will be able to achieve population growth, or even simply stability, only on the basis of immigration. [FIGURE 1 OMITTED] In fact, the natural losses will most certainly be compensated for by an equal number of immigrants over the next four decades, a figure that cannot be too disturbing for America or Canada, where almost everybody has a migration background. So despite low birthrates, EU-27's population will remain more or less constant until mid-century. From today's 500 million, it is estimated to peak at 520 million around 2035 and decline afterwards. As non-EU countries like Ukraine or Belarus are not likely to gain enough immigrants to compensate for the low fertility, Europe in total will definitely shrink. Europe's demographic share will also decline as all other continents continue to grow: the Americas will grow, and Asia will grow by about 30 percent. Africa's population will double by 2050. As a result, some scholars expect "massive shifts in economic and military potential" and "significant security challenges to Europe over the next two decades." Europe: Demographic trendsetter ALL THIS MAKES Europe the pioneer of a demographic trend that, sooner or later, will reach most corners of the world. World population nearly quadrupled during the past century. And it still grows by about 230,000 people per day.
With a population of 6.8 billion and resource consumption and pollutant emissions far beyond the limits of sustainability, humankind has reached a size at which it poses a danger to itself. This problem will increase as world population moves up to 9.4 billion in 2050. It was always clear that at some time, in some region of the world, the trend of demographic increase would reverse. Europe is the pioneer. And in this pioneering continent Germany is the frontrunner, because low fertility rates (two children replacing every three adults) have been common there for 35 years. …

15 citations


Journal Article
TL;DR: In 1997, I moved to Laos to work for the United Nations Development Programme and found that the country was so corrupt that little of the money went to its intended beneficiaries, and the only conference that might have been of use--a conference on combating governmental corruption--was never held, presumably because the government would not permit it, as discussed by the authors.
Abstract: IN 1997, I moved to Laos to work for the United Nations Development Programme. Laos was desperately impoverished. The country's infrastructure was primitive. A fifth of the nation's children died in infancy. Adult life expectancy barely exceeded 50 years. Less than half the population was literate. The UNDP spent most of its time endeavoring to raise funds from international donors to rectify this situation, and what time it did not spend this way, it spent holding elaborate conferences on the theme of how better to raise funds to rectify this situation. I quickly concluded that all this was a perfect waste of time. Laos was so corrupt that little of the money went to its intended beneficiaries, and the only conference that might have been of use--a conference on combating governmental corruption--was never held, presumably because the government would not permit it. Provincial governors solicited aid for the same purpose from multiple donors. None of the donors were informed about the others. When money came in, it promptly disappeared, and no one knew where it went. I came across many artifacts documenting this: A typical memorandum blandly noted that a donation of $62,000 from the Vatican, for famine relief, had simply vanished. Nothing in this memo suggested that this was in any way remarkable or that something should be done about it; it might as well have been a reminder that everyone was expected to be present at the Thursday afternoon staff meeting. During the time I was there, hearings were held on a proposal to build a massive dam in central Laos. It was to be funded by UN agencies. The project, like so many before it, was expected to wreak catastrophic environmental damage and enrich only a handful of Lao elite. This dissuaded no one from thinking it a good idea. This, too, was typical. The UN had solicited donor aid for the building of roads--who could object to roads?--but no sooner were these roads built than trucks stacked with logs began streaming out of the forest. Living large on the proceeds was an infamously venal Communist Party official known for murdering anyone who asked aloud whether it was a good idea to destroy one of the last intact, contiguous areas of tropical cover in Southeast Asia. Paramilitary logging conglomerates such as the DAFI group under General Bou Phon became warlords in the countryside. The deforestation caused massive flooding, destroying rice crops and leaving the peasants without food. Whenever I suggested to my fellow aid workers that it was pointless to solicit donations before addressing the problem of corruption, they would shrug. "At least it's better than Africa," someone would say, and everyone who had worked in Africa would nod in vigorous assent. I still receive the occasional e-mail from friends in Laos. From what they report, nothing has changed. Laos is, nominally, a communist country, but above all it is a kleptocracy. It is easy and tempting for enemies of communism to say that all command economies are in fact kleptocracies--that this is their inevitable nature--but there is no special reason to believe the problem to be limited to the communist sphere. I live now in Turkey, a nominally capitalist country, and the blight of corruption is equally plain to see. It is not as bad as Laos, of course, but it is bad enough to severely deform the country's economic and political life. Economists here, for example, believe that as much as 94 percent of the construction in Istanbul is illegal. 
"If the construction companies are fined, they just pay the penalties and keep on building," I was told by the Turkish academic Osman Altug, who specializes in the study of Turkey's underground economy. "It's not enough to stop them. The government doesn't really clamp down because they need those companies to support them financially." Istanbul lies on a massive fault zone. This construction is not built to code. It is visibly shoddy in the extreme. …

13 citations


Journal Article
TL;DR: In this article, the authors examine the history of the counterinsurgency campaign in Afghanistan and conclude that the most important metrics are those that gauge progress in the capacity and viability of the government.
Abstract: "Going forward [in Afghanistan], we will not blindly stay the course. Instead, we will set clear metrics to measure progress and hold ourselves accountable." --President Barack Obama, March 27, 2009 HOW TO TELL if a counterinsurgency campaign is being won? Sizing the force correctly for a stabilization mission is a key ingredient--and it has been the subject of much discussion in the modern American debate. But in fact, there is no exact formula for sizing forces. Even if there were, getting the numbers right would hardly ensure success. Troops might not perform optimally if poorly prepared for the mission; the security environment might pose too many daunting challenges for even properly sized and trained forces to contend with; indigenous forces might not be up to the job of gradually accepting primary responsibility for their country's security themselves; and the politics of the country in question might not evolve in a favorable direction due to the actions of internal or external spoilers. So to know if we are being successful, we must also track and study results on the ground. In conventional warfare, identifying the momentum of battle is a fairly straightforward undertaking. Predicting ultimate outcomes is still very difficult, but determining who is "ahead" at a given moment is usually feasible. Movement of the frontlines, attrition rates, industrial production of war materiel, and logistical sustainability of forces in the field provide fairly obvious standards by which to assess trends. But counterinsurgency and stabilization operations--like the ones in Iraq and Afghanistan--are different, and more complex. They also appear to be the future of warfare. How do we measure progress in such situations? This question is crucially important. Only by tracking progress can we know whether a strategy is working. And only by examining a range of indicators can we determine how to adjust a strategy that may require improvement. For example, a counterinsurgency effort in which violence is the central challenge facing a country will presumably demand different policy responses than would a mission in which economic stagnation, or poor quality of life for citizens, or political paralysis in a nation's government, presents the chief dilemma. In many cases all such problems will present themselves, and all must be addressed at some level--but it is unrealistic to think that all can receive equally rigorous and well-resourced responses. Priorities must be set. Metrics can help in determining what they should be. Assessing progress is also important because the perception of progress has an effect on the sustainability of the war effort. The theory of victory for insurgents fighting the United States and its allies is not to defeat their better equipped foe on the battlefield. It is to unequivocally demonstrate their capacity to fight a war of attrition indefinitely and then wait for political support for the mission to collapse on their enemies' home fronts. To counter this strategy, the United States and its allies must be able to demonstrate progress or at least the reasonable expectation of progress throughout the campaign. Given the political importance of measuring progress and the very limited set of agreed-upon benchmarks, the question of metrics has become deeply controversial. How to use metrics in the coming debate over whether the United States and its partners are succeeding in Afghanistan? 
To answer this question, it is important to examine the historical record for Iraq, in part because of the familiarity many readers will have with that case, and in part because the Iraq case illustrates the need for humility in applying metrics to any counterinsurgency. Such an examination yields three overall conclusions relevant to Afghanistan: * Unlike the case of Iraq, where trends in violence were the most important metrics for much of the war, Afghanistan presents a situation where the most important metrics are those that gauge progress in the capacity and viability of the government. …

11 citations


Journal Article
TL;DR: The most significant national-security challenge of the 21st century is the struggle against violent extremism as discussed by the authors, which is the challenge that makes all the threats we face (e.g., nuclear proliferation, chemical and biological weapons) that much more dangerous.
Abstract: THE STRUGGLE AGAINST violent extremism is the most significant national-security challenge of the 21st century. It is the challenge that makes all the threats we face--e.g., nuclear proliferation, chemical and biological weapons--that much more dangerous. The ungoverned spaces, urban slums, and impoverished regions of the Middle East, Asia, and Africa, along with the poorly integrated immigrant communities in Western Europe, are the epicenters of vulnerability around the world that al Qaeda and other terrorist groups actively exploit. There has been a great deal of debate about how we address these vulnerable populations and effectively challenge the threat posed by violent extremists; it is an argument fueled by the larger question of how we "win hearts and minds." This continuing discussion notwithstanding, most can agree that the end goal is to create a world in which the use of terrorist tactics to achieve political or other objectives is no longer acceptable or personally lucrative; in which extremists' efforts to radicalize and recruit new members are no longer successful; and in which the perpetrators of violent, extremist acts are isolated and marginalized by society at large. We have achieved this in America, as our domestic terrorist groups--Ku Klux Klan; Army of God; United Front; Aryan Nation; the Covenant, the Sword, and the Army of the Lord; the Weather Underground; and lone-ranger terrorists like the Unabomber--have little to no following and are rejected by American society. We have a long way to go to achieve this situation on a global level. Scattered and clandestine terrorist networks, groups, and leaders continue to inspire followers. These networks maintain strong bases of support and constituencies that legitimate their mission. In Iraq and Afghanistan, counterterrorism efforts have undoubtedly weakened al Qaeda and related groups, but pockets of instability still pose challenges by serving as frontiers for foreign fighters and nascent terrorist organizations looking to gain notoriety. Both countries are places where the ambitious yet impoverished go to fight in the name of Allah and to brand and market themselves for future extremist opportunities. Despite the work that remains to be done, though, we continue to see positive results in our kinetic operations to break up terrorist networks. Where we face more difficult challenges is in breaking the stream of new recruits that replenish violent and radical movements and severing the links between extremists and their target audience. This is the key to winning the long struggle against violent extremism. Our initial approach after September 11 was limited to a traditional public-diplomacy paradigm and failed to expand to broader elements of American power. This miscalculation was largely a legacy of how public diplomacy was used during the Cold War, when much of America's effort to win "hearts and minds" was directed at the elites within society. We were successful then, in part because of the nature of that time's more-centralized media, but also because of the nature of the debate. In the Cold War, the "foot soldiers" on both sides of the equation were intellectuals, and the main battlegrounds (with some exceptions) were journals and coffee houses. Today the "foot soldiers" are more likely to be young, disaffected males (and, increasingly, some females); they are not the elite of society. This means America's target audience for public diplomacy needs to be disaffected youths and those who influence them. 
This shift in whom we seek to influence presents a challenge that traditional public diplomacy alone cannot surmount. Some of these young people are in places like Hezbollah strongholds, al Qaeda havens, and ungoverned spaces--i.e., places we cannot reach through traditional public diplomacy, democracy programs, or development assistance. Others are in urban slums, poorly integrated immigrant communities, or rural frontiers where we have some access but where local conditions render our efforts relatively ineffective. …

10 citations


Journal Article
TL;DR: Statelessness, as discussed in this paper, has become increasingly feasible and desirable as a means to pursue a broad spectrum of objectives, and is now a source of power for groups rather than a weakness.
Abstract: MOST POLITICAL GROUPS in modern history have wanted to build and control a state. Whether movements of self-determination in the 19th century, of decolonization in the post-World War II decades, or political parties advocating separatism in several Western states in the 1990s (e.g., Italy and Quebec)--all aimed at one thing: to have a separate state that they could call their own. The means they employed to achieve this end ranged from terrorism and guerilla warfare to political pressure and electoral campaigns, but the ultimate goal was the same--creation of a state of their own. It is the ultimate goal no longer, and it is likely to be even less so in the future. Many of today's nonstate groups do not aspire to have a state. In fact, they are considerably more capable of achieving their objectives and maintaining their social cohesion without a state apparatus. The state is a burden for them, while statelessness is not only very feasible but also a source of enormous power. Modern technologies allow these groups to organize themselves, seek financing, and plan and implement actions against their targets--almost always other states--without ever establishing a state of their own. They seek power without the responsibility of governing. The result is the opposite of what we came to know over the past two or three centuries: Instead of groups seeking statehood through a variety of means, they now pursue a range of objectives while actively avoiding statehood. Statelessness is no longer eschewed as a source of weakness but embraced as an asset. (1) This does not mean that state-seeking groups have receded into history, though. The eruptions of violence in Yugoslavia and Chechnya in the 1990s, as well as the continuing tension in Kosovo and the Caucasus (not to mention the activities of FARC in Colombia, Hamas in the Palestinian Territories, and the LTTE in Sri Lanka), are examples of situations in which one group is vying to establish full state sovereignty in opposition to another group or government. These are macabre and violent celebrations of the idea of the nation-state. But these groups are no longer the only sources of security threats; nor, perhaps, are they the main ones. In fact, statelessness has become increasingly feasible and desirable in order to pursue a broad spectrum of objectives. And it is now a source of power for groups and, consequently, presents serious security challenges to existing countries. Furthermore, the rise of stateless groups may have an impact on the nature of the state itself. The modern nation-state arose in large measure because it was the most efficient way to provide defense, and other nation-states evolved to defend against those that had already been created. But if the major threat to today's countries comes not from their neighbors but from internal groups, countries will need to adapt to defend against internal subversion, costly strikes against infrastructure, destabilization of urban areas, and similar low-intensity and diffuse attacks perpetrated by small, decentralized, and mobile groups. The response to the threat of stateless groups may be a trend toward state decentralization. In fact, the most effective way of defending oneself against unpredictable attacks deep inside one's own territory is a devolution of security tasks to local communities. This may lead to a weakening of the monopoly on violence that has characterized the modern state.
Paradoxically, then, the response to stateless groups may be the rise of more stateless--or sub-state--groups. Why a state, why statelessness THE PAST THREE centuries, but particularly the last 150 years or so, have taught us that to be a stateless group was to be weak. In some cases, such as that of the Jewish population in Europe, statelessness signified discrimination and, under the Nazi regime, extermination. The power of a state easily trumped the power of small groups that, unless they wanted to face forceful submission or even death, had either to acquiesce or migrate. …

10 citations


Journal Article
TL;DR: On a dark November night in 1978, 18 Chinese peasants from Xiaogang village in Anhui province secretly divided communal land to be farmed by individual families, who would keep what was left over after meeting state quotas.
Abstract: ON A DARK November night in 1978, 18 Chinese peasants from Xiaogang village in Anhui province secretly divided communal land to be farmed by individual families, who would keep what was left over after meeting state quotas. Such a division was illegal and highly dangerous, but the peasants felt the risks were worth it. The timing is significant for our story. The peasants took action one month before the "reform" congress of the party was announced. Thus, without fanfare, began economic reform, as spontaneous land division spread to other villages. One farmer said, "When one family's chicken catches the pest, the whole village catches it. When one village has it, the whole county will be infected." Ten years later, in August of 1988, Mikhail Gorbachev lifted his nation's 50-year-old prohibition against private farming, offering 50-year leases to farm families who would subsequently work off of contracts with the state. Few accepted the offer; Russian farmers were too accustomed to the dreary but steady life on the state or collective farm. Thus began reform of agriculture in Soviet Russia. The results in each country could not have been more different. Chronically depressed Chinese agriculture began to blossom, not only for grain but for all crops. As farmers brought their crops to the city by bicycle or bus, long food lines began to dwindle and then disappear. The state grocery monopoly ended in less than one year. Soviet Russian agriculture continued to stagnate despite massive state subsidies. Citizens of a superpower again had to bear the indignity of sugar rations. These two examples point to the proper narrative of reform in Gorbachev's Russia and Deng Xiaoping's China. Our narrative contradicts much received doctrine. The standard account is that China succeeded because a wise party leadership deliberately chose gradualism, retained the monopoly of the Communist Party after rebuffing democracy at Tiananmen Square, and carefully guided the process over the years. The narrative says that Russia failed because the tempestuous Gorbachev ignored the Chinese reform model, moved too quickly, and allowed the party monopoly to fall apart. This standard account is incorrect. Deng Xiaoping and his supporters, contrary to popular legend, did not agree on a reform program at the Third Plenum of the Eighth Party Congress in 1978, which installed him in power. A Chinese reform official by the name of Bao Tong later admitted as much: "In fact, reform wasn't discussed. Reform wasn't listed on the agenda, nor was it mentioned in the work reports." (1) Throughout the reform process, the Chinese Communist Party simply reacted to (and wisely did not oppose) bottom-up reform initiatives that emanated largely from the rural population. Deng Xiaoping's famous description of Chinese reform as "fording the river by feeling for the stones" is not incorrect, but it was the Chinese people who placed the stones under his feet. Mikhail Gorbachev became general secretary of his party in March of 1985. By that time, he knew that the Chinese reforms were successful. His reforms, contrary to the popular narrative, closely mimicked China's. He proposed to lease land to peasants, establish free trade zones, promote small cooperative businesses, and set up joint ventures. The difference was that Gorbachev imposed these changes from above, on an urban economy in which virtually all citizens worked for the state. Gorbachev's reforms either were ignored or they were enacted with perverse consequences. 
Bottom-up reforms worked in China; top-down reforms failed in Russia. Both countries began serious reform after the passing of a leader (or leadership) that abhorred reform. Deng Xiaoping and his allies succeeded Mao in 1978 after a brief power struggle with hardliners. Gorbachev succeeded the initial beneficiaries of Stalin's purges of the 1930s, who rose quickly as young men to replace those who were executed. …

7 citations


Journal Article
TL;DR: The Amethyst Initiative as mentioned in this paper advocates lowering the drinking age to 18 and licensing alcohol use for young people in much the same manner as driving, and encourages states at the least to adopt exceptions to the 21 laws that would allow minors to drink at home and in private clubs.
Abstract: THE PROBLEM of underage drinking on college campuses has been brewing for many years, to the continued vexation of higher education administrators. In 2008, John McCardell, president emeritus of Middlebury College, began to circulate for signature a public statement among colleagues titled "The Amethyst Initiative," (1) which calls for elected officials to reexamine underage drinking laws. The project grew out of outreach efforts of a nonprofit organization he founded in 2007 called Choose Responsibility. The nonprofit advocates lowering the drinking age to 18 and licensing alcohol use for young people in much the same manner as driving--following coursework and an exam. Choose Responsibility also favors the repeal of the laws that set 21 as the mandatory minimum age for drinking (known as the "21 laws") and encourages states at the least to adopt exceptions to the 21 laws that would allow minors to drink at home and in private clubs. It also favors social changes that shift the focus on alcohol use among youth to the home, family, and individual. The Amethyst Initiative's statement has been signed by 135 college presidents and chancellors at schools from Duke to Bennington. The majority are private; most are in the Northeast. The statement takes no formal position, unlike Choose Responsibility. It does, however, drop heavy hints as to where the debate ought to come out. The statement says "21 is not working" and asks "How many times must we relearn the lessons of Prohibition?" It draws comparisons to other age-of-majority rights conferred on 18-year-olds, such as voting and serving in the military, and calls upon elected officials to consider "whether current public policies are in line with current realities." It seems that the presidents of 135 colleges, including elite schools, large universities, and small state schools, find themselves so exasperated with the amount of alcohol guzzled by undergraduates--or more to the point, the trouble the undergraduates get into while inebriated--that they now beseech lawmakers to "rethink 21," an elegant and rather roundabout way of saying: Let undergrads drink with the sanction of the law. The primary argument made in the Initiative's statement in favor of repealing the 21 laws is that the 21 laws make alcohol taboo, thus driving underage drinking underground and causing more binge drinking to take place than otherwise would, due to the allure of forbidden fruit and the need for secrecy. Hence, by lowering the drinking age, youth consumption would come out in the open and binge drinking would be largely reduced or even eliminated. The second salutary effect of lowering the drinking age, the Initiative argues, would be educational: Colleges would be allowed to have open, frank discussions about responsible drinking. In other words, institutions of higher education could teach young people how to drink responsibly. The Initiative makes vague references to the "unintended consequences" of 21 "posing increasing risks to young people," and says that the original impetus for the 21 laws--reduction of highway fatalities by young drivers--has outlived its usefulness. Since its launch, the Initiative has created a public dialog about the drinking age, resulting in media coverage and a hearing before the New Jersey state legislature in November 2008. Despite its gravity as a public health problem, even among children younger than 18, the topic of underage alcohol abuse has been underaddressed in the popular media and in public funding compared to illicit drug abuse.
The Initiative is a welcome development insofar as it challenges us to examine whether 21 "is working." The answer: It is not, as currently enforced. So should 21 be scrapped or salvaged? First, a look at how we got here, and why the 21 laws are broken. The 21 laws AMERICANS GENERALLY HAVE not allowed young people to drink. Older teens were allowed to drink legally during part of the 1970s and early 1980s--a blip on the American-history radar screen. …

6 citations


Journal Article
TL;DR: One evening in the spring of 2007, Barack Obama, drooping off the Senate floor, paused for an interview with New York Times columnist David Brooks. "His voice," Brooks later recounted, "was measured and fatigued, and he was taking those little pauses candidates take when they're afraid of saying something that might hurt them later on." In the midst of the interview, Brooks spontaneously severed his line of questioning about development aid and asked Obama, "Have you ever read Reinhold Niebuhr?" Obama perked up: "I love him," he said.
Abstract: ONE EVENING IN the spring of 2007, Barack Obama, drooping off the Senate floor, paused for an interview with New York Times columnist David Brooks. "His voice," Brooks later recounted, "was measured and fatigued, and he was taking those little pauses candidates take when they're afraid of saying something that might hurt them later on." In the midst of the interview, Brooks spontaneously severed his line of questioning about development aid and asked Obama, "Have you ever read Reinhold Niebuhr?" Obama perked up: "I love him," he said. "He's one of my favorite philosophers." Divers writers have dissected and probed this assertion, seeking to explicate how Obama's affinity for Niebuhr will or will not manifest itself in the new president's governance. Too often, though, they have assumed that Obama's statement provides extensive insight into his mental life. It does not. For detecting exactly what it is that Obama loves about Niebuhr is tricky, not only because Obama's positions are inclined to blurriness but because Niebuhr's thought is extensive and itself frequently less than forthright. One needs more than a scalpel and probe to figure out the connection between Obama and Niebuhr; indeed, one may need more tools just to get at the crux of Niebuhr alone. The enigma IN SEPTEMBER 2005, the Daily Kos blog ran a lengthy letter from Obama in which he defended those Democratic senators who, unlike himself, had voted to confirm John Roberts as the Supreme Court's chief justice and were taking heat from some liberals because of it. "To the degree that we brook no dissent within the Democratic Party, and demand fealty to the one, 'true' progressive vision for the country," Obama wrote, "we risk the very thoughtfulness and openness to new ideas that are required to move this country forward." It is the language of a centrist. And yet, Obama specifically rejected that label by urging Democrats not to tack centrist but steer their liberal agenda toward more audacious and innovative waters. "Too often," he wrote, "the 'centrist' label seems to mean compromise for compromise sake, whereas on issues like health care, energy, education, and tackling poverty, I don't think Democrats have been bold enough." And while happy to chide intolerant liberals, Obama refused to analogize left-wing activist groups to their rightwing counterparts because "Fighting on behalf of the poor and the vulnerable is not the same as fighting for homophobia and Halliburton." Such is the style of America's new president. Their tendency toward lapidary construction notwithstanding, his sentences can seem contradictory. In the Daily Kos letter, Obama defended some centrist Democrats and rebuked others; he applauded certain forms of what is called centrism and disavowed others. He compared left-wing litmus tests to right-wing ones but then found his own comparison inapt because the concerns of conservative groups (e.g., "fighting for homophobia") are fearful while those of liberal groups are hopeful. He applauded the idea of "disagreeing without being disagreeable," but certain Republican ideas, those he doesn't like, he called "dumb." That Barack Obama is an "enigma" has been aplenty noted: Last year, the Washington Post editorial page published "The Obama Enigma," David Broder penned "Obama's Enigma," and Victor Davis Hanson wrote of Obama, "After more than a year of campaigning, he still remains an enigma." The then-presidential-candidate could not be pinned down.
Larissa MacFarquhar, writing in the New Yorker in 2007, found that "in his view of history, in his respect for tradition, in his skepticism that the world can be changed any way but very, very slowly, Obama is deeply conservative. There are moments when he sounds almost Burkean. He distrusts abstractions, generalizations, extrapolations, projections." Fine, but was this not the man--purportedly so skeptical of change and abstractions--who built a winning presidential campaign on abstractions about change? …

6 citations


Journal Article
TL;DR: Pornography is a substance relatively new in the public square, but by now so ubiquitous in our society that a great many people find its presence unremarkable, as discussed by the authors.
Abstract: IMAGINE A SUBSTANCE that is relatively new in the public square, but by now so ubiquitous in your society that a great many people find its presence unremarkable. Day in and day out, your own encounters with this substance, whether direct or indirect, are legion. Your exposure is so constant that it rarely even occurs to you to wonder what life might be like without it. In fact, so common is this substance that you take the status quo for granted, though you're aware that certain people disagree. A noisy minority of Americans firmly opposes its consumption, and these neo-Puritans try routinely to alert the public to what they claim to be its dangers and risks. Despite this occasional resistance, however, you--like many other people of your time--continue to regard this substance with relative equanimity. You may or may not consume the thing yourself, but even if you don't, you can't much see the point of interfering with anyone else's doing it. Why bother? After all, that particular genie's out of the bottle. The scenario sketched in these paragraphs captures two very different moments in recent American history. One is the early 1960s, exactly the moment when tobacco is ubiquitous, roundly defended by interested parties, and widely accepted as an inevitable social fact--and is about to be propelled over the cliff of respectability and down the other side by the surgeon general's famous 1964 "Report on Smoking and Health." The resulting social turnaround, though taking decades and unfolding still, has nevertheless been nothing short of remarkable. In 1950, almost half the adult American population smoked; by 2004, just over a fifth did. Though still in common use and still legally available, cigarettes somehow went from being widely consumed and accepted throughout the Western world to nearly universally discouraged and stigmatized--all in the course of a few decades. The other moment in time captured by the opening description is our own, except that the substance under discussion this time around is not tobacco, but pornography--especially internet pornography, which today is just about as ubiquitous, as roundly defended by interested parties, and as widely accepted as an inevitable social fact as smoking was 50-odd years ago. The ubiquity is plain. Pornography is the single most searched-for item on the internet and also the most profitable. It is referred to knowingly, whether explicitly or with a wink and a nod, in more public venues than one can possibly enumerate--including on phones and in video games and popular music, in comic books and on skateboards, among other areas of juvenile culture. Even the more "serious" quarters of the internet, those devoted to news and politics and general-interest blogs, are riddled with knowing references to pornography. As the protagonist of the recently chic movie Zack and Miri Make a Porno comments, "It's all mainstream now." Today's prevailing social consensus about pornography is practically identical to the social consensus about tobacco in 1963: i.e., it is characterized by widespread tolerance, tinged with resignation about the notion that things could ever be otherwise. After all, many people reason, pornography's not going to go away any time soon. Serious people, including experts, either endorse its use or deny its harms or both. Also, it is widely seen as cool, especially among younger people, and this coveted social status further reduces the already low incentive for making a public issue of it. 
In addition, many people also say that consumers have a "right" to pornography--possibly even a constitutional right. No wonder so many are laissez-faire about this substance. Given the social and political circumstances arrayed in its favor, what would be the point of objecting? Such is the apparent consensus of the times, and apart from a minority of opponents it appears very nearly bulletproof--every bit as bulletproof, in fact, as the prevailing laissez-faire public view of smoking did in 1963. …

5 citations


Journal Article
TL;DR: In this paper, the authors trace the evolution of the Pakistani military and argue that the military has been able to dominate the political life of the country for much of its history, which is the case in India and Pakistan.
Abstract: PRACTICALLY FROM THE moment of its creation in 1947, Pakistan has been plagued by ethnic tensions, mismanagement, and corruption. The profound incompetence of its civilian rulers in the first decade of independence created a political vacuum filled by the generals, who have ruled or dominated Pakistan, directly or indirectly, for much of its history. The country's dismal political record and lackluster socioeconomic development are all the more remarkable when contrasted with the relative success of its gigantic neighbor to the east. To be sure, India has also encountered ethno-religious conflicts, widespread poverty, and many other challenges, but it has remained a functioning democracy with an increasingly promising economic future. Why has Pakistan failed where India succeeded? Why has it become an authoritarian state? Why have its armed forces been able to dominate its political life? To place Pakistan's predicament in the proper perspective, we should consider the roots of its sovereign statehood--the colonial past, the circumstances of its founding, and the early years of its independence--and trace the evolution of its principal political player, the military. Three points are key. First, the political legacies of British colonialism impacted India and Pakistan differently. Second, the circumstances of British India's 1947 partition and the events of the immediate post-partition period suggest several reasons for the different political trajectories of its two successor states. Finally, within the first decade of its independence, the authoritarian mold of Pakistan's political system was cast, and since then we have witnessed different permutations of that early prototype. Thus, the Pakistani experience supports the argument that the fate of political transitions is frequently determined in the first few years after the fall of the ancien regime. The British legacy FEW IMPERIAL POWERS succeeded in leaving behind such a durable imprint on their subject peoples as the British did in South Asia. And after more than six decades of independence, no other Pakistani or Indian institution retains as much of its British origins as the armed forces, owing to their members' extensive training by and exposure to their British counterparts during the colonial era, the continued education of their elites at British institutions and, no doubt, to the military's relative separation from the rest of society. Four distinct legacies of the British India Army (BIA) are particularly relevant here. Three of these--professionalism, ethnocentric recruitment, and the army's aid to civil authorities--have had a similar effect on the armies of both successor states. The fourth--the British insistence on clear separation between the political and military domains--had a strong impact on India but eluded Pakistan. Let us take a closer look at all of them. Professionalism. The British provided rigorous and modern military training and an attractive career option to qualified native Indians. The cream of the crop received officer education at Sandhurst in England, but training was ongoing in the garrisons and bases throughout British India. Until 1939 the officer corps was relatively small and tight-knit, but the need for a much larger force in World War II required its quick expansion, which resulted in the changing ratio of British to Indian officers from 10:1 to 4.1:1.
(1) Most importantly, the British instilled a military ethos that put high value on professional competence, and the officer corps of both independent Pakistan and India has kept these traditions alive. Ethnic preference in recruitment. A more controversial legacy is the discriminatory view of the warlike qualities of various ethnic groups. One of the pillars of the BIA'S success was its careful staffing policy. Recruiters generally avoided enlisting Bengalis and drew from regions in the west, especially the Punjab, which had mostly remained loyal to the British at the time of the anti-colonial Mutiny of 1857. …

Journal Article
TL;DR: This article argued that the desires for sex and food are joined at the root, and pointed out that, if pursued without regard to consequence, both can prove ruinous not only to oneself, but also to other people, and even to society itself.
Abstract: OF ALL THE TRULY seismic shifts transforming daily life today--deeper than our financial fissures, wider even than our most obvious political and cultural divides--one of the most important is also among the least remarked. That is the chasm in attitude that separates almost all of us living in the West today from almost all of our ancestors, over two things without which human beings cannot exist: food and sex. The question before us today is not whether the two appetites are closely connected. About that much, philosophers and other commentators have been agreed for a very long time. As far back as Aristotle, observers have made the same point reiterated in 1749 in Henry Fielding's famous scene in Tom Jones: The desires for sex and for food are joined at the root. The fact that Fielding's scene would go on to inspire an equally iconic movie segment over 200 years later, in the Tom Jones film from 1963, just clinches the point. Philosophers and artists aside, ordinary language itself verifies how similarly the two appetites are experienced, with many of the same words crossing over to describe what is desirable and undesirable in each case. In fact, we sometimes have trouble even talking about food without metaphorically invoking sex, and vice versa. In a hundred entangled ways, judging by either language or literature, the human mind juggles sex and food almost interchangeably at times. And why not? Both desires can make people do things they otherwise would not; and both are experienced at different times by most men and women as the most powerful of all human drives. One more critical link between the appetites for sex and food is this: Both, if pursued without regard to consequence, can prove ruinous not only to oneself, but also to other people, and even to society itself. No doubt for that reason, both appetites have historically been subject in all civilizations to rules both formal and informal. Thus the potentially destructive forces of sex--disease, disorder, sexual aggression, sexual jealousy, and what used to be called "home-wrecking"--have been ameliorated in every recorded society by legal, social, and religious conventions, primarily stigma and punishment. Similarly, all societies have developed rules and rituals governing food in part to avoid the destructiveness of free-for-alls over scarce necessities. And while food rules may not always have been as stringent as sex rules, they have nevertheless been stringent as needed. Such is the meaning, for example, of being hanged for stealing a loaf of bread in the marketplace, or keel-hauled for plundering rations on a ship. These disciplines imposed historically on access to food and sex now raise a question that has not come up before, probably because it was not even possible to imagine it until the lifetimes of the people reading this: What happens when, for the first time in history--at least in theory, and at least in the advanced nations--adult human beings are more or less free to have all the sex and food they want? This question opens the door to a real paradox. For given how closely connected the two appetites appear to be, it would be natural to expect that people would do the same kinds of things with both appetites--that they would pursue both with equal ardor when finally allowed to do so, for example, or with equal abandon for consequence; or conversely, with similar degrees of discipline in the consumption of each. 
In fact, though, evidence from the advanced West suggests that nearly the opposite seems to be true. The answer appears to be that when many people are faced with these possibilities for the very first time, they end up doing very different things--things we might signal by shorthand as mindful eating, and mindless sex. This essay is both an exploration of that curious dynamic, and a speculation about what is driving it. As much as you want The dramatic expansion in access to food on the one hand and to sex on the other are complicated stories; but in each case, technology has written most of it. …

Journal Article
TL;DR: This paper argued that Islamic violent extremism is attractive only for a fringe group, the low-volume end of a popular support curve, and that the support base for global terrorism fueled by ideology and uprooted activists remains at the tail, not at the head.
Abstract: THE AFGHAN-PAKISTAN BORDER region is widely identified as a haven for jihadi extremists. But the joint between local insurgencies and global terrorism has been dislocated. A combination of new technologies and new ideologies has changed the role of popular support: In local insurgencies the population may still be the "terrain" on which resistance is thriving--and counterinsurgency, by creating security for the people, may still succeed locally. But Islamic violent extremism in its global and ambitious form is attractive only for groups at the outer edge, the flat end of a popular support curve. Jihad failed to muster mass support, but it is stable at the margin of society. Neither the West nor its enemies can win--or lose--a war on terror. Western anti-terror policy rests on the assumption that the threat of violent extremism has to be treated at the root--in Afghanistan. A stable Afghan-Pakistan border region, the theory goes, would stop exporting terrorism to the rest of the world. "Our strategy is this: We will fight them over there so we do not have to face them in the United States of America," in former President George W. Bush's words. President Barack Obama, it seems, has taken over a strategic mainstay from his predecessor: "I want the American people to understand that we have a clear and focused goal: To disrupt, dismantle, and defeat al Qaeda in Pakistan and Afghanistan," the president said on March 27, upon announcing his new strategy. This approach is indeed shared by many allies in Europe; confronting the jihadi menace in distant theaters, they agree, will make the homeland safer. In the argument's strong version, comprehensive counterinsurgency is the right means to that goal; a weaker version is limited to going after the hardened jihadis. This approach to countering terrorism, in both versions, is predicated on a more general assumption: that global terrorism and local insurgency are flipsides of the same coin. Insurgencies thrive on chaos and failed governance, environments from which they can, through terror, extract popular support and ultimately political success. The local population, therefore, is the critical "human terrain" of modern wars. Creating a safe environment for the population in one area, and making that area fit for basic governance and commerce, would win over popular support and choke the insurgency. Spreading these pacified areas like "oil slicks" would thus eliminate terrorist "safe havens." Terrorism and insurgency, in short, are seen as tightly linked. Modern land forces, in General Rupert Smith's commonly used phrase, win "among the people." But how valid is this old assumption in the ninth year of Operation Enduring Freedom? The linkage between terrorism and insurgency has been altered in the early 21st century. Instead of seeing high-volume popular support in an insurgency as the "soil" on which resistance and terrorism are flourishing--and counterinsurgency as a competition for that support--an additional paradigm is needed: the "tail." Islamic violent extremism is attractive only for a fringe group, the low-volume end of a popular support curve. Individuals and groups at the extreme margin hold views and embrace methods that antagonize mainstream society, in Lashkar Gah as well as in London. 
Yet, because of increased efficiencies in the distribution, manufacturing, and marketing of Salafist ideas and tactics, the demand and support for--and thus the feasibility of--violent extremism has become more independent of mainstream popular backing. The terrain analogy may retain limited relevance to grasp the role of the population in local insurgencies with a political and territorial agenda, and these local insurgencies might continue to be of interest to some militants with a global agenda. But the support base for global terrorism fueled by ideology and uprooted activists remains at the tail, not at the head--it is therefore immune to many of the methods that are employed to combat insurgency. …

Journal Article
TL;DR: This paper pointed out that communities have a profound effect on what seem like individual choices, from voting to purchasing to eating and beyond, and suggested that tribal leaders in places like Afghanistan nonetheless retain some sense of national citizenship.
Abstract: THE TIRED DEBATE between those who believe in nation-building and those who scoff at it glosses over a major difference between top-down and bottom-up society-building. The starting point for a bottom-up approach is the communitarian recognition that societies--even modern, so-called "mass" societies--are not composed of just millions upon millions of individual citizens. Instead, most societies are communities of communities. Most people come in social packages. They are greatly influenced by the communities of which they are members and by their natural leaders. These communities are not necessarily residential--the traditional village--but may be ethnic, religious, or based on national origin. This is not to suggest that individuals do not have degrees of freedom or that their behavior is determined in full by their communities. It merely points out that communities have a profound effect on what seem like individual choices, from voting to purchasing to eating and beyond. Moreover, American national society was formed to a significant extent only after the Civil War. Before that, most Americans' prime loyalty was to the colony, state, or region in which they lived. When asked overseas, "Where are you from?" Americans used to answer, "I am Virginian" or "I am Bostonian." Only after the 1870s did more and more Americans respond, "I am American." Only during the Reconstruction period did the Supreme Court stop referring to the United States as a plurality ("The United States are") and start referring to the nation as a singular entity ("The United States is"). It took a very bloody war and a generation of society-building afterwards to make the South and the North into one political community--a process that is still ongoing. Many other countries we now know as nations were similarly cobbled together, including the United Kingdom, Germany, Italy, and Switzerland. I point to these familiar pieces of sociological history because we tend to ignore them when we deal with countries that have not yet made much progress along these society-building, communitarian lines--whether or not they are called nations or have flags, seats at the United Nations, and diplomatic representatives in the capitals of the world. For instance: Iraq, Pakistan, and above all, Afghanistan. Think tribes, not a nation GIVEN THE POWER and import of communities (often referred to as "tribes"), the issue here is not whether we can or should avoid engaging in nation building, but how we proceed. Do we make our starting point the notion that there is a central national government, whose troops and police we can train as a national force and whose administration of justice and social services we can improve? Or, do we realize that such a center-to-periphery approach is unworkable, and that we need to build from the periphery to the center? This does not mean that we should go find individual citizens to "empower" and work with them. Instead, we should look at places like Afghanistan as lands in which several tribes lie next to each other. (I use the term "tribes" loosely, referring to ethnic and confessional communities whose members have tribe-like ties to one another, ties they do not have to members of other communities.) In other words, there are many societies in which nation building cannot start from the center--and those who insist otherwise pay a heavy price. 
Howard Hart, a member of the CIA clandestine service for 25 years and former CIA Chief of Station in Islamabad, observes that "Afghan" is purely a geographic distinction and that "there is not, and never has been, anything remotely approaching a shared national identity." He finds that tribal loyalties in the region are paramount, and "warlords" (or tribal chiefs) are loath to subordinate themselves to a higher authority, especially one fostered by foreign powers. I would temper this argument a bit and suggest that these tribal leaders have some sense of national citizenship. …

Journal Article
TL;DR: The faith-based initiative became a kind of stand-in for the failed "faith-based" presidency of George W. Bush, as discussed by the authors, and was used as a symbol of the failure of the entire Bush administration.
Abstract: BY THE TIME he left office, President Bush's faith-based initiative had become a kind of stand-in for his entire presidency. Whenever something went wrong on Bush's watch it was tarred as yet another "faith-based" policy. As the 2008 presidential election began to take shape, with the Democrats newly in charge of both the House and the Senate and the most aggressively religious president of the modern era plummeting toward record-low poll numbers, there was hope among many that the faith-based initiative would be swept away with the rest of Bush's failed "faith-based presidency." Instead, on the campaign trail in 2008, Barack Obama defied liberal expectations and pledged to revive the faith-based initiative. Surprised and disappointed civil-libertarian and secularist groups were somewhat consoled by Obama's further pledge to repeal the most controversial element of Bush's policy, so-called religious "hiring rights." On his watch, Obama initially said, receipt of federal funding would disqualify faith-based grantees from exercising religion-based discretion over hiring and employment policies, something they enjoy under civil rights law as a matter of religious liberty. But when President Obama unveiled his version of the faith-based initiative shortly after his inauguration in 2009, he did not repeal Bush's policy on religious hiring rights, as many assumed he would do by executive order. Instead, he took the issue off the political table, reclassifying it as a technical matter to be handled by White House legal counsel and Department of Justice lawyers. Responding to Obama's inaction, the Director of the ACLU Washington Legislative Office oddly accused him of "heading into uncharted and dangerous waters," when, in fact, he was simply maintaining existing White House policy, itself derived from provisions in social welfare law introduced during the Clinton years. President Obama has made his differences with Bush quite clear in terms of policy and spending in poor communities: We are not doing enough to create sufficient opportunity for all. Institutionally, however, he did not aggressively restore the pre-Bush status quo as many had hoped--by shutting down the White House faith-based offices Bush established and by repealing Bush's administrative actions on religious hiring rights and other controversial matters. Perhaps Obama is simply cautious enough in his own thinking not to assume the worst about everything Bush did, or perhaps he is making raw political calculations about building religious support for his presidency. To properly assess Obama's motivations and decisions in this area, however, it's important to have a deeper understanding of how the faith-based initiative came to exist, and why. Redrawing boundaries WHEN HE LAUNCHED his signature domestic policy nine days after his inauguration in 2001, Bush's understanding of the evolving legal and political context for government contracting with religious social-service providers was highly developed: The issue of church-state barriers in social aid programs first captured his attention as governor of Texas six years earlier, and he had tried to lower certain barriers there with help from national experts such as Marvin Olasky and Stanley Carlson-Thies. 
Although ultimately he had little success in expanding public support for faith-based providers in Texas, by the time Bush entered the White House in 2001, federal law had dramatically altered the church-state landscape in social services by establishing "charitable choice" provisions in several major social-welfare statutes, beginning with the welfare reform bill of 1996. Charitable choice essentially gives faith-based providers a statutory right of eligibility in government contracting for social services, coupled with strong protections for maintaining their religious autonomy within federally-funded programs. As of 2001, charitable choice rules applied to programs disseminating nearly $22 billion in federal funds annually. …

Journal Article
TL;DR: In fact, some of the most important financial instruments implicated in the financial crisis of 2008, including mortgage-backed securities and credit default swaps, owed their existence to regulatory anomalies.
Abstract: IN A COMPELLING fictional narrative, there are villains, victims, and heroes. One can give a compelling account of the financial crisis of 2008 that contains such characters, but it would be fictional: A true villain has to be aware of his villainous deeds. Instead, the primary candidates for the role of villain in the 2008 emergency--the executives of banks, Wall Street firms, and insurance companies--made out too poorly in the end to suggest willfulness. If these companies had done nothing but deliberately foist risks on others, they themselves would have survived. The fact that Bear Stearns, Lehman Brothers, and other companies took such large losses is indicative of self-deception. Executives had too much confidence in their risk management strategies. Regulators, too, had excessive confidence in the measures that they had in place to ensure safety and soundness of banks and other regulated institutions. The crisis was both a market failure and a government failure. In fact, some of the most important financial instruments implicated in the crisis, including mortgage-backed securities and credit default swaps, owed their existence to regulatory anomalies. In the way that they specified capital requirements, regulators gave their implicit blessing to risky mortgages laundered through securitization and to treating a broad portfolio of risky assets as if it were a safe asset. Simply put, there was a widespread gap between what people thought they knew to be true and what was actually true. Housing industrial policy HOUSING AND MORTGAGE debt are heavily influenced by public policy. It might even be fair to say that housing is to the United States what manufacturing exports were to Japan in the decades following World War II: a sector viewed by government as critical for the health of the economy. Like manufacturing exports in Japan, housing in the United States has been the focus of industrial policy, in which government and private firms worked together to try to maintain continuous expansion. Increased home ownership and cheap and accessible mortgage finance were major policy goals, regardless of which political party held Congress or the presidency. This housing industrial policy can be traced back quite far, but we shall begin in 1968, when mortgage securitization made its debut. In that year, Lyndon Johnson was an unpopular president fighting an unpopular war in Vietnam. Under the circumstances, having to ask Congress to increase the limit on the national debt always caused friction and embarrassment for the administration. At the time, the national debt included the funds raised by government housing agencies, and in 1968, the government found two ways to get this debt off its books. The Federal National Mortgage Association, which had been created in 1938 to fill the void left by bank failures, functioned by purchasing home loans from independent originators known as mortgage bankers. Fannie Mae, as it was later called, acted like a giant national bank, financing mortgages from all over the country. At that time, it did not issue any mortgage securities. Instead, it funded its holdings by issuing bonds, as an agency of the federal government. To get Fannie Mae debt off its books, the government privatized it by selling shares to investors. The government may have retained an implicit promise not to allow Fannie Mae to fail, but this implicit promise appeared nowhere on the government's balance sheet.
Selling Fannie Mae, however, still left the government issuing debt to finance mortgages under loan programs of the Federal Housing Administration (FHA) and the Veterans Administration (VA). To take these mortgage loans off the books, Johnson created the Government National Mortgage Association (GNMA), which pooled loans insured by FHA-VA into securities and sold them to investors. Thus, the government no longer had to issue its own bonds to finance these mortgages, but it continued to guarantee that FHA-VA mortgages would not default. …

Journal Article
TL;DR: In fact, unless these negative trends are restrained and reversed, nuclear weapons reductions in the U.S. and even Russia may not be enough to reduce continuing nuclear rivalries and could actually intensify them as discussed by the authors.
Abstract: IF CURRENT TRENDS continue, in a decade or less, the United Kingdom could find its nuclear forces eclipsed not only by those of Pakistan, but of Israel and India as well. Shortly thereafter, France could share the same fate. China, which has already amassed enough separated plutonium and highly enriched uranium to easily triple its current stockpile of roughly 300 deployed nuclear warheads, also is likely to increase its deployed numbers, quietly, during the coming years. (1) Meanwhile, over 25 states have announced their desire to build a large nuclear reactor--a key aspect of most previous nuclear weapons programs--before 2030. None of these trends should be welcome to those who favor the abolition of nuclear weapons. Indeed, unless these negative trends are restrained and reversed, nuclear weapons reductions in the U.S. and even Russia may not be enough to reduce continuing nuclear rivalries and could actually intensify them. To understand why, one need only review what is currently being proposed to reduce these nuclear threats. The road to zero A DECADE AGO, AN analysis of the challenges of transitioning to a world without nuclear weapons would be dismissed as purely academic. No longer. Making total disarmament the touchstone of U.S. nuclear policy is now actively promoted by George Shultz, William Perry, Henry Kissinger, and Sam Nunn--four of the most respected American names in security policy. (2) Most of their proposals for reducing nuclear threats, moreover, received the backing of both presidential candidates in 2008 and, now, with President Obama's arms control pronouncements in April in Prague, they have become U.S. policy. These recommendations include getting the U.S. and Russia to make significant nuclear weapons reductions; providing developing states with "reliable supplies of nuclear fuel, reserves of enriched uranium, infrastructure assistance, financing, and spent fuel management" for peaceful nuclear power; and ratifying a verified Fissile Material Cut-off Treaty (FMCT) and a Comprehensive Test Ban Treaty (CTBT). This newfound enthusiasm for nuclear weapons reductions has been heralded as a clear break from the past. Politically, this may be so. Technically, however, the U.S. and Russian military establishments have steadily reduced the numbers of operational, tactical, and strategic nuclear weapons since the late 1960s sevenfold (i.e., from 77,000 warheads to less than 11,000). By 2012, this total is expected to decline by yet another 50 percent. What has driven these reductions? Mostly, advances in military science. Since the Cold War, progress in computational science, digital mapping, and sensor and guidance technologies has significantly enhanced the precision with which weapons can be aimed. Rather than 50 percent of warheads hitting within 1,000 meters of their intended targets--the average accuracy of the 1960s-design scud missiles--it now is possible to strike within a few feet (the average accuracy of a Predator-launched missile or a long-range cruise missile). Thus, the U.S. and Russian militaries no longer need to target more or larger-yield nuclear weapons to assure the destruction of fixed military targets. They can threaten them with a single, small-yield nuclear weapon or even conventional warheads. Hence, the massive reduction in U.S. and Russian deployed tactical and strategic nuclear weapons and in the average yields of these weapons (see Figure 1). (3)
[FIGURE 1 OMITTED] When policymakers call for more nuclear weapons reductions and increased nuclear restraint, then, they are hardly pushing against historical or technological trends. Unfortunately, this desired harmony with history and science is far less evident when it comes to the specific proposals being made to reduce future nuclear threats. Here, it is unclear if the proposals will reduce or increase the nuclear threats we face. Consider the suggestion made in the 2008 Nunn-Shultz-Perry-Kissinger Wall Street Journal op-ed (a follow-up piece to one they had written a year earlier) that advocated spreading "civilian" nuclear power technology and large reactors to states that promise to forgo nuclear fuel making--a spread that would bring countries within weeks or months of acquiring nuclear weapons. …

Journal Article
TL;DR: Activists from nongovernmental organizations and the media and some within the government have targeted a panoply of products, technologies, and industries that they dislike--pesticides, food additives, chemicals in general, pharmaceuticals, nuclear power, and biotechnology, among others--for opprobrium, over-regulation, and even extinction.
Abstract: ACTIVISM HAS LONG been part of the fabric of American life. It is often positive, as when it pushes for constraints on undue government intrusion into our lives. Sometimes, however, activism can be destructive. For instance, activists from nongovernmental organizations (NGOs) and the media, as well as some within the government, have targeted a panoply of products, technologies, and industries that they dislike--pesticides, food additives, chemicals in general, pharmaceuticals, nuclear power, and biotechnology, among others--for opprobrium, over-regulation, and even extinction. And it seems that no stratagem, no misrepresentation, no outright lie is too outrageous for them. Biotech: A favorite target BIOTECHNOLOGY HAS BEEN especially victimized by irresponsible activism. A prototypic example is professional activist Jeremy Rifkin's relentless, decades-old antagonism toward recombinant DNA technology, or gene-splicing, applied to the production of innovative new drugs, gene therapy for life-threatening diseases, agriculture, or anything else. Thirty years ago, he and his followers disrupted a public meeting, chanting, "We shall not be cloned," and displaying signs proclaiming, "Don't Xerox Life." That was hardly radical by the standards of the 1970s, but Rifkin's statement that biotechnology threatens "a form of annihilation every bit as deadly as nuclear holocaust" is extreme and baseless, a manifestation of a Big Lie--that biotech is untested, unsafe, unproven, unwanted, and unregulated--which is a mainstay of radical activism. A broad scientific consensus long has held that the newest techniques of biotechnology are no more than an extension, or refinement, of earlier ones applied for centuries, and that gene transfer or modification by gene-splicing techniques does not, per se, confer risk. Rifkin's assertions about biotechnology ignore the seamless continuum that exists between old and new biotechnology and the monumental contributions that both have made to medicine, agriculture, and innumerable scientific disciplines. The late Harvard evolutionary biologist Stephen Jay Gould, by his own admission, tried to be sympathetic to Rifkin's views but was overwhelmed by his "extremism" and "lack of integrity," and by his showing "no understanding of the norms and procedures of science." Gould characterized Rifkin's anti-biotechnology book, Algeny, as "a cleverly constructed tract of anti-intellectual propaganda masquerading as scholarship"; he said he had not "ever read a shoddier work." And then there is Greenpeace, which may have attained the nadir of anti-biotechnology activism when, in 1995, the organization announced that it had "intercepted a package containing rice seed genetically manipulated to produce a toxic insecticide, as it was being exported [and] swapped the genetically manipulated seed with normal rice." The rice seeds stolen by Greenpeace had been genetically improved for insect resistance and were en route to the International Rice Research Institute in the Philippines from the Swiss Federal Institute of Technology in Zurich. The modified seeds were to be tested to confirm that they would grow and produce high yields of rice without requiring lots of chemical pesticide. In the Philippines and many developing countries in Asia where rice is a staple, disease-resistant and insect-resistant rice are of course desperately needed, but this fact has not dissuaded Greenpeace from its opposition. The organization has actually told inhabitants of developing countries concocted tales of
gene-spliced crops causing homosexuality, illness, and baldness. In Africa, it has promulgated the myth that improved crops cause impotence and increase the spread of HIV/AIDS. Doreen Stabinsky, a so-called "science advisor" to Greenpeace International, has claimed that cotton fiber, animal feed, and cotton-seed oil from Bt-cotton plants can lead to an increase in the occurrence of antibiotic-resistant bacteria, including those that cause tuberculosis and gonorrhea. …

Journal Article
TL;DR: Opening with the scene in the movie Trading Places in which the leading characters Louis and Billy Ray toast their crushing victory over the nasty Duke brothers, the authors recall how in 1991 American observers declared that a global democratic capitalist revolution led by and modeled on the United States was imminent.
Abstract: THERE IS A wonderful concluding scene in the movie Trading Places, when the leading characters Louis and Billy Ray toast their crushing victory over the nasty Duke brothers. Stretched out on the beach, drink in hand, Billy Ray yells, "Looking good Louis." Louis stands in his yacht with a glass of champagne in one hand and Jamie Lee Curtis in the other and yells back, "Feeling good, Billy Ray." In 1991, the American foreign policy and academic communities were filled with Billy Rays and Louis's. The response in the United States to the collapse of the Soviet Union, and more generally to the "Leninist Extinction"--the abrupt, accelerated, and comprehensive demise of Leninist regimes worldwide--was a growing and euphoric expectation and declaration that a global democratic capitalist revolution led by and modeled on the United States was imminent. (1) The same prophetic ideas and expectations had failed to materialize after World War I and World War II, however. Victory in World War I led President Woodrow Wilson to call for democratic self-determination in Eastern Europe and the creation of a League of Nations to prevent aggression. The United States failed to join the League; the League failed to prevent Italian, German, and Japanese aggression; and by the mid-1930s, with the exception of Czechoslovakia, all of Eastern Europe was led by monarchical or military dictators. Nonetheless, anticipated victory in World War II led President Franklin Roosevelt to call for a United Nations where the Soviet Union and the United States would continue their wartime cooperation, prevent further instances of aggression, and oversee the democratic decolonization of Asia, Africa, and the Middle East. Only three years after that victory, the Soviet Union created a set of geographically contiguous replica regimes in Eastern Europe, and the Cold War had begun. As for decolonization, it produced dictatorships not democracies, socialism not capitalism, from Ghana to Egypt to Indonesia. Never mind. When the USSR collapsed, people thought, "Perhaps third time lucky." After all, the Soviet Union had been a much more substantial military, economic, and ideological challenge and threat to the West than Imperial Germany, Imperial Japan, Fascist Italy, and Nazi Germany. Furthermore, defeat of the major threat to liberal capitalist democracy was accompanied by a purported democratic tsunami, or at least a "third wave" of democratization, sweeping the world. If any further proof was necessary of liberalism and America's final and complete (to borrow a Stalinist phrase) victory in the world historical battle against communism, one had only to look at the mass transformation of Western socialist academics to postmodern film critics. Francis Fukuyama, in his 1989 essay "The End of History?" offered the most radical and theoretically sophisticated interpretation of the West's unique victory. In his words, "the triumph of the West, of the Western Idea" was "evident first of all in the total exhaustion of viable systematic alternatives to Western liberalism." He continued: "At the end of history it is not necessary that all societies become successful liberal societies, merely that they end their ideological pretensions of representing different and higher forms of human society."
Because in Fukuyama's mind liberalism has essentially, if not yet practically, let alone completely, solved the problems of war, poverty, equality, and scientific discovery, there is no room for new world historical competitors, only critics. Fukuyama's philosophical argument was supplemented at a lower level of generalization by Thomas Friedman's apologia for globalization, the technological logos for liberalism's victory. In The Lexus and the Olive Tree, Friedman saw globalization as the future, the global name and reality replacing the Cold War. According to Friedman, globalization was the world's "tiger"--an irresistible, benign, homogenizing force that would make nation-states relics. …

Journal Article
TL;DR: For example, NASA is scheduled to retire the last three remaining space shuttle vehicles, Discovery, Endeavour, and Atlantis, in 2010 as discussed by the authors, leaving a significant gap between the last shuttle flight and the first CEV flight, a gap that could strain or even undermine American international standing.
Abstract: WHAT IF, IN the midst of the epic contest to explore and colonize the New World, Britain--the greatest seafaring power of its day--had to mothball its naval fleet and rely on other countries to transport British men and material across the oceans? This much we know: With British subjects, ideas, and goods tethered to a little island off the coast of Europe, Britain and the world would be very different today. Something not too dissimilar is about to happen in the heavens, as the United States prepares to retire its fleet of space shuttles. For almost 30 years, the venerable, if imperfect, space plane has been America's workhorse in space, carrying astronauts, scientific experiments, and satellites into orbit, painstakingly building the International Space Station, and just as important, reviving America's self-confidence and reinforcing America's image as a pioneering nation. But by 2010, with the fleet grounded due to budget, age, and safety concerns, America will have no way of delivering its own astronauts into space. The hiatus could last almost 5 years. America and the world--and space--could be very different by then. NASA IS RETIRING the remaining shuttles--Discovery, Endeavour, and Atlantis--in order to make way for the Constellation program, which includes the Orion Crew Exploration Vehicle (CEV) and Ares I and V rockets. The Constellation program will incorporate "the best aspects of the Apollo and Shuttle systems," according to NASA. As the Government Accountability Office explains, "NASA is counting on the retirement of the Shuttle to free up resources to pursue a new generation of space flight vehicles." The problem is this: Those next-generation vehicles won't be ready until 2015. That leaves a significant gap between the last shuttle flight and first CEV flight--a gap that could strain or even undermine American international standing. How will we bridge that gap? The alternatives are grim, so grim that the best option appears to be purchasing "crew and cargo transport services from Russia and our international partners," in the worrisome words of one NASA official. As Michael Griffin, NASA administrator under President George W. Bush, observed in 2008, "It is dangerous for the United States to find itself dependent upon any external entity for a strategic capability, and space transportation is just that." His words were prescient, as became clear during Russia's blitzkrieg battering of--and slow-motion, scorched-earth withdrawal from--Georgia. Griffin told the International Herald Tribune that he ordered NASA to explore contingency plans for using the shuttle beyond 2010 "about 5 minutes after the Russians invaded Georgia." Griffin wasn't alone. During the presidential campaign, then-candidate Barack Obama voiced support for extending shuttle flights, calling on NASA to "take no further action that would make it more difficult or expensive to fly the Shuttle beyond 2010." Obama got a little wiggle room late last year, when Congress passed a measure that pays for an extra shuttle flight and for costs associated with delaying the planned retirement of the fleet. The delay could be expensive. According to estimates cited by the Orlando Sentinel, flying the shuttle beyond 2010 could cost some $4 billion per year. And because building new spaceships and retiring old ones is not like flipping a light switch, it is going to be very difficult to close the gap completely. The post-shuttle gap "is essentially unfixable now," according to Griffin.
This is due to the transfer of personnel and resources to the post-shuttle program, the end of contracts, and the conversion of systems and facilities. It's no wonder that the GAO has identified the shuttle's status as one of its 13 "urgent issues" for the new administration. Obama's first budget calls for following through on plans to retire the space shuttle fleet in 2010, allowing for the possible addition of just one extra shuttle mission. …

Journal Article
TL;DR: For example, during the First Chechen War, Russian troops were up against experienced guerilla fighters, and the Russian troops may have had plenty of hardware, but their weapons were not the ones suited to the situation as mentioned in this paper.
Abstract: OF ALL THE kinds of fighting, urban warfare is the one that soldiers hate most. Just ask the Russian troops who fought in Grozny during the First Chechen War. Green, untrained troops were up against experienced guerilla fighters: The Russians may have had plenty of hardware, but their weapons were not the ones suited to the situation. Tanks in particular proved clumsy in the streets, because they couldn't fire high and they couldn't fire low, and many were disabled by Chechen rebels. Otherwise, the Russian tactics in city fighting hadn't changed much since Stalingrad. In terms of sheer viciousness, the fighting in Chechnya was remarkable. In their very own version of Shock and Awe, the Chechen methods included snipers hiding behind the bodies of dead or dying Russian soldiers strung up in the windows, and the display of cut-off Russian heads along the roads leading into Grozny. Everything was booby-trapped, even an innocent-looking object like a football. The Russians would respond in kind by killing Chechens in less than gentlemanly fashion: Once, when they found some of their own men crucified and castrated in a village, they castrated all the village's male inhabitants. Having been badly bloodied in the First Chechen War, the Russians were determined to prevail in the second round. After pinning apartment bombings in Moscow on the Chechens, the Russians responded by leveling Grozny to the ground in 1999. They ringed the city with artillery, tanks, and rocket stations and let fly. Among the new features was the use of fuel-air explosives, which form an aerosol cloud that gets into every cranny of a building and ignites, crushing everyone inside. As Stalin might have put it, "No city. No city warfare." The Grozny model is not available to the Western democracies, but since urban warfare is becoming the most common type of fighting, they have had to come up with some other answers, and fast. As former Marine Corps Commandant General Charles Krulak predicted, "Our enemies will not allow us to fight the son of Desert Storm, but they will try to draw us into the son of Chechnya." Thus, military theorists like Britain's Michael Evans talk of "Mad Max" hybrid wars, combining elements of traditional and guerilla warfare, where irregulars, acting as proxies or on their own, now use the kind of weaponry previously reserved for states. Aware of their economic and technological disadvantages, such organizations seek to change the nature of the game, to play it by their rules, which means no rules. They seize the initiative, choosing when and where to strike. They wear no uniforms. They use civilians as shields, and use civilian homes, schools, hospitals, and mosques as command posts and weapons depots. When outgunned, they melt back into the civilian population. Their aim is to suck the conventional army into the city and bleed it in a war of attrition. And civilian casualties have become a great propaganda tool. The traditional army suddenly becomes the villain in the eyes of the press, because of misguided notions of fair play and lingering third-world romanticism. As Frank Hoffman drily observes in his groundbreaking study "Conflicts in the 21st Century: The Rise of Hybrid Wars," for Western democracies, these are not wars of choice. But still, we must fight them. Israel has had more urban-warfare experience than most countries, first in the Second Lebanon War and recently in the operation in Gaza.
As Hoffman notes, the war in Lebanon afforded the clearest instance of the new type of hybrid war, which is why U.S. officers have paid special attention to the lessons that can be drawn from it. Civilized nations need to strike a careful balance in urban warfare: They must spare the civilian population, but also kill terrorists. They must fight humanely, but not so humanely that they end up paralyzed: At some point, it becomes a question of "better their civilians than ours." Oddly enough, many influential Western opinion makers seem to prefer it the other way around. …

Journal Article
TL;DR: The authors in this article presented an analysis of the costs of federal regulations by type: economic, workplace, environmental, and tax compliance, and found that economic regulations cost businesses $295 billion in 2004, while environmental regulations cost companies $250 billion.
Abstract: WHAT IS THE cost to U.S. businesses of complying with federal regulations? In 2004, U.S. federal government regulation cost businesses in the United States an estimated $648 billion. (1) This cost burden has increased about 19 percent in inflation-adjusted dollars since 2000. Notwithstanding the many benefits to society of federal regulations, several indicators show that the cost for businesses of complying with these regulations is sizable and has been growing rapidly. Melinda Warren, director of the Weidenbaum Center Forum, notes that "expenditures of federal regulatory agencies and the trends in this regulatory spending over time ... are a proxy for the size and growth in regulations with which American businesses, workers, and consumers must comply." (2) A study by Susan Dudley and Melinda Warren reports that spending by federal regulatory agencies on regulatory activity reached $37 billion in fiscal year 2004. (3) This cost has grown 36 percent in inflation-adjusted dollars since 2000. Total staffing in federal regulatory agencies in fiscal year 2004 equaled 239,624 full-time equivalent employees; it grew by 38 percent between 2000 and 2004. More than two decades ago, President Reagan made regulatory relief (deregulation as well as slowing the growth of new regulations) one of his four pillars of economic growth. Reducing the paperwork burden required by federal regulations became essential to the process. He specifically used the term regulatory relief rather than regulatory reform to emphasize his desire to cut back regulations, not just make them more effective. In line with this idea, he created the Competitiveness Policy Council in 1988 to advise the president and Congress on policies to promote U.S. competitiveness. The CPC recommended a federal law calling for the executive branch to attach a Competitiveness Impact Statement (CIS) to any new legislative proposal to Congress that might affect U.S. competitiveness. A CIS would require all U.S. federal government agencies to produce a document describing how major proposed federal laws significantly affect the competitiveness of companies operating in the United States. The CPC ceased operation in 1997 when Congress stopped funding it, and the recommendation for a CIS was never approved. The concern has recently arisen that the United States is losing its position as a global leader. Bernard Schwartz from the Brookings Institution notes that "rising economies like China and India represent new challenges to our position on the playing field as they introduce millions of new workers into the labor market, millions of dollars of investment into scientific programs which prior to this have been an area where the Western nations had a more important role to play." (4) Tom Donohue, President and CEO of the U.S. Chamber of Commerce, acknowledges the delicate balance between regulation and competitiveness: "Let's not overlook the importance of government regulation. Smarter regulation can strengthen America's economic competitiveness and status as a global leader--excessive regulation can cause us to fall behind competition." (5) A timely issue is, then, whether U.S. federal regulation is smart or excessive. Regulation strangling competitiveness? FIGURE I (6) PRESENTS estimates of the costs to business of federal regulations by type: economic, workplace, environmental, and tax compliance.
Economic regulations--those that refer to government-imposed restrictions on the decisions a firm makes regarding price, quantity, entry, and exit--cost businesses $295 billion in 2004. This category also includes international trade and investment regulations that impose restrictions on foreign imports (e.g., quotas and tariffs). Social regulations cost businesses $250 billion in 2004. These are regulations that protect the public interest in the workplace (wages, benefits, safety, and health) and in the environment (e.g., water and air pollution). …