
Showing papers in "Policy Review in 1996"


Journal Article
TL;DR: Fairey, the registrar of voters in Yuba County, California, argues that bilingual ballots are an enormous waste of county resources; the larger problem, the article contends, is not that they go unused in Yuba County but that they are used in so many other places.
Abstract: California's Yuba County is getting ready to spend $12,000 this November on election materials that nobody will use. That's because the federal government forces local officials to print voting information in Spanish for every election. "Bilingual ballots are an enormous waste of county resources," says Frances Fairey, Yuba County's registrar of voters. In last March's primary election, this county north of Sacramento was forced to spend $17,411 on Spanish-language election materials. But, according to Fairey, "In my 16 years on this job, I have received only one request for Spanish literature from any of my constituents." The biggest problem with bilingual ballots, however, is not that they go unused in Yuba County, but that they are used in so many other places. Thousands of Americans are voting in foreign languages, even though naturalized citizens are required to know English. The National Asian Pacific American Legal Consortium estimates, for instance, that 31 percent of Chinese-American voters in New York City and 14 percent in San Francisco used some form of bilingual assistance in the November 1994 elections. Though these figures may be overstated, proportions anywhere near this magnitude are devastating to democracy. As Boston University president John Silber noted in congressional testimony last April, bilingual ballots "impose an unacceptable cost by degrading the very concept of the citizen to that of someone lost in a country whose public discourse is incomprehensible to him." A nation noted for its diversity needs certain instruments of unity to keep the pluribus from overrunning the unum. Our common citizenship is one such tool. Another, equally important, is the English language. It binds our multiethnic, multiracial, and multireligious society together. Not everyone need speak English all of the time, but it must be the lingua franca of civic life. Since the voting booth is one of the vital places in which citizens directly participate in democracy, it must be the official language of the election process. It is not, however, and political jurisdictions ranging from Yuba County to New York City can pin this mess on the perversion of voting-rights legislation. The Voting Rights Act of 1965 guaranteed blacks the right to vote in places, particularly the South, where they had been systematically blocked from electoral participation, often through the use of bogus "literacy tests." But as Manhattan Institute scholar Abigail Thernstrom has shown in her comprehensive book Whose Votes Count?, it did not take long for this important piece of civil rights legislation to expand in dangerous ways. After the Act's passage in 1965, civil rights groups toiled to expand its authority. When the law came up for reauthorization in 1975, Hispanic organizations argued that English-language ballots were the equivalent of literacy tests. People whose first language was Spanish needed special protections in order to vote, they claimed, citing low turnout among Hispanics. This was sheer quackery. Literacy tests in the South were used for the fraudulent purpose of keeping blacks away from elections. Low Hispanic turnout was mainly due to the fact that so many Hispanics were not citizens and therefore ineligible to vote. Nevertheless, Congress sided with the activists. 
It required bilingual ballots in any political district where "language minorities" made up at least 5 percent of the total population and less than half of the district's citizens were either registered to vote or had voted in the 1972 presidential election. It also required that bilingual election materials be made available to voters in every county in which the language-minority population had an "illiteracy rate" -- meaning "failure to complete the 5th grade," a trait that includes many immigrants--above the national average. Interestingly, "language minorities" were not defined by language (a cultural characteristic), but by ancestry (a genetic one). …

36 citations



Journal Article
TL;DR: The authors conduct what Charles Murray might call "a thought experiment" -- imagining a conservative administration inventing AmeriCorps -- and argue that most Republicans in Congress are on the wrong side not only of the politics (AmeriCorps is popular with voters) but of their own ideology.
Abstract: Let's conduct what Charles Murray might call "a thought experiment." Imagine it's 1993 and Newt Gingrich has been sworn in as president. In his Inaugural Address, he pledges to "dismantle the welfare state and replace it with an Opportunity Society." He appoints a task force of the party's most creative conservatives to ensure that citizen action will fill the void left by the withdrawal of government. There is, by no means, unanimity. The Cato Institute's Doug Bandow argues that as government recedes, charities and volunteer groups will naturally fill the gap. Arianna Huffington says that the nonprofit sector must become more effective and less bureaucratic. Gingrich agrees and advises the task force to look at Habitat for Humanity as a model for truly effective compassion -- inexpensive, nongovernmental, and faith-based. From Switzerland, William F. Buckley Jr. faxes in a chapter from his book Gratitude calling for a national-service program to engage young people in solving problems outside of government bureaucracies. Jim Pinkerton urges the re-creation of the Civilian Conservation Corps on a massive scale. Colin Powell reminds the group that the most successful race- and class-mixing program has not been busing or quotas but service in the U.S. Army. William Bennett argues that all government benefits ought to require something of the beneficiaries in turn, shattering the entitlement mentality created by years of Democrat-created welfare programs. Senator Dan Coats suggests that government's role should be confined to helping local community-based institutions solve their own problems. The task force decides unanimously that there should be no big federal program, with armies of Washington bureaucrats telling communities what to do. Instead, Washington would give money to states to help local community groups help themselves. And, inspired by Buckley, the members of the task force hit on an innovative idea. Instead of just giving grants to nonprofit groups, thereby creating nonprofit bureaucracies, they could model it after programs like the Jesuit Volunteer Corps, to which committed young people devote themselves for a year or two of service. The federal government would in turn provide that young person with a "service scholarship." This would, someone points out, establish a principle that the "educrats" in the higher-education lobby have always opposed: financial aid awarded not on need but merit, merit in this case defined as a willingness to serve one's country. Pollster Frank Luntz tells Gingrich that even though it's a decentralized, community-based program, the young people it engages should be linked together with a national spirit -- and name. Haley Barbour suggests "RepubliCorps" but Gingrich believes that might deprive it of bipartisan support. He asks his advisors to come up with a better name and gives them one bit of advice, "Don't be afraid to make it sound patriotic. Unlike the other party, we are not embarrassed to be Americans." So Luntz has a brainstorm: Let's call it "AmeriCorps." The reality, of course, is that Bill Clinton thought of AmeriCorps first, and most Washington Republicans ended up opposing it as typical Big-Government liberalism. Republicans in Congress are now on the wrong side not only of the politics -- AmeriCorps is popular with voters -- but of their own ideology. There is, however, a striking difference between the comments of Beltway Republicans and those in the rest of the country. 
New Hampshire governor Steve Merrill has called AmeriCorps "a great success in the state of New Hampshire." Michigan governor John Engler has said AmeriCorps "captures the promise found in all citizens." Arizona governor Fife Symington said he was "enthusiastic and impressed with the work of AmeriCorps." And Massachusetts governor William Weld called it "one of the most intelligent uses of taxpayer money ever." Let us explain why we think these Republican governors are right. …

10 citations


Journal Article
TL;DR: In the fall of 1995, despite the resistance of old-guard Republicans Bud Shuster and John Chafee, chairmen of the key committees in the House and Senate, legislation repealing the national speed limit passed both chambers by chasmic margins.
Abstract: In 1974, in the wake of the OPEC oil embargo, the federal government imposed the 55 mile-per-hour "National Maximum Speed Limit" (NMSL) on American motorists to conserve gas. But long after the fuel crisis had abated, the federal speed limit remained, backed by advocates who touted its supposed safety benefits. Even as these benefits were soon cast into doubt and drivers' compliance with the law began to erode, its supporters continued to lobby for the federal maximum. And for more than 20 years, Congress followed their lead. The Democratic leadership of the Congress blocked all efforts to overturn the national speed limit. Then last year, with their robust commitment to limiting government, the newly elected House and Senate freshmen pushed for a repeal of the federal speed limit. In the fall of 1995, despite the resistance of old-guard Republicans Bud Shuster and John Chafee, chairmen of the key committees in the House and Senate, legislation repealing the national speed limit passed both chambers by chasmic margins -- 419-7 in the House, by voice vote in the Senate. On November 28, 1995, President Clinton reluctantly signed the legislation. The endurance of this anachronism is but one sign of Washington's bureaucratic paternalism; its demise, however, may augur a new era of limits on federal authority. The federal maximum was not, technically, mandatory. It was enforced through the power of the purse -- federal highway funds were withheld from states refusing to post the new limit. Naturally, every state capitulated. Building on legislation in 1987 and 1991 that allowed states to raise the limit to 65 m.p.h. on rural interstates and freeways designed for high speeds, the repeal removes the threat of reduced funding, freeing all states to set their own speed limits. Since the repeal, more than 20 states have raised their limits, including seven within a week of the effective date of the repeal. Four of those states -- Oklahoma, Wyoming, Nevada, and Arizona -- posted a top speed of 75 m.p.h., while Montana eliminated its daytime speed limit for passenger vehicles, reverting to its pre-1974 law requiring drivers to maintain a "reasonable and prudent speed." Once liberated, state legislatures ushered in these increases by lopsided margins. Missouri's House and Senate, for example, voted by 122-34 and 28-2, respectively, to raise the maximum limit to 70 m.p.h. Idaho's lower and upper chambers voted to increase limits by 56-12 and 28-7, respectively. Higher speed limits were voted down in Ohio, Connecticut, Kentucky, Virginia, and Louisiana. That's the beauty of a federal system. States can now set their own speed limits after assessing local conditions; some will increase their limits and some will not. The swift actions of these legislators were bolstered by the support of motorist advocates. "We believe that the states will do as good a job, or even a better job, at setting and enforcing the limits," says Bill Jackman, a spokesman for the American Automobile Association. "In many states the federal limits were really disdained. . . . All of the state departments of transportation are responsible institutions with respectable people working in them. We trust them to do their jobs well." With 39 million members nationwide, AAA represents the interests of almost a quarter of all licensed drivers. State traffic engineers and highway patrolmen, whose reputations depend on drivers' safety, added their applause.
"Putting the decision up to the individual states was a good idea," says Ken Howes of the Florida Highway Patrol. "Our Florida Department of Transportation has done excellent research on what speeds are appropriate. We are very comfortable in Florida if a speed is raised, because we know it is not done haphazardly." Adds P.D. Kiser, the chief traffic engineer at the Nevada Department of Transportation: "The repeal was way overdue; the 55 m.p.h. speed limit was an experiment that failed miserably. …

7 citations


Journal Article
TL;DR: Coats introduced the Project for American Renewal, a package of 19 legislative proposals designed to use public policy and public resources to energize mainly private efforts to meet human needs by empowering local, community-based institutions that are addressing social problems.
Abstract: With responses from Gertrude Himmelfarb, Don Eberly & David Boaz. In their 1975 book To Empower People, Richard John Neuhaus and Peter Berger challenged policy makers to protect and foster the "mediating structures" -- neighborhood, family, church, and voluntary associations -- that stand between the private individual and large government institutions. "Wherever possible," they argued, public policy should rely on these mediating structures for the realization of "social purposes." Twenty years later, Washington is heeding the call. In October 1995, Senator Dan Coats introduced a package of legislative proposals to help empower local, community-based institutions that are addressing social problems. Crafted with the help of William J. Bennett, a codirector of Empower America, the "Project for American Renewal" comprises 19 separate bills designed to use public policy--and public resources--to energize mainly private efforts to meet human needs. Coats defended his legislation at a recent symposium at The Heritage Foundation; what follows are his remarks and critiques by some of the symposium's participants. Senator Dan Coats: Re-funding Our "Little Platoons." An intellectual revolution is underway concerning the nature of our social crisis. It is no longer credible to argue that rising illegitimacy, random violence, and declining values are rooted in the lack either of economic equality or of economic opportunity. These positions are still current in our political debate, but they have lost their plausibility. America's cultural decay can be traced directly to the breakdown of certain institutions -- families, churches, neighborhoods, voluntary associations -- that act as an immune system against cultural disease. In nearly every community, these institutions once created an atmosphere in which most problems -- a teenage girl "in trouble," ... the local high school -- could be confronted before their repetition threatened the existence of the community itself. When civil society is strong, it infuses a community with its warmth, trains its people to be good citizens, and transmits values between generations. When it is weak, no amount of police or politics can provide a substitute. There is a growing consensus that a declining civil society undermines both civility and society. In this discussion, the importance of civil society is something I want to assume, not argue. But it leads to another question: Does this intellectual revolution have political consequences? Should it influence the agenda of Congress, or is it irrelevant to our work? Let me begin by saying what is not at issue. I do not argue that government is sufficient to the need. Nothing short of a Great Awakening, as Gertrude Himmelfarb has noted, is sufficient to the need. But I do argue that conservatives have duties beyond waiting for another Charles Wesley. Realism about government's limits is not a substitute for effective public policy. I am convinced that political reflection on these themes is important not only for its own sake; it provides the next, necessary stage of the conservative revolution. Government clearly has had a role in undermining civil society. Families, churches, and community groups were forced to surrender their authority and function to bureaucratic experts. Fathers were replaced by welfare checks, private charities were displaced by government spending, religious volunteers were dismissed as "amateurs," whole communities were demolished in slum-clearance projects. The power to replace an institution is the power to destroy it.
So the first item on the political agenda is a re-limited government, leaving enough social space for civil society to resume its role. But this brings us to a problem. The retreat of government does not automatically result in the rebirth of civil society. It is a necessary condition, but not a sufficient one. As Professor John DiIulio has observed, when a victim is stabbed, you need to remove the knife. But removing the knife will not heal the wound. …

4 citations


Journal Article
TL;DR: The business corporation is in its essence a moral institution within which it is possible to act either morally or immorally; by its own internal logic and inherent moral drive, business requires moral conduct, and other moral obligations fall upon it through the moral and religious commitments of its members.
Abstract: Election-year politics and widespread anxiety about economic security have recently prompted fierce public discussion of the "social responsibilities" of business. The debate has often been dominated by caricatures of business leaders callously ordering layoffs while earning unjustifiably high pay. Some seek to define "responsibility" as an ethical obligation owed not to the owners of an enterprise, but to its "stakeholders," including employees and their families, customers, and all citizens of this country. We have not developed a very sophisticated understanding of business as a moral calling. Many people educated in the humanities and the social sciences are uncritically anti-capitalist, and think of business as vulgar, philistine, and morally suspect. Popular culture treats big business as its favorite villain; it sees no ethical dimensions inherent in business activities. But the business corporation is in its essence a moral institution. Within this institution it is possible to act either morally or immorally, and by its own internal logic and inherent moral drive, business requires moral conduct. Other moral obligations fall upon it through the moral and religious commitments of its members. Thus, those who labor within the business corporation have many moral responsibilities and a richly various moral agenda. It may help to divide these responsibilities into two different sets. The second set will easily be recognized as "ethics," since the source of its authority comes from outside business--from religious conviction, moral traditions, humane principles, and human-rights commitments. The first set consists of the moral requirements necessary for business success. These are the virtues necessary for building a good business. These are not always recognized as ethical in their own right. One way to see that they are ethical is to ask yourself what happens when they are violated. If you think earning a profit is a morally neutral rather than morally good way to acquit a responsibility, would you hold that deliberately running losses is ethical--particularly if it's with someone else's money?
Too many analysts neglect a basic point: Simply to succeed in business imposes remarkable moral responsibilities. The business corporation is not a church, not a state, not a welfare agency, not a family. A corporation is an economic association with specific and limited functions. The primary moral duty of business is to fulfill responsibilities that arise from its own nature, moral ideals inherent in business as business. Seven Internal Responsibilities: There are at least seven such corporate responsibilities. I call these internal responsibilities, because a business must fulfill them simply for it to be a success at what it was founded to do. 1. To satisfy customers with goods and services of real value. This virtue is not so easy to practice as it seems. Some three out of five new businesses fail--perhaps because the conception their founders have concerning how to serve the customer is not sufficiently realistic, either in its conception or in its execution. Like other acts of freedom, launching a new business is in the beginning an act of faith; one has to trust one's instincts and vision, and hope that these are well enough grounded to build success. It is the customers who, in the end, decide. One set of responsibilities assumed by a business is to its customers. These responsibilities have moral content. 2. To make a reasonable return on the funds entrusted to the business corporation by its investors. It is more practical to think of this responsibility in the second place rather than in the first, where some writers place it, because only if the first is satisfied will the second be met. Milton Friedman has made the classic case for this fundamental social responsibility. I agree with him in stressing how basic it is, but would place it also in the context of other responsibilities. …

4 citations


Journal Article

4 citations


Journal Article
TL;DR: The Hebrew Bible teaches that it is not good for man to be alone, yet, as the Bible itself indicates, it is also difficult for human beings to live together; hence the human need for association, relationships, and stable structures within which those relationships can grow.
Abstract: There are two fundamentally different ways of thinking about human association. Consider two phrases we associate with the Greek and Jewish traditions and their most distinguished representatives, Aristotle and Maimonides. Aristotle described man as a political animal. Maimonides described man as a social animal. Those two phrases tell two different stories about the human condition. Both are true, but they describe different aspects of our collective life and give rise to different institutions. Man as a political animal creates the institutions of political society: states, governments, and political systems. Man as a social animal creates the institutions of civil society: families, communities, voluntary associations, and moral traditions. As we approach the millennium, the reinvigoration of these civil institutions is the single greatest challenge facing the liberal democracies of the West. But let me begin at the beginning: "In the beginning, God created heaven and earth." The biblical narrative of creation is constructed in a distinctive literary form. Repeatedly we read: "And God said, 'Let there be ....' And there was .... And God saw that it was good." It is therefore discordant to hear suddenly, for the first time, the phrase "not good." What, in creation, is not good? "It is not good for man to live alone." The first statement of the Bible about the human condition is the sanctity of the individual as individual. Every human being is in the "image of God." But the second statement about the human condition is the incompleteness of the individual as individual. "It is not good for man to be alone." Hence the human need for association, relationships, and for stable structures within which those relationships can grow. In fact, much of the rest of the Hebrew Bible is the story of the unfolding of those relationships, from the nuclear family of Adam and Eve, to the extended family of Abraham and Sarah and their children, to the confederation of tribes in the days of Moses and Joshua, to the sovereign state in the ages of kings and prophets. So the Hebrew Bible begins with the recognition that it is very difficult for human beings to live alone. But as the Bible itself indicates, it is also difficult for human beings to live together. With Adam and Eve come conflict, with Cain and Abel, fratricide. And by the generation of the Flood in Genesis 6, "the earth was full of violence." How then do we move from unbearable isolation to some form of tolerable association? I want to tell two stories, both implicit in the Bible, but quite different in their implications. The first is told in its most famous form by Thomas Hobbes in the Leviathan. Hobbes, we recall, begins his story at the point which the Bible describes as the generation of the Flood, and which Hobbes called the "state of nature," in which human beings "are in that condition which is called War; and such a war, as is of every man, against every man" in which life is "solitary, poor, nasty, brutish and short." How then, given the human tendency to conflict, do human beings form societies? Hobbes's answer: The fear of violence and death. Some of us are stronger than others, but none is so strong that we are invulnerable to attack. Indeed each of us has reason to fear the preemptive attacks of others. Therefore it is in our essential interests as individuals, as a precondition of peace and security, to hand over some of our powers to a supreme authority that will make and enforce laws. …

3 citations


Journal Article
TL;DR: Wisconsin's W-2 (Wisconsin Works) initiative is based on the principle that for those who can work, only work should pay; it is only by performing some level of work that participants will receive economic assistance.
Abstract: What will the states do differently once they have block grants? The short and simple answer: We will put our ideas to work. Once a block-grant system becomes operational, several key areas of public assistance will be set loose from the dead weight of federal oversight. For the most part, the block-grant system frees states from the shackles of a waiver system that has inhibited innovation and stymied true reform. Prior to recent congressional action, if a state passed an innovative welfare reform initiative, it would have to go to Washington on bended knee and kiss somebody's ring in order to get a waiver to implement the program. In Wisconsin, we know the waiver system all too well. We have sought and received 227 different waivers to implement our innovative welfare programs. And we know that you never get everything you ask for in a waiver request, either. If your waiver asks for 10 items, you are lucky if Washington grants you three. In Wisconsin, the block-grant system means we will be able to implement the most ambitious change in the welfare system that this country has ever seen. The changes we expect to enact are called W-2, or Wisconsin Works. This initiative would end the automatic welfare check. It would end AFDC. And once implemented, W-2 will mark the greatest change in social policy in this country in the past 50 years. In Wisconsin, we will no longer have a welfare system; we will have a work system. In fact, we have already transferred our welfare division from our Department of Health and Human Services to our Department of Industry, Labor, and Human Relations, and have made it a work division. The reason: We are replacing the automatic welfare check with a comprehensive package of work options, job training, health-care and child-care services, and even financial planning. And we know W-2 will succeed in moving people off welfare and into the workforce. Wisconsin has been reforming the welfare system longer than any other state by far, having begun when I took office in 1987. Since then, our AFDC caseload has dropped 30 percent, and we're saving our taxpayers $17 million a month. Imagine what Wisconsin and other states could do with the freedom and flexibility provided by the block-grant system. In a recent study, Princeton University professor Lawrence Mead concluded that Wisconsin cut its welfare dependency because it tied benefits to work. W-2 is based on the philosophy that for those who can work, only work should pay. We assume that everybody is able to work--or at least is able to contribute to society through some work activity within their abilities. The only way to escape welfare or to escape poverty is to work. There is no other way. Therefore, W-2 helps the traditional welfare recipient make the transition to employment and an independent life as quickly as possible. When people seek economic assistance (or what used to be known as AFDC) from the state, they will be required to sign up for work immediately. And it is only by performing some level of work that they will receive economic assistance. W-2 consists of a four-rung job ladder designed to help participants climb out of poverty and government dependence and into a life of independence and self-reliance. The top rung is unsubsidized employment. Under this option, participants who are job-ready will be matched up with the best available job. This is the ideal option--the rung on the ladder that all participants should aspire to reach. A W-2 participant directed into unsubsidized work is also eligible for, depending on his income, food stamps, health care, child care, and the earned-income tax credit. The second rung on the ladder is subsidized employment, or trial jobs. This rung is for people without a work history but with a willing attitude. Employers who hire people on this rung get help in offsetting some of the costs of training and trying out a new employee. The wage subsidy is limited to between three and six months. …

3 citations


Journal Article
TL;DR: The Clinton presidency has incorporated a message of personal responsibility and citizenship into the rhetoric of the Democratic Party -- at first blush a sharp departure from the McGovernite ideology that has dominated American liberalism for the past 25 years -- but Clinton has used that rhetoric to justify expanding the welfare state.
Abstract: The greatest achievement of the Clinton presidency has been to incorporate a message of personal responsibility and citizenship into the rhetoric of the Democratic Party. On the campaign trail in 1992, candidate Bill Clinton stressed "individual responsibility," sounding at times truer to conservative principles than George Bush. Clinton spoke eloquently of his efforts in Arkansas to preserve "the things we cherish and honor most about our way of life -- solid, middle-class values of work, faith, family, individual responsibility, and community." As president, he frequently invokes these Reaganesque themes. But his conception of citizenship is deeply flawed. Clinton has used the rhetoric of citizenship to justify his efforts to expand the welfare state. He has contradicted his emphasis on personal responsibility by working to preserve bureaucratic federal control of social programs. The revival of citizenship requires a transfer of power from government to families, voluntary associations, and communities. The president has instead sought to reinvigorate civic life by trying to relegitimize government. At first blush, President Clinton's language of citizenship seems a sharp departure from the McGovernite ideology that has dominated American liberalism for the past 25 years. This ideology sees crime, drug abuse, family breakdown, and other social crises not primarily as deviant individual behavior but as social pathologies caused by underlying inequities. Moreover, in the McGovernite view, these pathologies are so complicated that traditional institutions such as churches and voluntary associations cannot possibly address them. Only policy specialists trained in the delivery of social services--therapeutic-state elites--are up to the task. The growth of McGovernism was accompanied by a massive transfer of power and resources to government. City halls, state capitals, and Washington, D.C., would administer the programs that, liberals promised, would solve America's social crises. By the mid-1980s, it had become apparent, even to Democrats, that liberalism's faith in the social-welfare state was misplaced. A growing body of research indicated that the expanded social-welfare bureaucracy, more often than not, worsened rather than relieved the problems it tried to solve. These policy failures also had clear political consequences. The Democratic Party began to lose traditional blue-collar voters, its former bedrock of support, who had become ever more frustrated by the mounting failures and rising costs of the welfare state. The liberal principles that undergirded the McGovernite approach--especially the effort to blame society for failures of personal responsibility--were at odds with the values of middle-class Americans. After Walter Mondale's landslide 1984 loss to Ronald Reagan, Democratic activists such as then-Governor Bill Clinton of Arkansas, then-Governor Charles Robb of Virginia, and Senator Sam Nunn of Georgia, banded together through the Democratic Leadership Council to bring Democrats back into the mainstream. They called themselves "New Democrats," drawing ideas from a school of political theory called communitarianism. Communitarians argued that man fundamentally is not a rights-bearing, property-seeking individual, but a social being who flourishes in stable associations and communities.
Communitarians criticized both liberals for placing too much faith in bureaucracy and conservatives for stressing the power of the marketplace; they claimed that bureaucracy and the marketplace together had displaced the community as the locus of American life. In the communitarians' view, 19th-century America offered the ideal model of community, with an active citizenry working locally to address shared concerns. Late 20th-century America, however, had witnessed the growth of big government and big business, which served to isolate and atomize Americans. The weakening of families and local community institutions, the communitarians warned, was beginning to erode fundamental elements of the American way of life. …

3 citations


Journal Article
TL;DR: America's 3,000 pregnancy centers have been rescuing women and their unborn children for 30 years now and are shaping attitudes about abortion and the unborn--one person at a time--in a decidedly pro-life direction.
Abstract: A story like this begins perhaps a thousand times every day: A woman's hand trembles as she scans the big-city Yellow Pages. The ads for abortion clinics have flowers and birds and slogans about caring, and one shows a pretty couple grinning at each other at the seashore. That makes her start crying again. Her boyfriend never looked at her like that. But elsewhere on the page she sees an ad showing a woman curled around a baby. The phone number ends with the letters H-O-P-E. She hadn't thought about hope, but she feels like she really needs to talk to someone who has some, right now. She dials the number. Half a year later, she is stepping out of the Hope Pregnancy Center. It's her first visit since the birth. Her little boy fusses while everyone hugs her. She stands for a moment in the spring sunshine. She's made it through these months after all. It's taken courage, and it's been tough, but this precious child has been brought through safely and given the gift of life. Without the love of the women at the center, this child would have died. She looks down at his wizened face. He is all she has now. Tears prick her eyes again, as she turns and walks to the bus stop. What happens to mother and child next? "It horrifies me sometimes," says Pat Evans, the unpaid director of Birthright, a crisis-pregnancy center in Annapolis, Maryland. "She's on the list for public housing, but that housing is invariably in bad sections. But if she turns it down, she's off the list, or put back on at the bottom to wait all over again." And how does she support herself? "She probably gets $225 a month on welfare, and there's food stamps, WIC [nutrition aid to women, infants, and children], and medical assistance," says Evans, who has counseled thousands of women in her 16 years there. "In all these years, I've seen less than a half-dozen find a way to work. Once they have that baby, how can they find a job that pays enough to buy a car to get to the job, and cover day care as well?" Evans's center, which assists about 1,200 women a year, can help some with housing for a year after the birth, but can't offer permanent housing or employment. "I don't see any answer to that; it's almost impossible. When she has a child, everything gets very hard." America's 3,000 pregnancy centers have been rescuing women and their unborn children for 30 years now. When we list the nonbureaucratic, grass-roots organizations that are doing the most to save threatened lives, pregnancy centers must rank near the top. These privately funded, storefront operations offer women material and emotional support free of charge, and they help hundreds of thousands each year to have their babies. Perhaps more than any other institution, these centers are shaping attitudes about abortion and the unborn--one person at a time--in a decidedly pro-life direction. While conservatives debate the wisdom of a pro-life agenda for the 1996 presidential campaign, tens of thousands of volunteers are quietly keeping a candle glowing through the darkest night of many women's lives. In the old joke, a grandmother peevishly greets the exhausted hero who has just rescued her grandson from a frozen lake: "But where are his mittens?" The pregnancy centers and their staff endure difficult, lifesaving work, little or no pay, and obscurity occasionally alleviated by insult. It hardly seems right for those of us who value these centers' hard work to punctuate our praise with grumbling about lost mittens. 
But the recent evolution in society's attitudes toward welfare compels a closer look. For all the good crisis-pregnancy centers do, 80 to 90 percent of their clients--about 200,000 a year--eventually set up single-parent households. Most of these mothers and their children soon find themselves on their own, with no reliable means of financial support. Many end up permanently dependent on government welfare. Some even slip into homelessness. …

Journal Article
TL;DR: In August 1992, U.S. marshals sought to arrest white separatist Randy Weaver at his remote mountain cabin at Ruby Ridge, Idaho; the resulting confrontation left a federal marshal and Weaver's 14-year-old son dead.
Abstract: In recent years, two tragic events have fundamentally changed the way many Americans view federal law-enforcement agencies and jeopardized public confidence in the federal government itself. In August 1992, U.S. marshals sought to arrest white separatist Randy Weaver at his remote mountain cabin in Ruby Ridge, Idaho. A confrontation resulted in the death of a federal marshal and Weaver's 14-year-old son. Special FBI teams were called in, and during the siege that followed, Weaver's wife was killed by an FBI sniper's bullet. At his subsequent trial, Weaver was acquitted of all but the most minor offenses. Since Ruby Ridge, federal law enforcement as a whole has come under intense congressional and media scrutiny. Even those normally supportive of the police ask: Should the federal government have risked this loss of life and expended $10 million to capture a hermit whose only alleged crime was selling two sawed-off shotguns to an undercover federal agent? In February 1993, a 51-day standoff began when agents of the Bureau of Alcohol, Tobacco, and Firearms (BATF) tried to apprehend the leaders of the Branch Davidian cult at their compound in Waco, Texas, for the violation of federal gun laws. Four BATF agents lost their lives in the initial foray, and the subsequent siege ended in the deaths of 85 cult members, killed either by gunfire or by a suicidal fire that was ignited when FBI agents stormed the compound. Again, conflicting versions of events led to congressional hearings as well as public and media criticism. Both these tragedies are the direct result of federal jurisdiction in crimes once considered wholly within the province of state and local police agencies. In neither incident did the underlying crime involve interstate activity or pose a threat to the federal government. Without the federalization of laws regulating firearms, a matter left to the states during most of our country's history, neither the BATF nor the FBI would have had jurisdiction at Ruby Ridge and Waco, and any law-enforcement actions would have been handled locally, if at all. 3,000 Federal Crimes: "We federalize everything that walks, talks, and moves," in the words of the chairman of the Senate Judiciary Committee from 1986 to 1994. Unfortunately, this is not much of an exaggeration: Today there are more than 3,000 federal crimes on the books. Hardly any crime, no matter how local in nature, is beyond the reach of federal criminal jurisdiction. Federal crimes now range from serious but purely local crimes like carjacking and drug dealing to trivial crimes like disrupting a rodeo. President Clinton's 1994 crime bill alone created two dozen new federal crimes. The bill federalized such crimes as drive-by shootings, possession of a handgun near a school, possession of a handgun by a juvenile, embezzlement from an insurance company, theft of "major artwork," ... official assisting a federal law-enforcement agent. Although many of these crimes pose a real threat to public safety, they are already outlawed by the states and need not be duplicated in the federal criminal code. The federal judiciary, for its part, has imposed national standards on all state criminal proceedings. In cases like Mapp v. Ohio and Miranda v. Arizona, the Supreme Court overruled state law and forced every state to exclude evidence under certain conditions. This massive federalization of the rules of evidence greatly increased the rights of criminal defendants and further burdened law-enforcement agencies throughout the nation.
In addition, the federal courts have virtually taken over such vital state functions as the operation of prisons and mental hospitals. By 1993, the federal courts operated 80 percent of all state prison systems in America. Amazingly, federal judges determine virtually every detail of these prisons, including standards for food and clothing, grievance procedures, and cell space per convict. The nationalization of criminal law has included the criminalization of environmental and regulatory laws. …

Journal Article
TL;DR: Koslowski, who had no prior criminal record, is serving time in a Pennsylvania federal prison for credit-card fraud; he is one of tens of thousands of young people who fall victim to America's gambling obsession every year.
Abstract: Flush with a handful of money he had just won at a bowling tournament, Joe Koslowski invited some friends to celebrate with him at the nearby Atlantic City casinos. Joe, then 16, and all his buddies were allowed in despite the age limit of 21. Once inside, Joe's good fortune continued; he parlayed his bowling winnings into a couple of thousand dollars. After his initial success, Joe returned to the casinos frequently. His winning streak eventually ended, but his taste for the thrill of gambling did not. Once out of cash, he opened credit accounts under family members' names, using cash advances from the credit cards to gamble. The whole scheme finally came crashing in on Joe last year, after he had amassed $20,000 in debt. Now at age 20, Joe, who had no prior criminal record, is serving time in a Pennsylvania federal prison for credit-card fraud. Joe is one of tens of thousands of young people who fall victim to America's gambling obsession every year. At least three-quarters of the nation's teens engage in some form of gambling. Much of it, of course, is fairly innocuous and occurs among peers: weekend poker games, betting on football, the annual NCAA basketball tournament pool. Adolescents have become increasingly adept, however, at gaining access to state-sanctioned gambling--lotteries, casinos, electronic poker--which often becomes a bridge to compulsive or addictive behavior. In 1995, University of Minnesota researchers reported that more than half of underage Minnesota teens surveyed had participated in some form of legalized gambling. An earlier survey of Atlantic City high-school students revealed that nearly two-thirds had gambled at the city's casinos. It is becoming painfully apparent that the only jackpot awaiting many of these young people is a life out of control: More than a million adolescents are already addicted to gambling, according to Durand Jacobs, a clinical professor of psychiatry at Loma Linda University Medical School and an expert on youth gambling. Further, Jacobs says, the gambling addiction rate among teens is three times that among adults. In a recent review of major youth-gambling studies in North America, Howard Shaffer, director of the Center for Addiction Studies at Harvard Medical School, concluded that roughly one in six teens experiences gambling-related problems, while about 6 percent are actually addicted, or pathological, gamblers. The New Jersey Council on Compulsive Gambling, which operates a national toll-free hotline (1-800-GAMBLER), received calls concerning young gamblers 4,300 times in 1994, accounting for 11 percent of total calls. Ed Looney, the council's executive director, says many of these young people find themselves in desperate straits. He tells of a call regarding a 16-year-old who had slit his wrists after losing $6,000--four years of newspaper delivery earnings--on the lottery in a single day. He tells of the college student from the Midwest who dropped out of school because he lost his tuition money gambling; of the 19-year-old New Jersey youth who sold his car for a fraction of its value so he could get back into the casinos; of the numerous calls from kids too scared to go back to school because they can't pay back their bookies. The phenomenon of youth gambling is not entirely new, but its rapid growth and startling magnitude are alarming. Says Valerie Lorenz, head of Baltimore's Center for Compulsive Gambling, "We never saw a teenage gambler 10 years ago.
Now we see them regularly." Moreover, the most addictive forms of gambling--eagerly promoted by more and more state governments in search of tax revenue--can produce ripple effects in young lives that undermine families, communities, and civic order. As we move into the next century, Shaffer says, "We're going to have major issues with youth gambling that will equal or eclipse the problems that we have with substance abuse." A Slippery Slope? Experts draw a distinction between the "problem" gambler and the "pathological" gambler. …



Journal Article
TL;DR: A year ago, a 16-year-old constituent of mine, Jeff Gardner, died from a lethal combination of "huffing" gasoline and smoking marijuana, and his mother, who was aware of a much larger drug problem in the community, called a parents' meeting at the local high school.
Abstract: A year ago, a 16-year-old constituent of mine, Jeff Gardner, died from a lethal combination of "huffing" gasoline and smoking marijuana. After Jeff's death, his mother, who was aware of a much larger drug problem in the community, called a parent's meeting at the local high school. No one came. She told me her story and asked how her representative in Congress was going to help address the growing drug problem. It was a fair question, but I was not satisfied with the response I could give her. Members of Congress take seriously their responsibility to represent their constituents in Washington -- by legislating, by voting, and when appropriate, by securing federal funding for state and local concerns. Despite spending $13 billion annually on drug-control programs, however, drug abuse is rising dramatically among our youth. A big part of the problem has been the president's failure to show any leadership on this issue (until his wise appointment of General Barry McCaffrey as the new drug czar). In fact, President Clinton hurt the antidrug effort by cutting the Office of National Drug Control Policy from 147 to 25 full-time positions, by hiring a surgeon general who advocated legalization of drugs, by cutting funding for interdiction efforts, and by sending confusing messages about the stigma of illegal drug use. It is no surprise, then, that after dramatic reductions in drug use during the decade before Clinton took office, drug use has nearly doubled among teenagers during his administration. The evidence shows that national leadership is critical in reducing drug abuse. Jeff's mother wanted that leadership, but in a manner that would help her in Goshen, Ohio. Spending more federal dollars on drug-control programs was unlikely to directly touch this mother's life. Neither would it encourage other parents in her community to address the drug problem. How could I really help? By rolling up my sleeves and providing leadership where it matters most--at the local community level. Members of Congress are in a unique position to mobilize people in their own communities. By the nature of our jobs, we deal with every sector of the districts we represent. We can also bring statewide and nationwide expertise and resources to bear on a problem. And we can draw the attention of news media that is so critical to educating and mobilizing neighborhoods to solve their toughest social problems. What I've initiated -- and what I'm challenging my colleagues in Congress to embrace -- is a new model of governance that recognizes the limitations of Washington-based solutions, while drawing on the resources of citizens locally. Cause for Alarm If there is any public-policy area that demands a new, more effective approach, it is drug abuse. Recent Gallup polls show that crime and drugs are Americans' top concerns. When you ask parents and children what is the most serious issue facing youth today, both groups cite drug abuse. National statistics show there is indeed cause for alarm. After a decade of progress in the war on drugs, the number of young people using drugs began to increase in 1992; use among young kids showed the sharpest increase. LSD use is now at its highest level since 1975, when it was first measured. Since 1992, the number of children between 12 and 17 using marijuana has nearly doubled. To put the problem in perspective, in the average class of 25 eighth-graders (13- and 14-year-olds), five are now using marijuana. 
And drug abuse is implicated in other social problems -- violent crime, dropout rates, and domestic violence, to name a few. Greater Cincinnati has a drug problem that mirrors the startling national statistics. This region of the country experienced a similar decline in drug use in the 1980s. But by the early 1990s, it began to skyrocket. Why? I'm convinced that the resurgence stems in part from the disappearance both of effective national leadership in the fight against drug use and of the media attention that usually follows such leadership. …

Journal Article
TL;DR: The 1996 presidential primaries have pushed the plight of American workers to the top of the political agenda; the article argues that the largest educational system in the United States is not the public schools or higher education but the training workers receive from their employers or in the private marketplace.
Abstract: The 1996 presidential primaries have pushed the plight of American workers to the top of the nation's political agenda. Some commentators argue that unfair foreign competition and callous executives have destroyed the social contract between employers and employees. Others say that inexorable economic and social trends, such as the Information Revolution, are enlarging the prospects for highly skilled workers while displacing those with less education. All sides seem to agree that America must raise the educational level of its workers. But would-be reformers often forget that the largest educational system in the United States is neither the public schools nor institutions of higher learning. It is the education and training workers receive from their employers or in the private marketplace. Workers learn new skills, information, and problem-solving strategies virtually every day, in formal courses or on the job. Motivated by competition and the search for greater value, many businesses try to raise productivity through training programs. In the long run, such efforts pay off in larger profits, better products, and higher wages. Training magazine's annual survey of employee-training programs helps to clarify the issue. In 1995, about 50 million workers received some kind of formal training, up 25 percent since 1990. Employers spent more than $50 billion on formal training in 1995. Based on a ratio of formal to informal training documented in other studies, one can estimate that firms spent approximately $300 billion in informal training that same year, including the cost of supervision and lost-production time. Nearly 90 percent of companies with more than 100 employees provided courses in management skills or basic computer skills. Many also taught skills in communication, supervising, and clerical work. While 70 percent of employers provided training for middle managers and executives, only about 40 percent did so for salespeople and production workers. Such workers are more likely, however, to receive informal, on-the-job training not included in these statistics. Why do companies train their employees? Workers who become more skilled in their jobs are more productive. Also, studies show that training can raise employee loyalty and reduce turnover, in part because workers often find their skills to be more valuable at the firm where they were acquired. One way to gauge the value of workplace training is to see how many workers needed some to qualify for the jobs they currently hold. A 1991 survey by the U.S. Bureau of Labor Statistics (BLS) found that 57 percent of workers said they needed training to qualify for their current jobs. About half of those workers learned such skills on the job. Another 21 percent received their training from formal company programs. Much smaller percentages said they obtained necessary skills from high school or post-secondary educational programs. Similarly, when BLS asked workers where they had received any training to improve skills, formal and informal company programs again accounted for about three-quarters. One interesting finding of the study was that many workers with college degrees earn less than workers without college degrees who nevertheless have obtained specific training for their jobs. BLS economist Alan Eck wrote that company-provided training programs, particularly formal ones, "have more of an impact on increasing those workers' earnings than does any other source of training."
Corporate training programs run the gamut from high-tech training centers to on-the-job instruction in word processing or operating machinery. Motorola, which makes cellular phones and paging gear, is often cited as a model for its variety of training programs. The company gives all employees at least 40 hours of training a year. Much of this training occurs at Motorola University, its $120-million, 14-branch teaching center. In 1985, Motorola discovered that its employees lacked even the basic skills the fast-growing manufacturer needed. …

Journal Article
TL;DR: Wilson proposes that unmarried, pregnant girls live in some type of supervised, privately run group home as a condition of receiving government benefits; this, he argues, would help provide the social structure that young single mothers and their children now lack, but desperately need if they are to escape the cycle of dependence.
Abstract: Sociologist James Q. Wilson proposes that unmarried, pregnant girls live in some type of supervised, privately run group home as a condition of receiving government benefits. This, he argues, would help provide the social structure that young single mothers and their children now lack, but desperately need if they are to escape the cycle of dependence. Kathleen Sylvester of the Progressive Policy Institute describes a version of this concept that was approved by Congress last fall as part of its welfare reform package. President Clinton vetoed the bill. James Q. Wilson: Beginning with Our Children. The American political regime as envisioned by its founders was not supposed to have anything to do with character. It was supposed to enable people living in villages and towns to compose their differences at the national level in order to secure a more perfect union and ensure domestic tranquillity and justice. It was assumed that it was in private associations (family, neighborhood, and peer groups) and small political institutions (village and town governments) that character could be formed, so the national government could take character for granted. "Men," Madison wrote, "are presumed to have sufficient virtue to constitute and maintain a free republic." Today, by contrast, we are properly concerned with the issue of character, placed on the national agenda by both political parties. Why is it that the assumptions that influenced the men who gathered in Philadelphia in 1787 are no longer the assumptions we bring to the contemplation of the true purpose of our national polity? One reason is that government has gotten bigger. As government gets bigger, it touches all aspects of our lives. As it touches all aspects of our lives, we increasingly put our concerns back on the government. The price of big government is an ever-expanding agenda. Forty years ago, when I began studying politics, it was inconceivable that the federal government would ever be held responsible for crime, drug abuse, illegitimacy, welfare, civil rights, clean air, or clean water. Today it is responsible for all those things. Not only have our aspirations and the size of the federal government changed, but so has our culture. In many spheres of our lives, we are no longer confident that local private institutions like families, churches, and neighborhoods are sufficient to form a culture that will sustain and enrich a free society. One of the areas in which this concern has become most sharply focused is illegitimacy. An illegitimacy crisis that affects both blacks and whites has been growing without letup since the early 1960s. It has shaped the profound debate now unfolding in Washington about the relationship between federal welfare programs and the very formation and maintenance of families. When Title IV, which created the program now known as Aid to Families with Dependent Children (AFDC), was enacted, it was hardly a controversial part of that landmark legislation. For years after its enactment, it caused scarcely a national political ripple. The reason was that it gave federal support to a few state programs designed to provide short-term compensation to women who had been widowed due to the First World War or coal-mining disasters, or who had been divorced or deserted by husbands. It nationalized these programs by extending them to the other states. No one expected it would become a way of life and no one ever expected teenagers would apply for it. All that has changed.
Today, the typical new AFDC enrollee spends only two or three years on welfare and then moves on. It acts as a stopgap measure to provide for the needs of women and children during a particularly difficult time in their lives. This is the group for whom AFDC was originally intended. But if you look at the total number of welfare recipients, as opposed to those who enter the rolls in any given year, you notice that the typical recipient has been on welfare for 10 years. …
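The gap between those two figures is a stock-versus-flow effect: most entrants leave quickly, but the long-term cases pile up on the rolls, so a snapshot of the caseload is dominated by long spells. The short simulation below is only a sketch of that arithmetic, using hypothetical spell lengths rather than actual AFDC data.

# Sketch of the stock-versus-flow point: the average entrant has a short
# spell, but the average current recipient belongs to a long spell, because
# long spells accumulate on the rolls. All numbers are hypothetical.
import random

random.seed(0)

def draw_spell_length():
    # Hypothetical mix: 70 percent of entrants stay 2 years, 30 percent stay 12.
    return random.choice([2] * 7 + [12] * 3)

entrants_per_year = 1000
years = 40
caseload = []  # list of (years_already_on, total_spell_length) for current cases

for _ in range(years):
    # Age every open case by one year and close the finished ones.
    caseload = [(t + 1, total) for (t, total) in caseload if t + 1 < total]
    # A new cohort enters the rolls.
    caseload += [(0, draw_spell_length()) for _ in range(entrants_per_year)]

avg_entrant_spell = 0.7 * 2 + 0.3 * 12  # expected spell length for a new enrollee
avg_current_spell = sum(total for _, total in caseload) / len(caseload)

print(f"average spell of a new entrant:       {avg_entrant_spell:.1f} years")
print(f"average spell of a current recipient: {avg_current_spell:.1f} years")

With these made-up numbers, the entrant average works out to about five years, while the point-in-time average comes out close to twice that -- the same pattern the article describes for AFDC.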

Journal Article
TL;DR: Esquith's story is compelling, not only because his students do well, but also because he proves that high expectations and hard work produce results as mentioned in this paper. But the problem of public education today is the low expectations it sets for poor children, especially blacks and Hispanics.
Abstract: In the great debate about affirmative action that is about to begin, opponents of racial quotas and preferences must be prepared to offer a better way of opening opportunity for racial groups that have historically been victims of discrimination. The answer is high academic standards for everyone, a solution that the Clinton administration is, unfortunately, resisting. The greatest tragedy of public education today is the low expectations it sets for poor children, especially blacks and Hispanics. The prevailing assumption among teachers and principals in all too many poor neighborhoods seems to be: "These kids can't learn, and even if they do, society is so racist that there's no opportunity for them anyway." Talk about a self-fulfilling prophecy. Why study hard if your teacher doesn't think you can learn? Racial quotas and preferences reinforce this defeatism by sending the message that blacks and Hispanics can succeed academically and economically only if they are held to a lower standard than Asians and whites. But the experience of countless schools and teachers shows that, when much is expected of them, black and Hispanic children can achieve at a very high level. Test scores for racial minorities can rise by quantum jumps without the ugly practice of "race-norming," where test scores for blacks and Hispanics are artificially raised to compensate for their racial status. Consider the track record of Rafe Esquith, a fifth- and sixth-grade teacher at Hobart Elementary School in Los Angeles, near the site of the 1992 riots. Most children at Hobart are poor immigrants from families that don't speak English; 80 percent are Hispanic, 16 percent Asian, 2 percent African-American. Three-quarters have single parents, and many come from alcoholic families. But by the end of the sixth grade, his students have finished a year of algebra and classical literature--including eight of Shakespeare's plays. His junior thespian crew, the "Hobart Shakespeareans," has performed with Britain's Royal Shakespeare Company. Their performance of Measure for Measure will open the World Shakespeare Congress in Los Angeles this June. How does Esquith bring out the best in his class? The same way outstanding teachers always do. He sets high standards, and gives his students the help they need to meet them. "I put my heart and soul into my teaching, and the kids know that. I demand the same from them." (They even eat lunch in the classroom.) By teaching his youngsters to solve difficult math equations and to read and perform Shakespeare early on, Esquith trains his kids to take on challenges that at first seem impossible. The result: Last year, in the mathematics portion of the Comprehensive Test of Basic Skills (a test administered to the Los Angeles Unified School District), his fifth-grade class achieved average scores in the 98th percentile, while the school itself scored in the 64th percentile, and the district in the 47th percentile. On the English test, they scored in the 92nd percentile, while the school overall scored in the 47th percentile, and the district in the 35th percentile. His sixth-grade class produced similar results--even though all of Esquith's students speak English as a second language, and, but for his devotion, most would be written off as "slow" or "troubled." Esquith's devotion to his students doesn't stop when they graduate from elementary school. On weekends he gathers his former students to prepare them for the Scholastic Aptitude Test (SAT) … to their respective schools.
Most of these students are the first in their family ever to attend college. Esquith's story is compelling, not only because his students do well, but because he proves that high expectations and hard work produce results. His students may be poor, but they will be able to compete in the marketplace on a level playing field because they are well educated, and Esquith--the recipient of the Walt Disney Outstanding Teacher of the Year award in 1992--can trump most private-school teachers any day. …

Journal Article
TL;DR: The 288 most dangerous convicts in Maryland are incarcerated in "Supermax," the Maryland Correctional Adjustment Center as discussed by the authors, which is a prisoners' prison, one in a growing class of "supermaximum security" facilities that take the worst of the worst--murderers and rapists who have continued their violent behavior while on the inside -- and lock them away in a redoubt of fortified walls and high-technology surveillance equipment.
Abstract: The 288 most dangerous convicts in Maryland are incarcerated in "Supermax," the Maryland Correctional Adjustment Center. Supermax is a prisoners' prison, one in a growing class of "supermaximum security" facilities that take the worst of the worst--murderers and rapists who have continued their violent behavior while on the inside -- and lock them away in a redoubt of fortified walls and high-technology surveillance equipment. Supermaxes exist to isolate "bad" prisoners from the general population, and in turn, these incorrigibles from each other. While some see supermaxes as civilized society's last line of defense against its most violent predators, others see them as cruel and unusual punishment. This latter group includes the ACLU's National Prison Project, the National Campaign to Stop Control Unit Prisons, and, most recently, the Clinton Justice Department. Deval Patrick, the assistant attorney general for civil rights, has threatened to sue Maryland for alleged violations of prisoners' civil rights at Supermax. Maryland officials vehemently deny the charges. The Clinton administration, they claim, is conducting a quiet campaign on behalf of prisoners that belies its tough-on-crime rhetoric. Even as President Clinton proposes a constitutional amendment to protect the rights of crime victims, these officials charge, his Justice Department is trying to expand the rights of prisoners by singling out Supermax. And although the president has signed legislation to liberate state prisons from activist federal judges and groundless prisoner lawsuits, his political appointees are attempting to micromanage the incarceration of the states' most dangerous convicts. Rico Marzano is one such convict. In a 1987 drug deal that went bad, Marzano shot and killed four people, including two pregnant women. He was convicted of first-degree murder and placed in a maximum security facility. After his second escape attempt, he was sent to Supermax. Maryland's highest security prison houses three kinds of inmates: "serious institutional rule violators" (typically inmates who have assaulted or killed guards or other inmates); serious escape risks, like Marzano; and prisoners awaiting death sentences, like Anthony Brandison, who paid a man $9,000 to kill two witnesses scheduled to testify against him in a 1983 federal drug case. In all, 105 murderers and 19 rapists spend their days in Supermax in what the corrections system calls "restricted confinement." Inmates remain alone 23 hours a day in their 65-square-foot concrete cells. They are allowed no physical contact with other prisoners or guards; meals are passed through a narrow slit -- a "beanhole" -- in solid metal cell doors. Out-of-cell time is spent alone, in a windowless prison dayroom. "When we were letting them rec [recreate] together they were killing each other, so we had to stop," said William Sondervan, Maryland's assistant commissioner for security operations. Despite such concern for the safety of inmates -- not to mention the prison staff -- the Clinton Justice Department insists that the isolation of inmates at Supermax is "the mental equivalent of putting an asthmatic in a place with little air to breathe." But the staff must maintain order among prisoners who regularly hurl "correctional cocktails" of feces, urine, and other bodily fluids and often have to be forcibly removed from their cells.
It is obviously frustrating, then, that prison officials must also fight a legal battle to protect what they see as a valuable -- and legal -- correctional tool. Each year, the 50 states spend $81 million defending themselves against prisoner lawsuits seeking redress for civil rights "violations" ranging from insufficiently stylish footwear to faulty television reception. This epidemic of prisoner litigation -- one-fourth of the civil cases filed in federal trial courts last year were initiated by prisoners -- is complemented by federal judges who impose "voluntary" consent decrees on states. …

Journal Article
TL;DR: In 1969, California governor Ronald Reagan signed the state's pioneering no-fault divorce law as mentioned in this paper, which was hailed as an overdue reform of a wink-wink, nudge-nudge system rife with hypocrisy and lurid accusations.
Abstract: On September 5, 1969, with a stroke of his pen, California governor Ronald Reagan wiped out the moral basis for marriage in America. Within five years, 44 states had followed California's lead in instituting some form of no-fault divorce reform. (Oklahoma and Maryland already had no-fault laws on the books, but the Golden State is credited with igniting a national wildfire.) Today, every state in the nation permits at least one of two no-fault mechanisms for dissolving a marriage: (1) one or both spouses can sue for divorce on grounds such as "incompatibility" or "irreconcilable differences"; or (2) the spouses can dissolve the marriage simply by living apart and waiting a certain period of time. No longer must a spouse prove cruelty, desertion, or adultery, the traditional grounds for divorce. At the time, such legislation seemed humane and enlightened. It was hailed as an overdue reform of a wink-wink, nudge-nudge system rife with hypocrisy and lurid accusations. Under the fault-based system, the suing partner had to prove the fault of the other and show himself or herself to be blameless; otherwise their respective culpabilities canceled each other's claims. Because the assessment of guilt determined the division of assets between spouses, the stakes were high. Even when both partners desired the divorce, they were often reduced to perjury and collusion, sometimes staging "adulterous liaisons" to be captured in grainy photographs by lurking private eyes. And the definition of cruelty in a marriage was stretched to ludicrous lengths. Although attorneys and women's-rights advocates argued vigorously for reform, most Americans disagreed. In 1968, just before the wave of no-fault reforms, less than 20 percent of the general public wanted to make divorce easier, and nearly two-thirds wanted to make divorce laws even stricter. It turns out that the public's instincts were right. With the advent of widespread no-fault divorce, the United States saw a dramatic surge in the number of divorces granted each year. In 1960, 16 percent of first marriages ended in divorce; today, the figure is closer to 50 percent. In the five years following the enactment of no-fault in California, the national divorce rate increased almost 40 percent. The causal connection between the unravelling of divorce laws and the unravelling of marriages is admittedly debatable. But it seems clear that the states' drive to make divorce more humane inadvertently denuded marriage of any meaning. In her forthcoming book, The Abolition of Marriage: How We Destroy Lasting Love, Maggie Gallagher, a scholar with the Institute for American Values, writes, "From a formal, legal standpoint, marriage is no longer an enforceable commitment." As Wagner puts it in a paper for the Family Research Council, "marriage becomes nothing more than notarized dating." Last year in National Review, Robert Plunkett, the vice dean of the Southern California Institute of Law, eloquently described the legal case against no-fault divorce: "The wedding vow has devolved from being the most serious and solemn oath a typical person ever made into being less than a contract. An oral contract made with a two-year-old is more binding than the contract of marriage; it at least binds one party, the adult. A marriage contract is binding on no one." Equally clear is the harm to innocent parties under the no-fault regime: They have been stripped of their legal protection. Even when one spouse desperately wants to keep the marriage together, he or she has no legal recourse, and may have to shell out thousands of dollars in legal fees for an unwanted divorce.
All the power belongs to the initiator. In addition to its unintended consequences, no-fault divorce hasn't necessarily fulfilled its goal of attenuating the acrimony of divorce proceedings; the conflict has merely shifted from the allocation of blame to battles over child custody and the division of property. It still gets ugly when families break apart. In divorce, the scarlet letter no longer stands for adultery--it now can refer to malicious accusations of child sexual abuse. …

Journal Article
TL;DR: Blankenhorn argues that being a father is "society's most important role for men." While mothers typically concentrate on meeting children's immediate physical and emotional needs, fathers tend to focus on preparation for the future and on children's success in the larger society.
Abstract: Are You a Sperm Dad? Perhaps the most important book written this decade on our greatest social crisis is Fatherless America: Confronting Our Most Urgent Social Problem, by New York intellectual David Blankenhorn. The president of the Institute for American Values, Blankenhorn dissects the insidious and prevalent bias against the value of fatherhood in both elite and popular culture. In this impassioned yet carefully reasoned and researched book, he argues that being a father is "society's most important role for men." While mothers typically concentrate on meeting children's immediate physical and emotional needs, Blankenhorn writes, fathers tend to "focus on preparation for the future and on children's success in the larger society. Fathers are likely to devote special attention to character traits necessary for the future, especially qualities such as independence, self-reliance, and the willingness to test limits and take risks." Moreover, Blankenhorn writes, a proper view of fatherhood is essential to the concept of masculinity. "Fatherhood cannot destroy or oppose masculinity, but fatherhood must domesticate masculinity. In a good society, men prove their manhood by being good fathers." But many journalists, academics, and purveyors of popular culture are writing a new "cultural script"--new definitions of what it means to be a man--that fails to serve the needs of children, families, or society. Here are some of the inadequate roles available to today's dads: The Unnecessary Father. "He may be useful in some ways. He may be a nice guy, perhaps even a force for good. But he is nonessential, peripheral, 'not that important.' His presence may be appreciated, but it is not required." The Visiting Father. This poor chump writes the checks and strives valiantly to stay in touch with his children after a divorce. Although often romanticized, as in the popular movie Mrs. Doubtfire, "the great majority of visiting fathers are not--indeed, cannot be--good-enough fathers to their children." The Sperm Father. He accounts for nearly 30 percent of all fathers of young children. "His is the fatherhood of the one-night stand, the favor for a friend, the donation or sale of sperm. His child is the unintended consequence, the result of the affair that did not work out, the reason for the paternity suit, the baby he never learned about." The Stepfather and the Nearby Guy. Rosy assumptions to the contrary, "children who live with stepfathers experience outcomes that are no better, and frequently worse, than children in mother-only homes." And the nearby guy (the mother's boyfriend) provides not fathering but often "ambiguity, complexity, and frequent change." The New Father. "He expresses his emotions. He is a healer, a companion, a colleague. He is a deeply involved parent. He changes diapers, gets up at 2 a.m. to feed the baby, goes beyond 'helping out' in order to share equally in the work, joys, and responsibilities of domestic life." Blankenhorn doesn't fault dads who change diapers, but he insists that this is not their most important contribution. In contrast to this cast of losers, Blankenhorn holds up the Good Family Man. He "puts his family first. He is responsible for them. He sacrifices for them" to help prepare them for life in the real world. Sure, this guy also helps his wife care for the children and pitches in with household duties, but he views his role as complementary, not identical, to that of his spouse. This distinction is vital to the masculinity of men and to the future of fatherhood.
That's why, Blankenhorn says, it is crucial that society provide men with the right cultural script, because successful fathering depends more on societal codes of conduct than on biology. If we get those codes wrong, we'll see "the continuing decline of fatherhood and a deepening ambivalence and skepticism toward masculinity." If we get them right, fatherhood, "more than any other male activity, helps men to become good men." Your Man: Frog or Prince? …

Journal Article
TL;DR: Clara Barton, as mentioned in this paper, found that ambulances, medical supplies, and hospital construction were not high priorities for the Union military bureaucracy, yet the Medical Department, valuing bureaucratic ritual over need, refused to change policy.
Abstract: Clara Barton struggled with bureaucratic insensitivity all her life. Her biggest problem was getting official permission to do good. Born Christmas Day, 1821, Barton showed a talent for organizing charity when she established the first free public school in Bordentown, New Jersey. After enrollments soared, the local school board pushed her aside. When the Civil War erupted, she felt driven to help. After learning that soldiers from her home state of Massachusetts were quartered in the U.S. Senate chambers without beds or supplies, she brought items from her home and, with her own money, purchased and prepared food for them. Barton discovered that ambulances, medical supplies, and hospital construction were not high priorities for the Union military bureaucracy. Henry Halleck, the Army's general-in-chief, dismissed them as "effeminating comforts." Wounded soldiers commonly died on the battlefield waiting for treatment, yet the Medical Department, valuing bureaucratic ritual over need, refused to change policy. Frustrated, Barton collected supplies and personally arranged their distribution. Barton eventually accumulated three warehouses full of supplies and persuaded the Army to help distribute them at the front. At great personal risk -- a bullet ripped through her sleeve at Antietam and killed the injured soldier she was aiding--Barton fed troops and helped evacuate the wounded. Nearly two years into the war, the Army finally agreed to contribute supplies to her volunteer efforts. She devised a plan for each state to establish a distribution agency where it had a substantial regimental presence. Barton encountered bureaucratic resistance everywhere. When the War Department ignored a sergeant's requests for a furlough--despite the fact that both of his arms had been blown off in battle--an outraged Barton escorted him to the office of Massachusetts senator Henry Wilson. She then explained that his furlough request was being disregarded and handed the senator the necessary paperwork. When Wilson extended his hand in welcome, Barton commented, "You will pardon the sergeant for not offering you a hand--he has none." The soldier left on furlough the following day. Exhausted by such struggles, Barton sought rest in Europe. In 1870, the Grand Duchess of Baden enlisted her help with the International Red Cross's relief effort for civilian refugees and wounded soldiers in the Franco-Prussian War. Admiring the systematic organization and efficiency of the institution, Barton undertook a one-woman effort to found an American chapter of the Red Cross. She sought congressional approval for the organization's involvement in wartime humanitarian aid, but was initially thwarted by fears of foreign entanglements. …

Journal Article
TL;DR: The importance of the criminal jury has been emphasized by the authors of the Fifth, Sixth, and Seventh Amendments to the United States Constitution as mentioned in this paper, and the need for juries was especially acute in criminal cases: a grand jury could block any prosecution it deemed unfounded or malicious, and a petit jury could interpose itself on behalf of a defendant charged unfairly.
Abstract: The Founders of our nation understood that no idea was more central to our Bill of Rights--indeed, to government of the people, by the people, and for the people--than the citizen jury. It was cherished not only as a bulwark against tyranny but also as an essential means of educating Americans in the habits and duties of citizenship. By enacting the Fifth, Sixth, and Seventh Amendments to the Constitution, the Framers sought to install the right to trial by jury as a cornerstone of a free society. Today that cornerstone is crumbling. In recent years, a parade of notorious criminal trials has called into question the value of citizen juries. The prosecutions of Oliver North, O.J. Simpson, William Kennedy Smith, the Menendez brothers, and the assailants of Rodney King and Reginald Denny have made armchair jurors of millions of Americans. Now the failings of the system seem obvious to anyone with a television: In search of "impartial" jurors, the selection process seems stacked against the educated, the perceptive, and the well informed in favor of those more easily manipulated by lawyers and judges. Attorneys exercising their rights to strike candidates from the pool cynically and slyly seek to exclude jurors on the basis of race, gender, and other supposed indicators of bias. Courts subject citizens to repeated summonses, intrusive personal questioning, and long and inefficient trials. Unsurprisingly, many citizens avoid jury duty. In court, jurors serve a passive role dictated by rules that presume jurors are incapable of impartial deliberation and that provide little help in understanding points of law or evaluating testimony. The public perceives that the scales of justice tip in favor of rich defendants with high-priced counsel. More than a million Americans serve as jurors on state courts each year. Jury service offers these Americans an unequaled opportunity to participate democratically in the administration of justice. But on its present course, this vital egalitarian institution may shrivel up, avoided by citizens, manipulated by lawyers and litigants, and ridiculed by the general public. To be sure, the system has inherent limitations; "correct" verdicts cannot be guaranteed. But given the jury's present form, society is bearing the costs of a jury system's vices without enjoying a jury system's virtues. Our task is to demonstrate why the citizen jury is worth defending, and to propose a number of specific reforms designed to restore the jury to its rightful status in a democracy under law. A Cornerstone of Democracy The Framers of the Constitution felt that juries--because they were composed of ordinary citizens and because they owed no financial allegiance to the government--were indispensable to thwarting the excesses of powerful and overzealous government officials. The jury trial was the only right explicitly included in each of the state constitutions penned between 1776 and 1789. And the criminal jury was one of few rights explicitly mentioned in the original federal constitution proposed by the Philadelphia Convention. Anti-federalists complained that the proposed constitution did not go far enough in protecting juries, and federalists eventually responded by enacting three constitutional amendments guaranteeing grand, petit, and civil juries. 
The need for juries was especially acute in criminal cases: A grand jury could block any prosecution it deemed unfounded or malicious, and a petit jury could likewise interpose itself on behalf of a defendant charged unfairly. The famous Zenger case in the 1730s dramatized the libertarian advantages of juries. When New York's royal government sought to stifle its newspaper critics through criminal prosecution, New York grand juries refused to indict, and a petit jury famously refused to convict. But the Founders' vision of the jury went far beyond merely protecting defendants. The jury's democratic role was intertwined with other ideas enshrined in the Bill of Rights, including free speech and citizen militias. …

Journal Article
TL;DR: The Bethlehem Area School District, as discussed by the authors, taught its Spanish-speaking students in their native language before moving them on to regular classes taught in English--effectively an English-second policy--and the results showed that it was not working.
Abstract: School superintendent Thomas J. Doluisio was puzzled. His Bethlehem, Pennsylvania, district had an elaborate program of Spanish-language classes for its large population of Spanish-speaking children. Proponents of bilingual education said this would help Hispanic children adjust when they moved on to English-only classes--which they were supposed to do after three years. But it wasn't working. Hispanic students lagged behind their peers in test scores, reading levels, and graduation rates. "Our college-track courses were lily-white," Doluisio says … "of the difference." What went wrong? Doluisio found out in a 1992 meeting with his district's elementary-school principals. The short answer: seven years. That's how long it was taking a typical student in the bilingual program to move into regular classes taught in English. Bethlehem had effectively established an English-second policy, thanks to educators who considered native-language training of primary importance. "I was flabbergasted," Doluisio says. More than that, he was angry. And then he got busy. Within a year, Doluisio led a stunning transformation of Bethlehem's language policy. His district became one of a handful in the country to reverse course on bilingual education. Today, Bethlehem's Spanish-speaking students are immersed in English-speaking classrooms, where they hear almost nothing but English. The school district switched policies only after a bitter struggle had divided the community along racial and ethnic lines. But thanks to Doluisio's leadership, the benefits of English immersion are starting to show, and the naysayers are starting to change their minds. Today, Bethlehem provides a stirring example of how other school districts can challenge the bilingual-education orthodoxy--and win. Since 1968, the federal government has spent nearly $4 billion on bilingual education. In 1995 alone, it spent $206 million. (President Clinton wanted to increase the annual appropriation to $300 million, but was halted by the budget crunch and House Republicans.) But federal money has been less important than federal power to the consolidation of bilingual education. During the 1970s and 1980s, bureaucrats in the U.S. Department of Education coerced hundreds of school districts around the country into adopting native-language instruction for their non-English-speaking students. They based their tactics on a 1974 Supreme Court decision that established a constitutional right for non-English-speaking children to receive some sort of special language assistance. The Court did not prescribe a particular pedagogical approach, but its federal enforcers did. Through a confusing array of regulations, court orders, and consent decrees, they insisted that school districts provide a curriculum with a heavy emphasis on the students' native tongues. In this political climate, Bethlehem's program was born. The Bethlehem Area School District, serving 13,000 children, is Pennsylvania's fifth-largest. About 10 percent of its students cannot speak English well, and of these, 86 percent speak Spanish in their homes. Most of these children are Puerto Rican, but immigrants from Central and South America make up a growing part of the Spanish-speaking population. Before the 1993-94 school year, Bethlehem was fully committed to bilingual education and its goal of teaching students in their native language before they moved into regular classrooms. English-speakers and immigrant children who didn't speak Spanish attended their neighborhood schools.
But the school district essentially segregated its Spanish-speaking students, busing them to two elementary schools. There was little time for English at these segregated schools. Spanish was the language of the classroom, the lunchroom, and the playground. Most bilingual educators say that native-language instruction is the surest road to English fluency, since it makes for an easier transition. …

Journal Article
TL;DR: Hoffman's Course of Legal Study as discussed by the authors grounded the discipline on clear, objective standards: "[W]e assume it as undeniable that pure Ethics and Natural Law lie at the very foundation of all laws." He believed that Jacksonian democracy was weakening the barriers to excessive populism and feared it would end in "mob rule," a situation incompatible with both legal order and objective truth.
Abstract: To early generations of Americans, republicanism conveyed two concepts of citizenship: "rights," which limited government, and "responsibilities," which constituted civic virtue. If the former is divorced from the latter, the law becomes a collection of morally neutral legalisms and lawyers mere technicians of the law. The disrepute of the legal profession today comes largely from this very separation. One lawyer who spent his life trying to maintain the proper connection between rights and responsibilities within the legal profession was David Hoffman. Hoffman was born in 1784, the 11th of 12 children of Dorothea and Peter Hoffman, a prosperous Baltimore merchant. He attended St. John's College in Annapolis for three years before returning to Baltimore, where he apprenticed in law for another three years. Hoffman was the only son who declined to enter the family business, but Hoffman & Sons Dry Goods sometimes needed legal help, so his choice of profession suited the family. By 1816, his practice in bustling Baltimore (then the nation's third largest city) was bringing in $9,000 a year, a lucrative sum at the time. Inclined toward scholarship, Hoffman was unhappy with commercial practice, and in 1814 he accepted a part-time position as the only professor of law at the University of Maryland. In the early 19th century, when law courses were available only at a few law schools and colleges, one usually entered the legal profession through an apprenticeship with a practicing lawyer. Yet Hoffman was convinced that jurists of the new republic needed to study history and philosophy to learn the timeless principles that supplemented the laws and made legal precedents intelligible. He also feared that his generation of legal practitioners was becoming too detached from the philosophical underpinnings of the Founding to appreciate the vision of law and lawyering required in republican America. Hoffman set aside a growing share of his professional time to develop a curriculum for the Maryland Law Institute. His Course of Legal Study, published in 1817, was widely circulated and admired by America's legal educators. Even more than his insistence on substantive legal education, however, Hoffman was concerned about the condition of ethical standards in the legal profession. In his view, legal questions ultimately resolved themselves into moral questions, and so his Course grounded the discipline on clear, objective standards: "[W]e assume it as undeniable that pure Ethics and Natural Law lie at the very foundation of all laws." He believed that Jacksonian democracy was weakening the barriers to excessive populism, and feared it would end in "mob rule," a situation incompatible with both legal order and objective truth. …

Journal Article
TL;DR: The V-chip has quickly become the most celebrated piece of computer circuitry in America as discussed by the authors, and it has become a symbol for the public's anger at the entertainment industry for degrading our culture and our society.
Abstract: Over the past few months, the V-chip has quickly become the most celebrated piece of computer circuitry in America. In swift succession, President Clinton championed this little byte of technology in his State of the Union address, Congress passed legislation mandating its use, and the major networks grumbled loudly about challenging the law in court. The drama finally culminated in February at a summit at the White House, where the TV industry's chieftains grudgingly accepted the president's challenge to do more for America's parents and create a ratings system compatible with the V-chip. The story of the V-chip unfolded so fast, and its potential impact is so great, that the media has spent most of its time struggling to answer a host of basic questions: How does this signal-blocking technology work? When will it be available? How much will it cost? Will it live up to its billing? Some are still not even sure what the "V" actually stands for. (It originally stood for "violence.") As a Senate cosponsor of the V-chip bill along with Democrat Kent Conrad, I know these details matter, but I also believe the media's focus on them has obscured a larger point. Far more important than what the "V" stands for is what the coming of the V-chip tells us about the public's plummeting regard for the product that television delivers to our homes. Although this invention may merely be an irritant to those in the television business, to millions of Americans the V-chip is a surrogate for their anger at the entertainment industry for degrading our culture and our society. That anger is clearly reflected in any number of public opinion polls, which uniformly show that the public is fed up with the rising tide of sex, violence, and vulgarity in the entertainment media. These surveys are useful, but based on my conversations with people in diners, schools, and small businesses back in Connecticut, I believe they barely begin to measure the public's intense feelings toward television. My experience tells me that beneath the surface of the Telecommunications Revolution bubbles a revolution of another kind--a "Revolt of the Revolted," as William Bennett and I have taken to calling it. It is being fueled by a growing sense that our culture is not only out of touch with the values of mainstream America, but out of control as well. Many people believe that there are no standards that television will not violate, no lines television will not cross. Broadcasters may see the V-chip as a threat to their independence and financial well-being, but many average citizens see television as a threat to their children and their country. In the V-chip, they perceive a modicum of protection for their families. Why are people afraid of television? Much of the news media has focused on the violence, but that is only part of the problem. Millions of Americans are fed up with explicit sex scenes and crude language during prime time and with the pornographic content of those abysmal talk shows and soap operas during the day. They feel television is not only offensive, but on the offensive, assaulting the values they and most of their neighbors share. People are angry because they cannot sit down to watch TV with their children without fearing they will be embarrassed or demeaned. And they are angry because they feel our culture has been hijacked and replaced with something alien to their lives, something that openly rejects rather than reflects the values they try to instill in their families.
In the world they see on TV, sex is a recreational pastime, indecency is a cause for laughter, and humans are killed as casually and senselessly as bugs. It is a coarse caricature of the America they love. David Levy, the executive producer of the Caucus for Producers, Writers, and Directors, aptly describes this situation as "television without representation." Critics tell me that, in the zealous pursuit of the prized demographic cohort of young adults, the industry has shut out the rest of the public, and let the tastes of a few dictate the menu for all. …
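In outline, the technology works by pairing a rating carried with each program against a ceiling the parent sets on the television set: anything rated above the ceiling is blocked. The sketch below illustrates only that comparison logic; the rating categories and field names are made up for the example and are not the actual broadcast encoding or the ratings system the industry was negotiating.

# Illustrative sketch of the V-chip's blocking decision. The rating scale
# here is hypothetical, not the real broadcast standard.
from dataclasses import dataclass

# Hypothetical ordering from least to most restricted content.
RATING_ORDER = ["ALL-AGES", "PARENTAL-GUIDANCE", "TEEN", "MATURE"]

@dataclass
class Program:
    title: str
    rating: str  # one of RATING_ORDER, assumed to arrive with the signal

def is_blocked(program: Program, parental_ceiling: str) -> bool:
    # Block the program if its rating is stricter than the parent's ceiling.
    return RATING_ORDER.index(program.rating) > RATING_ORDER.index(parental_ceiling)

ceiling = "TEEN"  # set once by the parent
for show in [Program("Evening News", "ALL-AGES"), Program("Late Thriller", "MATURE")]:
    print(show.title, "->", "blocked" if is_blocked(show, ceiling) else "allowed")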


Journal Article
TL;DR: It does not take a legal scholar to recognize that transferring authority to the federal government undermines the whole constitutional structure designed by our Founders. Few realize that this shift is also destroying local self-government in towns and cities all across America as mentioned in this paper.
Abstract: It does not take a legal scholar to recognize that transferring authority to the federal government undermines the whole constitutional structure designed by our Founders. Few realize that this shift is also destroying local self-government in towns and cities all across America. Ours is a national government whose only powers are those delegated to it by the people. These powers are enumerated and therefore limited by the Constitution, the Tenth Amendment of which specifies that "the powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the People." Jefferson called the doctrine of reserved powers "the foundation of the Constitution" and said that "to take a single step beyond the boundaries thus specially drawn is to take possession of a boundless field of power, no longer susceptible of any definition." The resulting federalist system meant that most political activity in America would occur at the local and state level, rather than at the federal level. Today, of course, the opposite is true. Over the course of the 20th century, the federal government has vastly grown in size and power. This expansion occurred largely because Congress, with the acquiescence of the Supreme Court, has long acted beyond its authority and outside the limited powers granted to it. The Tenth Amendment has become, in the eyes of the Supreme Court at least, "but a truism." But my complaints are more practical than theoretical. Never before have local authorities exercised less control over local life. Federal judges run our jails, bus our children, manage our schools, determine the rights of municipal employees, detail the conditions under which art can be displayed on city property, and the list goes on. The weakening of the Tenth Amendment directly and unduly circumscribes my authority, as the mayor of a large American city, to perform my legitimate duties. Several recent statutes, in my opinion, are in clear violation of the Tenth Amendment. Consider the Americans with Disabilities Act. Although it was passed with good intentions, the ADA mandates compliance with some of the most minutely detailed specifications ever issued by Congress. Under this statute, we are now negotiating with the Justice Department the precise angle of slope on sidewalk ramps we want to construct for the wheelchair-bound. If we don't even control our sidewalks anymore, what exactly is left of local authority?
What Congress views as "good deeds" often undermines both the principles of federalism and the value inherent in a government that is close to the voter. By mandating another set of federal laws, the much-heralded Motor Voter Act denies states and local authorities discretion in how best to register their own voters. And it all but orders the use of local tax dollars to achieve this federal objective. Perhaps the most pestiferous offender is the federal judiciary itself. A recent U.S. Supreme Court ruling had the effect of extending the provisions of the Fair Labor Standards Act to state and local governments. This decision raised eyebrows among even the most liberal interpreters of constitutional law. FLSA rates do little more than "protect" our cities' strongest unions by mandating generous overtime rates and work weeks. (I was recently sued by a police captain I had prosecuted for theft back when I was the DA; he argued that his punishment entitled him to overtime pay.) The practical effect of this ruling is that nonfederal units of government must negotiate employee compensation and overtime rates in the fashion that Congress demands--however minute the level of detail. American tax policy is being increasingly decided by unelected federal judges, rather than by elected local representatives. By delving even deeper into state and local matters, federal courts are increasingly mandating nonfederal tax hikes by requiring things like new prison construction or huge property-tax increases to pay for federal social experiments like forced busing. …

Journal Article
TL;DR: Low-cost computer devices that aid disabled users include screen-magnification software, head-pointer or Morse-code systems that allow severely disabled people to issue computer commands, and another product that helps quadriplegics accomplish tasks, called EyeGaze.
Abstract: While cleaning up the hard disk on my computer recently, I came across an accessory that I had never noticed before called "Text to Speech." I discovered that my relatively low-end computer could convert standard text files into speech. It can read aloud whatever I wish, from a brief newspaper column to a massive document. The voice sounds very much like the one you hear throughout A Brief History of Time, a movie based on the book by scientist Stephen Hawking. Struck down by a debilitating illness that robbed him of speech and most of his mobility, Hawking has nevertheless become the world's most celebrated physicist and cosmologist. His extraordinary accomplishments were made possible by technologies that allow him to operate a computer -- and even, through it, to speak to movie audiences. What's amazing to me is not that such technologies exist, but that they are so commonplace as to be installed in a run-of-the-mill personal computer like mine. In fact, inexpensive programs and devices have dramatically expanded the economic opportunities of disabled Americans, not only improving their own lives but allowing all of us to benefit from previously untapped talents and resources. Mark Beckwith of Greenville, Maine, is a quadriplegic. But he is also a draftsman for the Log Home Co., thanks to voice-recognition software from Dragon Systems. For less than a thousand dollars, you can purchase DragonDictate and run sophisticated computer programs by voice. The system converts each word it hears into text by piecing together the word's components and then comparing the combination with acoustic models in its built-in vocabulary list. By speaking key commands such as "Shift-Alt-F4," users can operate almost any application, even the drafting programs that Beckwith uses to design house-building plans. Other low-cost computer devices that aid disabled users include screen-magnification software ($200 to $600) and head-pointer or Morse-code systems that allow severely disabled people to issue computer commands ($500 to $1,400). Toward the opposite end of the cost scale is another product that helps quadriplegics accomplish tasks, called EyeGaze. Developed by LC Technologies in Fairfax, Virginia, it allows users to turn on appliances, run machinery, type letters, and play video games with eye movements. The device uses beams of light to translate movements into computer commands. Even at a price of $25,000, EyeGaze offers many seriously disabled people opportunities they never thought they had. The market for products to enable physically handicapped workers--ranging from speech synthesis and voice-input systems to robotics, deaf relays, Braille text displays, text telephones, monitoring devices, and environmental controls--includes such major players as IBM, AT&T, Xerox, Nynex, Digital Equipment, and Apple. Thought of more broadly, products ranging from pharmaceuticals to ergonomic office furniture are also part of this effort to empower the disabled. Prozac and other antidepressants have helped many workers keep their jobs and improve their performance. Pain relievers allow employees with arthritis and similar ailments to perform work without enduring excruciating pain. Redesigned desks, chairs, and other equipment have done the same for those with chronic back problems, who make up about 15 percent of all people considered "disabled." Many observers who have taken notice of American businesses' interest in this market have attributed it to the Americans with Disabilities Act (ADA), which was fully implemented in 1994.
The ADA, among other things, requires businesses with 15 or more employees to give "equal access" to all disabled workers and consumers. Portrayed as a civil-rights measure no less important than the 1960s legislation against racial discrimination, the ADA may have attracted the attention of some firms that otherwise wouldn't have entered the market for disabled-enabling products. But its overall effect on the employability of the disabled is a great deal more complicated. …
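To give a concrete sense of how simple the text-to-speech side of this has become, here is a minimal sketch of the "read a text file aloud" idea in Python using the pyttsx3 library; the library choice and the file name are my own assumptions for illustration, not the accessory or the commercial products described above.

# Minimal sketch: read a plain-text file aloud with an off-the-shelf
# speech synthesizer (pip install pyttsx3). Illustrative only.
import pyttsx3

def read_aloud(path: str) -> None:
    # Load the document to be spoken.
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()

    engine = pyttsx3.init()          # uses whatever voices the platform provides
    engine.setProperty("rate", 160)  # speaking speed in words per minute
    engine.say(text)
    engine.runAndWait()              # blocks until the speech finishes

if __name__ == "__main__":
    read_aloud("column.txt")  # hypothetical file name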