
Showing papers by "Harvard University published in 1996"


Posted Content
TL;DR: This paper examined legal rules covering protection of corporate shareholders and creditors, the origin of these rules, and the quality of their enforcement in 49 countries and found that common law countries generally have the best, and French civil law countries the worst, legal protections of investors.
Abstract: This paper examines legal rules covering protection of corporate shareholders and creditors, the origin of these rules, and the quality of their enforcement in 49 countries. The results show that common law countries generally have the best, and French civil law countries the worst, legal protections of investors, with German and Scandinavian civil law countries located in the middle. We also find that concentration of ownership of shares in the largest public companies is negatively related to investor protections, consistent with the hypothesis that small, diversified shareholders are unlikely to be important in countries that fail to protect their rights.

14,563 citations


Posted Content
TL;DR: This paper surveys research on corporate governance, with special attention to the importance of legal protection of investors and of ownership concentration in corporate governance systems around the world.
Abstract: This paper surveys research on corporate governance, with special attention to the importance of legal protection of investors and of ownership concentration in corporate governance systems around the world.

13,489 citations


Journal ArticleDOI
TL;DR: It is demonstrated that the benefit of cholesterol-lowering therapy extends to the majority of patients with coronary disease who have average cholesterol levels, and that the benefit was greater in patients with higher pretreatment levels of LDL cholesterol.
Abstract: Background In patients with high cholesterol levels, lowering the cholesterol level reduces the risk of coronary events, but the effect of lowering cholesterol levels in the majority of patients with coronary disease, who have average levels, is less clear. Methods In a double-blind trial lasting five years, we administered either 40 mg of pravastatin per day or placebo to 4159 patients (3583 men and 576 women) with myocardial infarction who had plasma total cholesterol levels below 240 mg per deciliter (mean, 209) and low-density lipoprotein (LDL) cholesterol levels of 115 to 174 mg per deciliter (mean, 139). The primary end point was a fatal coronary event or a nonfatal myocardial infarction. Results The frequency of the primary end point was 10.2 percent in the pravastatin group and 13.2 percent in the placebo group, an absolute difference of 3 percentage points and a 24 percent reduction in risk (95 percent confidence interval, 9 to 36 percent; P = 0.003). Coronary bypass surgery was needed in 7.5 per...
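The relation between the 3-percentage-point absolute difference and the 24 percent relative reduction can be checked with simple arithmetic (a hypothetical helper, not from the paper; the trial's 24 percent figure comes from its survival analysis, so the raw event-rate ratio is slightly lower):

```python
def risk_reduction(p_treated, p_control):
    """Absolute and relative risk reduction from two raw event rates."""
    arr = p_control - p_treated   # absolute risk reduction
    rrr = arr / p_control         # relative risk reduction
    return arr, rrr

# Event rates reported in the trial: 10.2% pravastatin vs. 13.2% placebo.
arr, rrr = risk_reduction(0.102, 0.132)
print(f"{arr:.3f} absolute, {rrr:.1%} relative")
```

The raw ratio works out to roughly 23 percent; the reported 24 percent (95 percent CI, 9 to 36) is the model-adjusted estimate.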

7,272 citations


Journal ArticleDOI
09 Aug 1996-Cell
TL;DR: The work from the authors' laboratories reviewed herein was supported by grants from the National Cancer Institute.

6,895 citations


Journal ArticleDOI
TL;DR: The term "new institutionalism" now appears with growing frequency in political science, but there is considerable confusion about just what the new institutionalism is, how it differs from other approaches, and what sort of promise or problems it displays.
Abstract: The ‘new institutionalism’ is a term that now appears with growing frequency in political science. However, there is considerable confusion about just what the ‘new institutionalism’ is, how it differs from other approaches, and what sort of promise or problems it displays. The object of this essay is to provide some preliminary answers to these questions by reviewing recent work in a burgeoning literature. Some of the ambiguities surrounding the new institutionalism can be dispelled if we recognize that it does not constitute a unified body of thought. Instead, at least three different analytical approaches, each of which calls itself a ‘new institutionalism’, have appeared over the past fifteen years. We label these three schools of thought: historical institutionalism, rational choice institutionalism, and sociological institutionalism. All of these approaches developed in reaction to the behavioural perspectives that were influential during the 1960s and 1970s and all seek to elucidate the role that institutions play in the determination of social and political outcomes. However, they paint quite different pictures of the political world. In the sections that follow, we provide a brief account of the genesis of each school and characterize what is distinctive about its approach to social and political problems. We then compare their analytical strengths and weaknesses. * An earlier version of this paper was presented at the 1994 Annual Meeting of the American Political Science Association and at a Conference on ‘What is Institutionalism Now?’ at the

5,455 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe the development and validation of a new instrument, KEYS: Assessing the Climate for Creativity, designed to assess perceived stimulants and obstacles to creativity in organizational work environments.
Abstract: We describe the development and validation of a new instrument, KEYS: Assessing the Climate for Creativity, designed to assess perceived stimulants and obstacles to creativity in organizational work environments. The KEYS scales have acceptable factor structures, internal consistencies, test-retest reliabilities, and preliminary convergent and discriminant validity. A construct validity study shows that perceived work environments, as assessed by the KEYS scales, discriminate between high-creativity projects and low-creativity projects; certain scales discriminate more strongly and consistently than others. We discuss the utility of this tool for research and practice.

5,240 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the extent to which the earnings manipulations can be explained by earnings management hypotheses and the relation between earnings manipulation and weaknesses in firms' internal governance structures, and the capital market consequences experienced by firms when the alleged earnings manipulation are made public.
Abstract: This study investigates firms subject to accounting enforcement actions by the Securities and Exchange Commission for alleged violations of Generally Accepted Accounting Principles. We investigate: (i) the extent to which the alleged earnings manipulations can be explained by extant earnings management hypotheses; (ii) the relation between earnings manipulations and weaknesses in firms' internal governance structures; and (iii) the capital market consequences experienced by firms when the alleged earnings manipulations are made public. We find that an important motivation for earnings manipulation is the desire to attract external financing at low cost. We show that this motivation remains significant after controlling for contracting motives proposed in the academic literature. We also find that firms manipulating earnings are: (i) more likely to have boards of directors dominated by management; (ii) more likely to have a Chief Executive Officer who simultaneously serves as Chairman of the Board; (iii) more likely to have a Chief Executive Officer who is also the firm's founder; (iv) less likely to have an audit committee; and (v) less likely to have an outside blockholder. Finally, we document that firms manipulating earnings experience significant increases in their costs of capital when the manipulations are made public.

4,081 citations


Journal ArticleDOI
TL;DR: It is shown that the instrumental variables (IV) estimand can be embedded within the Rubin Causal Model (RCM) and that under some simple and easily interpretable assumptions, the IV estimand is the average causal effect for a subgroup of units, the compliers.
Abstract: We outline a framework for causal inference in settings where assignment to a binary treatment is ignorable, but compliance with the assignment is not perfect so that the receipt of treatment is nonignorable. To address the problems associated with comparing subjects by the ignorable assignment—an “intention-to-treat analysis”—we make use of instrumental variables, which have long been used by economists in the context of regression models with constant treatment effects. We show that the instrumental variables (IV) estimand can be embedded within the Rubin Causal Model (RCM) and that under some simple and easily interpretable assumptions, the IV estimand is the average causal effect for a subgroup of units, the compliers. Without these assumptions, the IV estimand is simply the ratio of intention-to-treat causal estimands with no interpretation as an average causal effect. The advantages of embedding the IV approach in the RCM are that it clarifies the nature of critical assumptions needed for a...
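The central identity in this framework can be sketched numerically. The helper below is a hypothetical illustration, not the authors' code: the IV estimand is the ratio of two intention-to-treat contrasts, which under the paper's assumptions equals the average causal effect for compliers.

```python
def wald_estimator(z, d, y):
    """IV (Wald) estimand: ratio of intention-to-treat effects.

    z: binary assignment, d: treatment actually received, y: outcome.
    Under monotonicity and the exclusion restriction, this ratio is
    the average causal effect for the compliers.
    """
    n1 = sum(z)
    n0 = len(z) - n1
    itt_y = (sum(yi for zi, yi in zip(z, y) if zi) / n1
             - sum(yi for zi, yi in zip(z, y) if not zi) / n0)
    itt_d = (sum(di for zi, di in zip(z, d) if zi) / n1
             - sum(di for zi, di in zip(z, d) if not zi) / n0)
    return itt_y / itt_d

# Toy data: half of those assigned comply; compliers gain 4 units.
z = [1, 1, 1, 1, 0, 0, 0, 0]
d = [1, 1, 0, 0, 0, 0, 0, 0]
y = [5, 5, 1, 1, 1, 1, 1, 1]
print(wald_estimator(z, d, y))  # 4.0
```

Without the assumptions, the same number is still the ratio of intention-to-treat estimands, but it loses its interpretation as an average causal effect.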

3,978 citations


Journal ArticleDOI
TL;DR: The Bekenstein-Hawking area-entropy relation $S_{BH} = A/4$ was derived for a class of five-dimensional extremal black holes in string theory by counting the degeneracy of BPS soliton bound states.

3,497 citations


Journal ArticleDOI
TL;DR: A description of the assumed context and objectives of multiple imputation is provided, and the multiple imputation framework and its standard results are reviewed.
Abstract: Multiple imputation was designed to handle the problem of missing data in public-use data bases where the data-base constructor and the ultimate user are distinct entities. The objective is valid frequency inference for ultimate users who in general have access only to complete-data software and possess limited knowledge of specific reasons and models for nonresponse. For this situation and objective, I believe that multiple imputation by the data-base constructor is the method of choice. This article first provides a description of the assumed context and objectives, and second, reviews the multiple imputation framework and its standard results. These preliminary discussions are especially important because some recent commentaries on multiple imputation have reflected either misunderstandings of the practical objectives of multiple imputation or misunderstandings of fundamental theoretical results. Then, criticisms of multiple imputation are considered, and, finally, comparisons are made to alt...
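The "standard results" referred to here include Rubin's combining rules for pooling the M completed-data analyses. A minimal sketch (hypothetical helper, assuming each analysis yields a point estimate and its squared standard error):

```python
def combine_estimates(qs, us):
    """Rubin's rules: pool M completed-data analyses.

    qs: point estimates from each imputed data set
    us: their within-imputation variances (squared standard errors)
    """
    m = len(qs)
    qbar = sum(qs) / m                              # pooled point estimate
    w = sum(us) / m                                 # within-imputation variance
    b = sum((q - qbar) ** 2 for q in qs) / (m - 1)  # between-imputation variance
    t = w + (1 + 1 / m) * b                         # total variance
    return qbar, t
```

The between-imputation term (1 + 1/M)B is exactly what single imputation discards, which is why complete-data software applied to one filled-in data set understates uncertainty.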

3,495 citations


Book
Paul Pierson1
01 Jan 1996
TL;DR: In this article, the authors lay the foundation for an understanding of welfare state retrenchment and highlight the factors that limit or facilitate the success of such a strategy, using quantitative and qualitative data from four cases (Britain, United States, Germany, and Sweden).
Abstract: This essay seeks to lay the foundation for an understanding of welfare state retrenchment. Previous discussions have generally relied, at least implicitly, on a reflexive application of theories designed to explain welfare state expansion. Such an approach is seriously flawed. Not only is the goal of retrenchment (avoiding blame for cutting existing programs) far different from the goal of expansion (claiming credit for new social benefits), but the welfare state itself vastly alters the terrain on which the politics of social policy is fought out. Only an appreciation of how mature social programs create a new politics can allow us to make sense of the welfare state's remarkable resilience over the past two decades of austerity. Theoretical argument is combined with quantitative and qualitative data from four cases (Britain, the United States, Germany, and Sweden) to demonstrate the shortcomings of conventional wisdom and to highlight the factors that limit or facilitate retrenchment success.

Journal ArticleDOI
09 Aug 1996-Cell
TL;DR: A CKR-5 allele present in the human population appears to protect homozygous individuals from sexual transmission of HIV-1, suggesting a means of preventing or slowing disease progression.

Journal ArticleDOI
18 Jul 1996-Nature
TL;DR: It is proposed that regulation of the neuroendocrine system during starvation could be the main physiological role of leptin; preventing the starvation-induced fall in leptin with exogenous leptin substantially blunts the changes in gonadal, adrenal and thyroid axes in male mice, and prevents the starvation-induced delay in ovulation in female mice.
Abstract: A total deficiency in or resistance to the protein leptin causes severe obesity. As leptin levels rise with increasing adiposity in rodents and man, it is proposed to act as a negative feedback 'adipostatic signal' to brain centres controlling energy homeostasis, limiting obesity in times of nutritional abundance. Starvation is also a threat to homeostasis that triggers adaptive responses, but whether leptin plays a role in the physiology of starvation is unknown. Leptin concentration falls during starvation and totally leptin-deficient ob/ob mice have neuroendocrine abnormalities similar to those of starvation, suggesting that this may be the case. Here we show that preventing the starvation-induced fall in leptin with exogenous leptin substantially blunts the changes in gonadal, adrenal and thyroid axes in male mice, and prevents the starvation-induced delay in ovulation in female mice. In contrast, leptin repletion during this period of starvation has little or no effect on body weight, blood glucose or ketones. We propose that regulation of the neuroendocrine system during starvation could be the main physiological role of leptin.

Journal ArticleDOI
TL;DR: Reversible, predominantly posterior leukoencephalopathy may develop in patients who have renal insufficiency or hypertension or who are immunosuppressed; the findings on neuroimaging are characteristic of subcortical edema without infarction.
Abstract: Background and Methods In some patients who are hospitalized for acute illness, we have noted a reversible syndrome of headache, altered mental functioning, seizures, and loss of vision associated with findings indicating predominantly posterior leukoencephalopathy on imaging studies. To elucidate this syndrome, we searched the log books listing computed tomographic (CT) and magnetic resonance imaging (MRI) studies performed at the New England Medical Center in Boston and Hopital Sainte Anne in Paris; we found 15 such patients who were evaluated from 1988 through 1994. Results Of the 15 patients, 7 were receiving immunosuppressive therapy after transplantation or as treatment for aplastic anemia, 1 was receiving interferon for melanoma, 3 had eclampsia, and 4 had acute hypertensive encephalopathy associated with renal disease (2 with lupus nephritis, 1 with acute glomerulonephritis, and 1 with acetaminophen-induced hepatorenal failure). Altogether, 12 patients had abrupt increases in blood pressure, and 8...

Journal ArticleDOI
27 Dec 1996-Cell
TL;DR: It is shown that mice engineered to lack Angiopoietin-1 display angiogenic deficits reminiscent of those previously seen in mice lacking TIE2, demonstrating that Angiopoietin-1 is a primary physiologic ligand for TIE2 and that it has critical in vivo angiogenic actions that are distinct from VEGF and that are not reflected in the classic in vitro assays used to characterize VEGF.

Book
15 Nov 1996
TL;DR: Spikes provides a self-contained review of relevant concepts in information theory and statistical decision theory for the representation of sensory signals in neural spike trains, and uses a quantitative framework to pose precise questions about the structure of the neural code.
Abstract: Our perception of the world is driven by input from the sensory nerves. This input arrives encoded as sequences of identical spikes. Much of neural computation involves processing these spike trains. What does it mean to say that a certain set of spikes is the right answer to a computational problem? In what sense does a spike train convey information about the sensory world? Spikes begins by providing precise formulations of these and related questions about the representation of sensory signals in neural spike trains. The answers to these questions are then pursued in experiments on sensory neurons.The authors invite the reader to play the role of a hypothetical observer inside the brain who makes decisions based on the incoming spike trains. Rather than asking how a neuron responds to a given stimulus, the authors ask how the brain could make inferences about an unknown stimulus from a given neural response. The flavor of some problems faced by the organism is captured by analyzing the way in which the observer can make a running reconstruction of the sensory stimulus as it evolves in time. These ideas are illustrated by examples from experiments on several biological systems. Intended for neurobiologists with an interest in mathematical analysis of neural data as well as the growing number of physicists and mathematicians interested in information processing by "real" nervous systems, Spikes provides a self-contained review of relevant concepts in information theory and statistical decision theory. A quantitative framework is used to pose precise questions about the structure of the neural code. These questions in turn influence both the design and analysis of experiments on sensory neurons.

Posted Content
TL;DR: The authors investigated the relationship between international trade patterns and international business cycle correlations and found that countries with closer trade links tend to have more tightly correlated business cycles and were more likely to satisfy the criteria for entry into a currency union after taking steps toward economic integration than before.
Abstract: A country's suitability for entry into a currency union depends on a number of economic conditions. These include, inter alia, the intensity of trade with other potential members of the currency union, and the extent to which domestic business cycles are correlated with those of the other countries. But international trade patterns and international business cycle correlations are endogenous. This paper develops and investigates the relationship between the two phenomena. Using thirty years of data for twenty industrialized countries, we uncover a strong and striking empirical finding: countries with closer trade links tend to have more tightly correlated business cycles. It follows that countries are more likely to satisfy the criteria for entry into a currency union after taking steps toward economic integration than before.

Journal ArticleDOI
02 Feb 1996-Science
TL;DR: Results indicate that TNF-α induces insulin resistance through an unexpected action of IRS-1 to attenuate insulin receptor signaling.
Abstract: Tumor necrosis factor-α (TNF-α) is an important mediator of insulin resistance in obesity and diabetes through its ability to decrease the tyrosine kinase activity of the insulin receptor (IR). Treatment of cultured murine adipocytes with TNF-α was shown to induce serine phosphorylation of insulin receptor substrate 1 (IRS-1) and convert IRS-1 into an inhibitor of the IR tyrosine kinase activity in vitro. Myeloid 32D cells, which lack endogenous IRS-1, were resistant to TNF-α-mediated inhibition of IR signaling, whereas transfected 32D cells that express IRS-1 were very sensitive to this effect of TNF-α. An inhibitory form of IRS-1 was observed in muscle and fat tissues from obese rats. These results indicate that TNF-α induces insulin resistance through an unexpected action of IRS-1 to attenuate insulin receptor signaling.

Book
07 Jun 1996
TL;DR: The case for a social psychology of creativity is discussed, with attention to the meaning and measurement of creativity, a consensual technique for creativity assessment, and social and environmental influences on creative work.
Abstract: Preface to the Updated Edition -- Preface to the 1983 Edition -- Understanding and Assessing Creativity -- The Case for a Social Psychology of Creativity -- The Meaning and Measurement of Creativity -- A Consensual Technique for Creativity Assessment -- A Theoretical Framework -- Social and Environmental Influences -- Effects of Evaluation on Creativity -- Effects of Reward and Task Constraint -- Social Facilitation, Modeling, and Motivational Orientation -- Other Social and Environmental Influences -- Implications -- Implications for Enhancing Creativity -- Toward a Comprehensive Psychology of Creativity -- About the Book and Author -- Credits

Journal ArticleDOI
TL;DR: The findings indicate that the FAD–linked mutations may all cause Alzheimer's disease by increasing the extracellular concentration of Aβ42(43), thereby fostering cerebral deposition of this highly amyloidogenic peptide.
Abstract: To determine whether the presenilin 1 (PS1), presenilin 2 (PS2) and amyloid beta-protein precursor (APP) mutations linked to familial Alzheimer's disease (FAD) increase the extracellular concentration of amyloid beta-protein (A beta) ending at A beta 42(43) in vivo, we performed a blinded comparison of plasma A beta levels in carriers of these mutations and controls. A beta 1-42(43) was elevated in plasma from subjects with FAD-linked PS1 (P < 0.0001), PS2N141I (P = 0.009), APPK670N,M671L (P < 0.0001), and APPV717I (one subject) mutations. A beta ending at A beta 42(43) was also significantly elevated in fibroblast media from subjects with PS1 (P < 0.0001) or PS2 (P = 0.03) mutations. These findings indicate that the FAD-linked mutations may all cause Alzheimer's disease by increasing the extracellular concentration of A beta 42(43), thereby fostering cerebral deposition of this highly amyloidogenic peptide.

Journal ArticleDOI
TL;DR: In this paper, the authors identify a channel for an inverse relationship between income inequality and growth, and measure socio-political instability with indices which capture the occurrence of more or less violent phenomena of political unrest.

Journal ArticleDOI
TL;DR: In this paper, the authors present a model, grounded in a study of the world disk drive industry, that charts the process through which the demands of a firm's customers shape the allocation of resources in technological innovation.
Abstract: Why might firms be regarded as astutely managed at one point, yet subsequently lose their positions of industry leadership when faced with technological change? We present a model, grounded in a study of the world disk drive industry, that charts the process through which the demands of a firm's customers shape the allocation of resources in technological innovation—a model that links theories of resource dependence and resource allocation. We show that established firms led the industry in developing technologies of every sort—even radical ones—whenever the technologies addressed existing customers' needs. The same firms failed to develop simpler technologies that initially were only useful in emerging markets, because impetus coalesces behind, and resources are allocated to, programs targeting powerful customers. Projects targeted at technologies for which no customers yet exist languish for lack of impetus and resources. Because the rate of technical progress can exceed the performance demanded in a market, technologies which initially can only be used in emerging markets later can invade mainstream ones, carrying entrant firms to victory over established companies.

Journal ArticleDOI
01 Mar 1996-Neuron
TL;DR: It is suggested that glial glutamate transporters provide the majority of functional glutamate transport and are essential for maintaining low extracellular glutamate and for preventing chronic glutamate neurotoxicity.

Journal ArticleDOI
TL;DR: Major advances have been achieved recently in knowledge about the molecular organization of the 20S and 19S particles, their subunits, the proteasome's role in MHC class I antigen presentation, and regulators of its activities.
Abstract: The proteasome is an essential component of the ATP-dependent proteolytic pathway in eukaryotic cells and is responsible for the degradation of most cellular proteins. The 20S (700-kDa) proteasome contains multiple peptidase activities that function through a new type of proteolytic mechanism involving a threonine active site. The 26S (2000-kDa) complex, which degrades ubiquitinated proteins, contains in addition to the 20S proteasome a 19S regulatory complex composed of multiple ATPases and components necessary for binding protein substrates. The proteasome has been highly conserved during eukaryotic evolution, and simpler forms are even found in archaebacteria and eubacteria. Major advances have been achieved recently in our knowledge about the molecular organization of the 20S and 19S particles, their subunits, the proteasome's role in MHC class I antigen presentation, and regulators of its activities. This article focuses on recent progress concerning the biochemical mechanisms and intracellular functions of the 20S and 26S proteasomes.

Journal ArticleDOI
18 Oct 1996-Cell
TL;DR: Several scientists involved in the identification and characterization of these enzymes have formed a committee, with the objective of proposing a nomenclature for the human members of this protease family that is sensible and easy to use.

Journal ArticleDOI
28 Jun 1996-Cell
TL;DR: The ability of various members of the chemokine receptor family to support the early stages of HIV-1 infection helps to explain viral tropism and beta-chemokine inhibition of primary HIV- 1 isolates.

Journal ArticleDOI
TL;DR: The Bekenstein-Hawking area-entropy relation for extremal black holes in string theory was derived in this paper by counting the degeneracy of BPS soliton bound states.
Abstract: The Bekenstein-Hawking area-entropy relation $S_{BH}=A/4$ is derived for a class of five-dimensional extremal black holes in string theory by counting the degeneracy of BPS soliton bound states.

Journal ArticleDOI
16 Oct 1996-JAMA
TL;DR: The basis for recommendations constituting the reference case analysis, the set of practices developed to guide CEAs that inform societal resource allocation decisions, and the content of these recommendations are described.
Abstract: Objective. —To develop consensus-based recommendations for the conduct of cost-effectiveness analysis (CEA). This article, the second in a 3-part series, describes the basis for recommendations constituting the reference case analysis, the set of practices developed to guide CEAs that inform societal resource allocation decisions, and the content of these recommendations. Participants. —The Panel on Cost-Effectiveness in Health and Medicine, a nonfederal panel with expertise in CEA, clinical medicine, ethics, and health outcomes measurement, was convened by the US Public Health Service (PHS). Evidence. —The panel reviewed the theoretical foundations of CEA, current practices, and alternative methods used in analyses. Recommendations were developed on the basis of theory where possible, but tempered by ethical and pragmatic considerations, as well as the needs of users. Consensus Process. —The panel developed recommendations through 2½ years of discussions. Comments on preliminary drafts prepared by panel working groups were solicited from federal government methodologists, health agency officials, and academic methodologists. Conclusions. —The panel's methodological recommendations address (1) components belonging in the numerator and denominator of a cost-effectiveness (C/E) ratio; (2) measuring resource use in the numerator of a C/E ratio; (3) valuing health consequences in the denominator of a C/E ratio; (4) estimating effectiveness of interventions; (5) incorporating time preference and discounting; and (6) handling uncertainty. Recommendations are subject to the "rule of reason," balancing the burden engendered by a practice with its importance to a study. If researchers follow a standard set of methods in CEA, the quality and comparability of studies, and their ultimate utility, can be much improved.
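Recommendations (1) through (5) can be illustrated with a toy incremental cost-effectiveness calculation (hypothetical figures; the 3 percent discount rate is a common reference-case choice, not a quote from the panel):

```python
def discounted(stream, rate=0.03):
    """Present value of a yearly stream of costs or health effects."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

def icer(cost_new, eff_new, cost_old, eff_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (eff_new - eff_old)

# Hypothetical intervention vs. comparator, effects in QALYs.
print(icer(50_000, 6.0, 30_000, 5.0))  # 20000.0 dollars per QALY gained
```

In a reference-case analysis both the numerator (costs) and denominator (health effects) would be discounted streams rather than single numbers, which is what the `discounted` helper stands in for.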

Journal ArticleDOI
TL;DR: There has been a nearly exponential increase in the use of quality-of-life evaluation as a technique of clinical research since 1973, when only 5 articles listed “quality of life” as a reference key word in the MEDLINE data base; during the subsequent five-year periods there were 195, 273, 490, and 1252 such articles.
Abstract: Since 1948, when the World Health Organization defined health as being not only the absence of disease and infirmity but also the presence of physical, mental, and social well-being,1 quality-of-life issues have become steadily more important in health care practice and research. There has been a nearly exponential increase in the use of quality-of-life evaluation as a technique of clinical research since 1973, when only 5 articles listed “quality of life” as a reference key word in the MEDLINE data base; during the subsequent five-year periods, there were 195, 273, 490, and 1252 such articles. The growing fields of outcomes . . .

Journal ArticleDOI
TL;DR: The tissue-specific expression of a putative secreted protein suggests that this factor may function as a novel signaling molecule for adipose tissue.