
Showing papers by "Brown University" published in 1990


Journal ArticleDOI
TL;DR: The use of natural symmetries (mirror images) in a well-defined family of patterns (human faces) is discussed within the framework of the Karhunen-Loeve expansion, which results in an extension of the data and imposes even and odd symmetry on the eigenfunctions of the covariance matrix.
Abstract: The use of natural symmetries (mirror images) in a well-defined family of patterns (human faces) is discussed within the framework of the Karhunen-Loeve expansion. This results in an extension of the data and imposes even and odd symmetry on the eigenfunctions of the covariance matrix, without increasing the complexity of the calculation. The resulting approximation of faces projected from outside of the data set onto this optimal basis is improved on average.

2,686 citations
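A minimal numerical sketch of the mirror-image idea in the entry above, assuming the faces are supplied as an aligned NumPy array; the eigen-decomposition is a generic Karhunen-Loeve/PCA computation, not the authors' exact procedure, and the array shapes and function name are invented for illustration. Reflecting each face about its vertical axis extends the data set and makes the empirical covariance commute with the reflection operator, so each resulting eigenimage is (numerically) either even or odd under left-right reflection.

import numpy as np

def kl_basis_with_mirror_symmetry(faces):
    """faces: array of shape (n_images, height, width), assumed roughly aligned.

    Returns eigenvalues and eigenimages of the covariance of the
    mirror-extended data set (a generic Karhunen-Loeve/PCA computation,
    not the paper's exact pipeline)."""
    mirrored = faces[:, :, ::-1]                 # reflect each face left-right
    data = np.concatenate([faces, mirrored])     # extended, reflection-closed data set
    n, h, w = data.shape
    x = data.reshape(n, h * w).astype(float)
    x -= x.mean(axis=0)                          # remove the mean face
    # SVD of the data matrix gives the KL eigenimages without forming the
    # (h*w) x (h*w) covariance matrix explicitly.
    _, s, vt = np.linalg.svd(x, full_matrices=False)
    eigvals = s ** 2 / n
    eigenimages = vt.reshape(-1, h, w)
    return eigvals, eigenimages

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_faces = rng.random((20, 32, 32))         # stand-in for aligned face images
    vals, imgs = kl_basis_with_mirror_symmetry(toy_faces)
    # Each eigenimage is (numerically) even or odd under left-right reflection:
    for img in imgs[:5]:
        sym = np.abs(img - img[:, ::-1]).max()
        anti = np.abs(img + img[:, ::-1]).max()
        print("even" if sym < anti else "odd")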


Book
Tony Lancaster1
01 Jan 1990
TL;DR: This book develops models for transition (duration) data, covering covariates and the hazard function, parametric families of duration distributions, mixture models, and structural transition models, together with identifiability, fully parametric and limited-information inference, misspecification analysis, and residual analysis.
Abstract: Preface Part I. Model Building: 1. Some basic results 2. Covariates and the hazard function 3. Parametric families of duration distribution 4. Mixture models 5. Some important processes 6. Some structural transition models Part II. Inference: 7. Identifiability issues 8. Fully parametric inference 9. Limited information inference 10. Misspecification analysis 11. Residual analysis Appendix 1: The gamma function and distribution Appendix 2: Some properties of the Laplace transform Bibliography Index.

1,788 citations


Book
23 Feb 1990
TL;DR: In this book, basic elastodynamic solutions for a stationary crack and asymptotic fields near a moving crack tip are presented, together with energy concepts in dynamic fracture, elastic crack growth at constant and nonuniform speed, and plasticity and rate effects during crack growth.
Abstract: Preface List of symbols 1. Background and overview 2. Basic elastodynamic solutions for a stationary crack 3. Further results for a stationary crack 4. Asymptotic fields near a moving crack tip 5. Energy concepts in dynamic fracture 6. Elastic crack growth at constant speed 7. Elastic crack growth at nonuniform speed 8. Plasticity and rate effects during crack growth Bibliography Index.

1,694 citations


Journal ArticleDOI
TL;DR: The two-dimensional version of the Runge-Kutta Local Projection Discontinuous Galerkin (RKDG) methods is studied; the schemes can easily handle the boundary conditions, verify maximum principles, and are formally uniformly high-order accurate.
Abstract: In this paper we study the two-dimensional version of the Runge-Kutta Local Projection Discontinuous Galerkin (RKDG) methods, already defined and analyzed in the one-dimensional case. These schemes are defined on general triangulations. They can easily handle the boundary conditions, verify maximum principles, and are formally uniformly high-order accurate. Preliminary numerical results showing the performance of the schemes on a variety of initial-boundary value problems are shown.

1,583 citations
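For orientation, the construction can be sketched in its simplest one-dimensional scalar form for a conservation law $u_t+f(u)_x=0$; this is the standard presentation of an RKDG scheme under that simplifying assumption, not a quotation of the paper's two-dimensional formulation on triangulations. With $u_h$ a polynomial on each cell $I_j=(x_{j-1/2},x_{j+1/2})$ and $\hat f$ a numerical flux, the semi-discrete scheme reads

\[
\int_{I_j}\frac{\partial u_h}{\partial t}\,v_h\,dx
=\int_{I_j}f(u_h)\,\frac{\partial v_h}{\partial x}\,dx
-\hat f\big(u_h(x_{j+1/2}^-),u_h(x_{j+1/2}^+)\big)\,v_h(x_{j+1/2}^-)
+\hat f\big(u_h(x_{j-1/2}^-),u_h(x_{j-1/2}^+)\big)\,v_h(x_{j-1/2}^+)
\]

for every test function $v_h$ in the local polynomial space. Writing the right-hand side as $L_h(u_h)$, the time discretization is a TVD Runge-Kutta method combined with the local projection (limiting) operator $\Lambda\Pi_h$, e.g. at second order

\[
u_h^{(1)}=\Lambda\Pi_h\big(u_h^{n}+\Delta t\,L_h(u_h^{n})\big),\qquad
u_h^{n+1}=\Lambda\Pi_h\Big(\tfrac12\,u_h^{n}+\tfrac12\big(u_h^{(1)}+\Delta t\,L_h(u_h^{(1)})\big)\Big).
\]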


Journal ArticleDOI
TL;DR: In this article, the authors consider an abstract Hamiltonian system which is invariant under a group of operators, study the effect of group invariance on the stability of solitary waves, and give applications to bound states and traveling wave solutions of nonlinear wave equations.

1,557 citations


Journal ArticleDOI
TL;DR: The different firing properties of neurons in neocortex, as characterized by their responses to current steps, contribute significantly to its network behavior.

1,422 citations


Book
01 Jan 1990
TL;DR: This book develops an ethical framework for decision making on behalf of incompetent patients, covering competence and incompetence, patient-centered principles, advance directives, personhood and personal identity, and distributive justice, with applications to minors, the elderly, and the mentally ill.
Abstract: Preface Introduction Part I. Theory: 1. Competence and incompetence 2. The primary ethical framework: patient-centered principles 3. Advance directives, personhood, and personal identity 4. Distributive justice and the incompetent Part II. Application: 5. Minors 6. The elderly 7. The mentally ill Looking forward Appendix 1: living trust and nomination of conservatorship Appendix 2: durable power of attorney for health care Notes Index.

1,001 citations


Journal ArticleDOI
TL;DR: In this paper, a general criterion for testing a mesh with topologically similar repeat units is given, and it is shown that only a few conventional element types and arrangements are suitable for computations in the fully plastic range.

927 citations


Journal ArticleDOI
TL;DR: The MDS provides a structure and language in which to understand long-term care, design care plans, evaluate quality, and describe the nursing facility population for planning and policy efforts.
Abstract: In response to the Omnibus Reconciliation Act of 1987 mandate for the development of a national resident assessment system for nursing facilities, a consortium of professionals developed the first major component of this system, the Minimum Data Set (MDS) for Resident Assessment and Care Screening. A two-state field trial tested the reliability of individual assessment items, the overall performance of the instrument, and the time involved in its application. The trial demonstrated reasonable reliability for 55% of the items and pinpointed redundancy of items and initial design of scales. On the basis of these analyses and clinical input, 40% of the original items were kept, 20% dropped, and 40% altered. The MDS provides a structure and language in which to understand long-term care, design care plans, evaluate quality, and describe the nursing facility population for planning and policy efforts.

909 citations



Posted Content
TL;DR: In this paper, the authors develop a model based on Schumpeter's process of creative destruction, which departs from existing models of endogenous growth in emphasizing obsolescence of old technologies induced by the accumulation of knowledge and the resulting process of industrial innovations.
Abstract: This paper develops a model based on Schumpeter's process of creative destruction. It departs from existing models of endogenous growth in emphasizing obsolescence of old technologies induced by the accumulation of knowledge and the resulting process of industrial innovations. This has both positive and normative implications for growth. In positive terms, the prospect of a high level of research in the future can deter research today by threatening the fruits of that research with rapid obsolescence. In normative terms, obsolescence creates a negative externality from innovations, and hence a tendency for laissez-faire economies to generate too many innovations, i.e., too much growth. This "business-stealing" effect is partly compensated by the fact that innovations tend to be too small under laissez-faire. The model possesses a unique balanced growth equilibrium in which the log of GNP follows a random walk with drift. The size of the drift is the average growth rate of the economy and it is endogenous to the model; in particular, it depends on the size and likelihood of innovations resulting from research and also on the degree of market power available to an innovator.
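The closing claim of the abstract, that the log of GNP follows a random walk with drift whose size depends on the size and likelihood of innovations, can be illustrated with a toy simulation. The Bernoulli arrival process, the parameter names, and the numbers below are illustrative assumptions, not the paper's specification or calibration.

import numpy as np

def simulate_log_gnp(periods=200, arrival_rate=0.3, step_size=0.05, seed=0):
    """Toy Schumpeterian growth path: in each period an innovation arrives
    with probability `arrival_rate` and, when it does, raises log GNP by
    `step_size` (the previous technology becomes obsolete).  The result is
    a random walk with drift arrival_rate * step_size (illustrative only)."""
    rng = np.random.default_rng(seed)
    innovations = rng.random(periods) < arrival_rate
    log_gnp = np.cumsum(np.where(innovations, step_size, 0.0))
    return log_gnp

if __name__ == "__main__":
    path = simulate_log_gnp()
    drift_estimate = (path[-1] - path[0]) / (len(path) - 1)
    print(f"average growth per period ~ {drift_estimate:.4f} "
          f"(theoretical drift = {0.3 * 0.05:.4f})")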

Journal ArticleDOI
TL;DR: The formalism is applied to the questions of particle production and reheating in inflationary universe models and requirements are found which the couplings in new-inflation-type models must satisfy for efficient reheating to occur.
Abstract: Techniques are developed to calculate the energy production in quantum fields which obtain a mass through the spontaneous symmetry breaking of a second field which is undergoing a phase transition. All fields are assumed to be out of thermal equilibrium and weakly coupled. The energy produced in a field, which is initially in its ground state, is computed for two generic types of time-dependent masses: a roughly monotonic turn on of the mass and an oscillatory mass. The formalism is applied to the questions of particle production and reheating in inflationary universe models. Requirements are found which the couplings in new-inflation-type models must satisfy for efficient reheating to occur.


Journal ArticleDOI
TL;DR: Hepatitis B e antigen and hepatitis B viral DNA disappeared from serum significantly more often in the patients given prednisone plus interferon or 5 million units of interferon alone than in the untreated controls.
Abstract: Background and Methods. Chronic hepatitis B is a common and often progressive liver disorder for which there is no accepted therapy. To assess the efficacy of treatment with interferon, we randomly assigned patients with chronic hepatitis B to one of the following regimens: prednisone for 6 weeks followed by 5 million units of recombinant interferon alfa-2b daily for 16 weeks; placebo followed by 5 million units of interferon daily for 16 weeks; placebo followed by 1 million units of interferon daily for 16 weeks; or observation with no treatment. Results. Hepatitis B e antigen and hepatitis B viral DNA disappeared from serum significantly more often in the patients given prednisone plus interferon (16 of 44 patients, or 36 percent) or 5 million units of interferon alone (15 of 41; 37 percent) than in the untreated controls (3 of 43; 7 percent; P<0.001); the difference between those given 1 million units of interferon (7 of 41; 17 percent) and the controls was not significant. The strongest indep...

Journal ArticleDOI
15 Jun 1990-Cell
TL;DR: Transgenic mice have been generated bearing a fusion gene consisting of the mouse metallothionein 1 promoter and a human TGF alpha cDNA, indicating that TGF alpha plays an important role in cellular proliferation, organogenesis, and neoplastic transformation.

MonographDOI
12 Jan 1990
TL;DR: This monograph on nonlinear wave equations covers invariance, existence, singularities, solutions of small amplitude, scattering, stability of solitary waves, and the Yang-Mills and Vlasov-Maxwell equations.
Abstract: Invariance. Existence. Singularities. Solutions of small amplitude. Scattering. Stability of solitary waves. Yang-Mills equations. Vlasov-Maxwell equations.

Journal ArticleDOI
TL;DR: The computational complexity of testing finite state processes for equivalence in Milner's Calculus of Communicating Systems (CCS) is examined and it is proved that observational equivalence can be tested in polynomial time and that testing for failure equivalence is PSPACE-complete.
Abstract: We examine the computational complexity of testing finite state processes for equivalence in Milner's Calculus of Communicating Systems (CCS). The equivalence problems in CCS are presented as refinements of the familiar problem of testing whether two nondeterministic finite automata (NFA) are equivalent, i.e., accept the same language. Three notions of equivalence proposed for CCS are investigated, namely, observational equivalence, strong observational equivalence, and failure equivalence. We show that observational equivalence can be tested in polynomial time. As defined in CCS, observational equivalence is the limit of a sequence of successively finer equivalence relations, ≈k, where ≈1 is nondeterministic finite automaton equivalence. We prove that, for each fixed k, deciding ≈k is PSPACE-complete. We show that strong observational equivalence can be decided in polynomial time by reducing it to generalized partitioning, a new combinatorial problem of independent interest. Finally, we demonstrate that testing for failure equivalence is PSPACE-complete, even for a very restricted type of process.
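To make the flavor of these equivalence checks concrete, here is a small, generic partition-refinement routine that computes the strong bisimulation classes of a finite labeled transition system (the relation usually identified with strong observational equivalence). It is a naive illustration of the refinement technique, with invented input conventions, and is not the paper's polynomial-time algorithm or its reduction to generalized partitioning.

def strong_bisimulation_classes(states, transitions):
    """states: iterable of state names.
    transitions: set of (source, action, target) triples between those states.
    Returns the coarsest partition of `states` in which related states have
    matching actions into matching blocks (naive refinement; illustrative only)."""
    succ = {}
    for s, a, t in transitions:
        succ.setdefault(s, set()).add((a, t))

    # Start with one block containing every state, then split until stable.
    partition = [set(states)]
    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, block in enumerate(partition) for s in block}
        new_partition = []
        for block in partition:
            # Group states by the set of (action, target-block) pairs they can do.
            groups = {}
            for s in block:
                signature = frozenset((a, block_of[t]) for a, t in succ.get(s, ()))
                groups.setdefault(signature, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

if __name__ == "__main__":
    # Two tiny processes: p does 'a' then stops; q does 'a' then stops via q1.
    trans = {("p", "a", "p1"), ("q", "a", "q1")}
    classes = strong_bisimulation_classes({"p", "p1", "q", "q1"}, trans)
    print(classes)   # p and q land in the same block, as do p1 and q1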

Journal ArticleDOI
12 Jan 1990-Cell
TL;DR: Immunoblotting experiments, comigration on 2D gels, and 2D analysis of limit chymotryptic digests demonstrated that the 63 kd protein, present in the middle T complex in approximately equimolar ratio to the 36 kdprotein, is a known regulatory subunit of the PP2A holoenzyme.

Journal ArticleDOI
Alan Needleman1
TL;DR: In this paper, a COHESIVE zone type interface model is used to study the decohesion of a viscoplastic block from a rigid substrate, taking full account of finite geometry changes, and the specific boundary value problem analysed is one of plane strain tension with a superposed hydrostatic stress.
Abstract: A cohesive zone type interface model, taking full account of finite geometry changes, is used to study the decohesion of a viscoplastic block from a rigid substrate. Dimensional considerations introduce a characteristic length into the formulation. The specific boundary value problem analysed is one of plane strain tension with a superposed hydrostatic stress. For a perfect interface, if the maximum traction that the viscoplastic block can support is greater than the interfacial strength, decohesion takes place in a primarily tensile mode. If this maximum traction is lower than the interfacial strength, a shear dominated decohesion initiates at the block edge. Imperfections in the form of a non-bonded portion of the interface are considered. The effects of imposed stress triaxiality, size scale, loading rate and interfacial properties on the course of defect dominated decohesion are illustrated. The characterization of decohesion initiation and propagation in terms of Rice's (J. Appl. Mech. 35, 379, 1968) J-integral is investigated for a variety of interface descriptions and values of the superposed hydrostatic stress.
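As background for the cohesive zone idea, one commonly used traction-separation relation of the exponential (universal binding) type is sketched below. It is representative of the class of interface constitutive laws used in such analyses and is not claimed to be the exact potential adopted in this paper; $\sigma_{\max}$ denotes the interfacial strength and $\delta$ the characteristic separation length.

\[
T_n(u_n)\;=\;e\,\sigma_{\max}\,\frac{u_n}{\delta}\,\exp\!\left(-\frac{u_n}{\delta}\right),
\]

which attains its maximum value $\sigma_{\max}$ at $u_n=\delta$, so the interface description supplies both a strength and a characteristic length; the corresponding work of separation is $\int_0^\infty T_n\,du_n=e\,\sigma_{\max}\,\delta$.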

Journal ArticleDOI
TL;DR: Analysis of polyclonal anti-synthetic peptide serum indicates that inactivation of the RB protein, p105-Rb, is universal in retinoblastoma cells, vindicating the predictions of the Knudson "two-hit" hypothesis.
Abstract: We have used polyclonal anti-synthetic peptide serum to study the role of retinoblastoma gene (RB) inactivation in a variety of human tumor cell lines. Our analysis indicates that inactivation of the RB protein, p105-Rb, is universal in retinoblastoma cells, vindicating the predictions of the Knudson "two-hit" hypothesis. In addition, our analysis has shown that inactivations of the RB gene are nearly as frequent in a more common human tumor, small cell lung carcinoma. One-third of bladder carcinomas surveyed also carry altered or absent p105-Rb. Other human tumors by contrast demonstrate only infrequent inactivation of the RB gene. These results suggest that inactivation of the RB gene is a critical step in the pathogenesis of a subset of human tumors.

Proceedings ArticleDOI
05 Dec 1990
TL;DR: In this article, the convergence of a wide class of approximation schemes to the viscosity solution of fully nonlinear second-order elliptic or parabolic, possibly degenerate, partial differential equations is studied.
Abstract: The convergence of a wide class of approximation schemes to the viscosity solution of fully nonlinear second-order elliptic or parabolic, possibly degenerate, partial differential equations is studied. It is proved that any monotone, stable, and consistent scheme converges (to the correct solution), provided that there exists a comparison principle for the limiting equation. Several examples are given where the result applies.
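For reference, the three hypotheses named in the abstract are usually written for an abstract scheme $S(h,x,u_h(x),u_h)=0$ approximating a degenerate elliptic equation $F(x,u,Du,D^2u)=0$; the statement below is the standard form of the framework, given as a sketch rather than as a quotation of the paper (it assumes $F$ continuous, which avoids the semicontinuous envelopes needed in the general case).

\[
\text{(monotonicity)}\quad S(h,x,t,u)\;\le\;S(h,x,t,v)\ \text{ whenever } u\ge v \text{ pointwise};
\]
\[
\text{(stability)}\quad \text{the equations } S(h,\cdot,u_h(\cdot),u_h)=0 \text{ admit solutions } u_h \text{ bounded uniformly in } h;
\]
\[
\text{(consistency)}\quad S\big(h,y,\varphi(y)+\xi,\varphi+\xi\big)\;\to\;F\big(x,\varphi(x),D\varphi(x),D^2\varphi(x)\big)\ \text{ as } h\to 0,\ y\to x,\ \xi\to 0,
\]

for every smooth test function $\varphi$. Under these hypotheses, and provided the limiting equation satisfies a comparison principle, $u_h$ converges locally uniformly to its unique viscosity solution.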

Journal ArticleDOI
01 Jun 1990
TL;DR: The analysis presented obtains useful information about linked data structures and a simple extension obtains aliasing information for entire data structures that previously was obtained only through declarations.
Abstract: Compilers can make good use of knowledge about the shape of data structures and the values that pointers assume during execution. We present results which show how a compiler can automatically determine some of this information. We believe that practical analyses based on this work can be used in compilers for languages that provide linked data structures. The analysis we present obtains useful information about linked data structures. We summarize unbounded data structures by taking advantage of structure present in the original program. The worst-case time bounds for a naive algorithm are high-degree polynomial, but for the expected (sparse) case we have an efficient algorithm. Previous work has addressed time bounds rarely, and efficient algorithms not at all. The quality of information obtained by this analysis appears to be (generally) an improvement on what is obtained by existing techniques. A simple extension obtains aliasing information for entire data structures that previously was obtained only through declarations. Previous work has shown that this information, however obtained, allows worthwhile optimization.
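As a small illustration of what it means to summarize an unbounded linked structure, the sketch below collapses a concrete heap into a finite shape graph by merging cells that share an allocation site. Allocation-site merging is a standard, generic abstraction used here only for illustration; the function, its input format, and the labels are invented and are not the paper's analysis.

def summarize_heap(cells, alloc_site):
    """Collapse a concrete heap into a finite shape graph by merging all
    cells created at the same allocation site.

    cells:      {cell_id: {field_name: target_cell_id or None}}
    alloc_site: {cell_id: program point (e.g. a line label) that allocated it}
    Returns:    set of (source_site, field, target_site) summary edges."""
    edges = set()
    for cell, fields in cells.items():
        for field, target in fields.items():
            if target is not None:
                edges.add((alloc_site[cell], field, alloc_site[target]))
    return edges

if __name__ == "__main__":
    # A 3-element singly linked list whose nodes were all allocated at "L1".
    heap = {
        "n0": {"next": "n1"},
        "n1": {"next": "n2"},
        "n2": {"next": None},
    }
    sites = {"n0": "L1", "n1": "L1", "n2": "L1"}
    print(summarize_heap(heap, sites))
    # {('L1', 'next', 'L1')}: one summary node with a 'next' self-edge, i.e. the
    # summary records "a list of L1-nodes" without tracking its length.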

Journal ArticleDOI
Peter M. Garber1
TL;DR: In this article, the author considers market fundamental explanations for speculative events and concludes that, while such explanations are often not easily generated due to the inherent complexity of economic phenomena, our methodology should always require that we search intensively for market fundamental explanations before clutching the "bubble" last resort.
Abstract: T he jargon of economics and finance contains numerous colorful expressions to denote a market-determined asset price at odds with any reasonable economic explanation. Such words as "tulip mania," "bubble," "chain letter," "Ponzi scheme," "panic," "crash," and "financial crisis" immediately evoke images of frenzied and probably irrational speculative activity. Many of these terms have emerged from specific speculative episodes which have been sufficiently frequent and important that they underpin a strong current belief among economists that key capital markets sometimes generate irrational and inefficient pricing and allocational outcomes. Before economists relegate a speculative event to the inexplicable or bubble category, however, we must exhaust all reasonable economic explanations. While such explanations are often not easily generated due to the inherent complexity of economic phenomena, the business of economists is to find clever fundamental market explanations for events; and our methodology should always require that we search intensively for market fundamental explanations before clutching the "bubble" last resort. Thus, among the "reasonable" or "market fundamental" explanations I would include the perception of an increased probability of large returns. The perception might be triggered by genuine economic good news, by a convincing new economic theory about payoffs or by a fraud launched by insiders acting strategically to trick investors. It might also be triggered by uninformed market participants correctly inferring changes in the distribution of dividends by observing price movements generated by the trading of informed insiders. While some of these perceptions might

Journal ArticleDOI
TL;DR: The B19 parvovirus is a remediable cause of severe chronic anemia in HIV-infected patients, and recognition of and therapy for parvovirus in this population will avoid erythrocyte transfusion and should prevent transmission of the virus to other persons, including immunosuppressed persons and women of child-bearing age.
Abstract: Objective: To determine the role of B19 parvovirus in red cell aplasia of patients infected with human immunodeficiency virus type 1 (HIV-1). Design: Uncontrolled clinical trial, with assay of seru...

Journal ArticleDOI
Steven P. Reiss1
TL;DR: An overview is given of the Field environment, which was developed to show that highly integrated, interactive environments like those on PCs can be implemented on workstations and can be used for classical-language and large-scale programming.
Abstract: An overview is given of the Field environment, which was developed to show that highly integrated, interactive environments like those on PCs can be implemented on workstations and can be used for classical-language and large-scale programming. Field connects tools with selective broadcasting, which follows the Unix philosophy of letting independent tools cooperate through simple conventions, demonstrating that this simple approach is feasible and desirable. Field achieves its goals by providing a consistent graphical front end and a simple integration framework that lets existing and new Unix tools cooperate. The front end is based on a tool set called the Brown workstation environment. The framework combines selective broadcasting with an annotation editor that provides consistent access to the source code in multiple contexts and with a set of specialized interactive analysis tools. Field's integration framework and message facility are described.
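A toy sketch of the selective-broadcasting idea: tools register message patterns with a central message server, and each broadcast is delivered only to the tools whose patterns match. The class, the prefix-matching pattern syntax, and the message strings below are invented for illustration and are not FIELD's actual message protocol.

class MessageServer:
    """Minimal selective-broadcast hub: tools subscribe with string prefixes
    and receive only the messages that match (illustrative sketch only)."""

    def __init__(self):
        self.subscriptions = []          # list of (prefix, callback) pairs

    def register(self, prefix, callback):
        self.subscriptions.append((prefix, callback))

    def broadcast(self, message):
        for prefix, callback in self.subscriptions:
            if message.startswith(prefix):
                callback(message)

if __name__ == "__main__":
    server = MessageServer()
    # An editor tool only cares about debugger location events;
    # a build tool only cares about file-save events.
    server.register("DEBUG AT", lambda m: print("editor highlights:", m))
    server.register("FILE SAVED", lambda m: print("builder recompiles:", m))
    server.broadcast("FILE SAVED src/main.c")   # delivered to the build tool only
    server.broadcast("DEBUG AT src/main.c 42")  # delivered to the editor only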

Posted Content
TL;DR: This article examined whether the Solow growth model is consistent with the international variation in the standard of living and showed that an augmented Solow model that includes accumulation of human as well as physical capital provides an excellent description of the cross-country data.
Abstract: This paper examines whether the Solow growth model is consistent with the international variation in the standard of living. It shows that an augmented Solow model that includes accumulation of human as well as physical capital provides an excellent description of the cross-country data. The model explains about 80 percent of the international variation in income per capita, and the estimated influences of physical-capital accumulation, human-capital accumulation, and population growth confirm the model's predictions. The paper also examines the implications of the Solow model for convergence in standards of living -- that is, for whether poor countries tend to grow faster than rich countries. The evidence indicates that, holding population growth and capital accumulation constant, countries converge at about the rate the augmented Solow model predicts.
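The regression summarized above rests on the steady state of the augmented Solow model. In the usual textbook notation (output $Y=K^{\alpha}H^{\beta}(AL)^{1-\alpha-\beta}$, saving rates $s_k$ and $s_h$ in physical and human capital, population growth $n$, technology growth $g$, depreciation $\delta$), which is assumed here rather than quoted from the paper, steady-state income per capita satisfies

\[
\ln\!\left(\frac{Y}{L}\right)=\ln A_0+gt+\frac{\alpha}{1-\alpha-\beta}\ln s_k+\frac{\beta}{1-\alpha-\beta}\ln s_h-\frac{\alpha+\beta}{1-\alpha-\beta}\ln\big(n+g+\delta\big),
\]

so income per capita rises with both saving rates and falls with population growth, which is the cross-country relation the paper estimates.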


Journal ArticleDOI
TL;DR: An improved approach to spectral deconvolution is presented in this article that accurately represents absorption bands as discrete mathematical distributions and resolves composite absorption features into individual absorption bands, and a modified Gaussian model is derived using a power law relationship of energy to average bond length.
Abstract: Although visible and near IR reflectance spectra contain absorption bands that are characteristic of the composition and structure of the absorbing species, deconvolving a complex spectrum is nontrivial. An improved approach to spectral deconvolution is presented that accurately represents absorption bands as discrete mathematical distributions and resolves composite absorption features into individual absorption bands. The frequently used Gaussian model of absorption bands is shown to be inappropriate for the Fe(2+) electronic transition absorptions in pyroxene spectra. A modified Gaussian model is derived using a power law relationship of energy to average bond length. The modified Gaussian model is shown to provide an objective and consistent tool for deconvolving individual absorption bands in the more complex orthopyroxene, clinopyroxene, pyroxene mixtures, and olivine spectra.
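To show concretely how a modified Gaussian band differs from a standard Gaussian band, the sketch below writes both as distributions in a spectral variable x, with the modified form taken to be Gaussian in x**n for an exponent n left as a free parameter. The functional form, parameter names, and the demo exponent are illustrative assumptions; the paper derives its specific form from the power law relation of energy to average bond length, which is not reproduced here.

import numpy as np

def gaussian_band(x, strength, center, width):
    """Standard Gaussian absorption band in the variable x."""
    return strength * np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

def modified_gaussian_band(x, strength, center, width, n):
    """'Modified Gaussian': a Gaussian in x**n rather than in x.  The exponent
    n is left free in this sketch; the paper fixes its form via a power law
    relation of energy to average bond length."""
    return strength * np.exp(-((x ** n - center ** n) ** 2) / (2.0 * width ** 2))

if __name__ == "__main__":
    # Wavelengths (micrometers) around a 1-micrometer pyroxene-like band.
    wavelengths = np.linspace(0.7, 1.4, 8)
    g = gaussian_band(wavelengths, -0.4, 1.0, 0.1)
    m = modified_gaussian_band(wavelengths, -0.4, 1.0, 0.1, n=-1.0)  # illustrative exponent only
    for wl, a, b in zip(wavelengths, g, m):
        print(f"{wl:.2f} um  gaussian={a:+.3f}  modified={b:+.3f}")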

Book
21 Dec 1990
TL;DR: The text develops a global shape model and applies it to the analysis of real pictures acquired with a visible light camera under varying conditions of optical degradation to develop methods for image understanding based on structured restoration, for example automatic detection of abnormalities.
Abstract: The text develops a global shape model and applies it to the analysis of real pictures acquired with a visible light camera under varying conditions of optical degradation. Computational feasibility of the algorithms derived from this model is achieved by analytical means. The aim is to develop methods for image understanding based on structured restoration, for example automatic detection of abnormalities. The limits of applicability of the algorithms are also traced by making the optical degradations more and more severe until the algorithms no longer succeed in their task. This book is suitable for an advanced undergraduate or graduate seminar in pattern theory, or as an accompanying book for applied probability, computer vision or pattern recognition.

Journal ArticleDOI
TL;DR: The marked differences in affinity for (+)-benzomorphans and molecular weight suggest that PC12 cells contain a molecular form of sigma receptor distinct from that predominant in guinea pig brain, which raises the possibility of multiple sigma receptor types.