
Showing papers by "University of Southern California" published in 1986


Book
25 Jul 1986
TL;DR: In this paper, the authors propose a homogeneity test for linear regression models (analysis of covariance) and develop panel-data regression models with variable intercepts as an alternative to simple regression with a common intercept.
Abstract: 1. Introduction 2. Homogeneity test for linear regression models (analysis of covariance) 3. Simple regression with variable intercepts 4. Dynamic models with variable intercepts 5. Simultaneous-equations models 6. Variable-coefficient models 7. Discrete data 8. Truncated and censored data 9. Cross-sectional dependent panel data 10. Dynamic system 11. Incomplete panel data 12. Miscellaneous topics 13. A summary view.
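The chapter-2 homogeneity test (analysis of covariance) can be sketched as a pooled-versus-separate-regressions F-test. This is a generic textbook formulation, not necessarily the book's exact statistic; the function name and simulated data are illustrative.

```python
import numpy as np

def homogeneity_F(x, y, groups):
    """F-test comparing one pooled regression line against separate
    per-group regressions (a simple analysis-of-covariance check).
    Returns (F, df1, df2)."""
    def rss(xs, ys):
        X = np.column_stack([np.ones_like(xs), xs])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        r = ys - X @ beta
        return float(r @ r)

    rss_pooled = rss(x, y)
    labels = np.unique(groups)
    rss_sep = sum(rss(x[groups == g], y[groups == g]) for g in labels)
    n, G, k = len(y), len(labels), 2          # k parameters per regression
    df1 = (G - 1) * k                         # restrictions imposed by pooling
    df2 = n - G * k
    F = ((rss_pooled - rss_sep) / df1) / (rss_sep / df2)
    return F, df1, df2

# Illustrative data: three groups, with and without a group-specific intercept.
rng = np.random.default_rng(0)
x = rng.normal(size=300)
grp = np.repeat([0, 1, 2], 100)
y_same = 1 + 2 * x + rng.normal(scale=0.5, size=300)   # one common line
y_shift = y_same + 3.0 * (grp == 2)                    # group 2 shifted up
F_same, _, _ = homogeneity_F(x, y_same, grp)
F_shift, _, _ = homogeneity_F(x, y_shift, grp)
```

A large F for the shifted data (and a small one for the homogeneous data) signals that pooling with a single intercept is rejected, which is the motivation for the variable-intercept models of the later chapters.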

6,234 citations


Book
30 Apr 1986
TL;DR: The authors present a theoretical treatment of externalities (i.e., uncompensated interdependencies), public goods, and club goods, covering asymmetric information, underlying game-theoretic formulations, and intuitive and graphical presentations.
Abstract: This book presents a theoretical treatment of externalities (i.e. uncompensated interdependencies), public goods, and club goods. The new edition updates and expands the discussion of externalities and their implications, coverage of asymmetric information, underlying game-theoretic formulations, and intuitive and graphical presentations. Aimed at well-prepared undergraduates and graduate students making a serious foray into this branch of economics, the analysis should also interest professional economists wishing to survey recent advances in the field. No other single source for the range of materials explored is currently available. Topics investigated include Nash equilibrium, Lindahl equilibria, club theory, preference-revelation mechanism, Pigouvian taxes, the commons, Coase Theorem, and static and repeated games. The authors use mathematical techniques only as much as necessary to pursue the economic argument. They develop key principles of public economics that are useful for subfields such as public choice, labor economics, economic growth, international economics, environmental and natural resource economics, and industrial organization.

1,450 citations


Journal ArticleDOI
TL;DR: In this paper, stable-isotopic analyses have been performed on live and modern specimens of aragonitic foraminifera, gastropods and scaphopods.
Abstract: To better interpret the isotopic composition of ancient aragonitic fossils, stable-isotopic analyses have been performed on live and modern specimens of aragonitic foraminifera, gastropods and scaphopods. Samples were collected from the continental margins off southern California and Texas, U.S.A., and Mexico, and provide a range in ambient temperature of 2.6–22.0°C. We observed a strong covariance between the δ18O of the aragonitic foraminifera Hoeglundina elegans and that of coeval aragonitic mollusks. On the average, Hoeglundina was 0.2 ± 0.2‰ depleted in 18O relative to the mollusks, and 0.6 ± 0.3‰ enriched relative to the calcitic foraminifera Uvigerina. This enrichment in 18O of aragonite relative to calcite is similar to that observed in previous experimental and theoretical studies. The temperature dependences of mollusk and Hoeglundina δ18O-values were not notably different from that previously determined for inorganically precipitated calcite, and no significant temperature dependence in Hoeglundina-Uvigerina 18O fractionation was observed. Of note is the temperature dependence of the δ13C of the biogenic aragonite. Relative to the dissolved inorganic carbon (DIC), the δ13C of Hoeglundina and the mollusks decreased by 0.11 and 0.13‰, respectively, per °C increase in temperature. The temperature dependence in Hoeglundina-DIC 13C enrichment, and the lack of it in Uvigerina-DIC enrichment, accounts for the temperature dependence in Hoeglundina-Uvigerina (calcitic) fractionation noted by us and previous workers. Isotopic differences between coeval specimens of these genera provide a rough measure of paleotemperature without requiring knowledge of the isotopic composition of the paleo-ocean.
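The rough paleothermometer in the last sentence can be put in numbers using the slope reported above (Hoeglundina δ13C falling roughly 0.11‰ per °C while Uvigerina shows no temperature dependence). The function below is a back-of-the-envelope sketch, not a published calibration; the example values are made up.

```python
# Back-of-the-envelope use of the temperature-dependent Hoeglundina-Uvigerina
# carbon-isotope fractionation described in the abstract. The slope is the
# ~0.11 permil per degC decrease reported for Hoeglundina; Uvigerina's
# delta13C is treated as temperature-independent. Illustrative only.
SLOPE_PERMIL_PER_DEGC = -0.11

def delta_t_from_d13c(offset_sample_a, offset_sample_b):
    """Temperature difference (degC, sample b minus sample a) implied by a
    change in the Hoeglundina-minus-Uvigerina delta13C offset (permil)."""
    return (offset_sample_b - offset_sample_a) / SLOPE_PERMIL_PER_DEGC

# An offset shrinking from 1.00 to 0.45 permil implies ~5 degC of warming.
dt_est = delta_t_from_d13c(1.00, 0.45)
```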

1,154 citations


Book
12 Sep 1986
TL;DR: In this book, the authors present control as an essential life process, trace its evolution through the crisis of control brought on by industrialization, and describe the control revolution that made control the engine of the information society.
Abstract: 1. Introduction PART I: Living Systems, Technology, and the Evolution of Control 2. Programming and Control: The Essential Life Process 3. Evolution of Control: Culture and Society PART II: Industrialization, Processing Speed, and the Crisis of Control 4. From Tradition to Rationality: Distributing Control 5. Toward Industrialization: Controlling Energy and Speed 6. Industrial Revolution and the Crisis of Control PART III: Toward an Information Society: From Control Crisis to Control Revolution 7. Revolution in Control of Mass Production and Distribution 8. Revolution in Control of Mass Consumption 9. Revolution in Generalized Control: Data Processing and Bureaucracy 10. Conclusions: Control as Engine of the Information Society References Index

1,108 citations


Journal ArticleDOI
TL;DR: In this article, the authors report a meta-analytic literature review testing cognitive, affective, and contingency models of the effects of participation in decision making on employees' satisfaction and productivity.
Abstract: This paper reports a meta-analytic literature review testing cognitive, affective, and contingency models of the effects of participation in decision making on employees' satisfaction and productivity.

955 citations


Journal ArticleDOI
TL;DR: In this article, a nonlinear relationship between electricity sales and temperature is estimated using a semiparametric regression procedure that easily allows linear transformations of the data and accommodates introduction of covariates, timing adjustments due to the actual billing schedules, and serial correlation.
Abstract: A nonlinear relationship between electricity sales and temperature is estimated using a semiparametric regression procedure that easily allows linear transformations of the data. This accommodates introduction of covariates, timing adjustments due to the actual billing schedules, and serial correlation. The procedure is an extension of smoothing splines with the smoothness parameter estimated from minimization of the generalized cross-validation criterion introduced by Craven and Wahba (1979). Estimates are presented for residential sales for four electric utilities and are compared with models that represent the weather using only heating and cooling degree days or with piecewise linear splines.
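The smoothing-with-GCV idea can be sketched in a few lines. The code below uses a penalized B-spline (P-spline) as a stand-in for the paper's smoothing-spline procedure and chooses the smoothing parameter by minimizing the GCV criterion; the function name, knot counts, and the simulated sales-temperature data are all illustrative.

```python
import numpy as np
from scipy.interpolate import BSpline

def fit_spline_gcv(x, y, n_knots=20, degree=3,
                   lambdas=np.logspace(-4, 4, 25)):
    """Penalized B-spline smoother with the smoothing parameter chosen
    by minimizing the generalized cross-validation (GCV) criterion."""
    lo, hi = x.min(), x.max()
    pad = 1e-6 * (hi - lo)                       # keep data strictly inside knots
    xi = np.linspace(lo - pad, hi + pad, n_knots)
    t = np.r_[[xi[0]] * degree, xi, [xi[-1]] * degree]
    B = BSpline.design_matrix(x, t, degree).toarray()
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)  # second-difference penalty
    n = len(y)
    best = None
    for lam in lambdas:
        H = B @ np.linalg.solve(B.T @ B + lam * D.T @ D, B.T)  # hat matrix
        yhat = H @ y
        rss = float(np.sum((y - yhat) ** 2))
        gcv = n * rss / (n - np.trace(H)) ** 2
        if best is None or gcv < best[0]:
            best = (gcv, lam, yhat)
    return best[2], best[1]

# Illustrative U-shaped sales-temperature relationship (heating + cooling):
rng = np.random.default_rng(1)
temp = np.linspace(0.0, 35.0, 200)
truth = 50.0 + 0.08 * (temp - 18.0) ** 2
sales = truth + rng.normal(scale=2.0, size=temp.size)
yhat, lam = fit_spline_gcv(temp, sales)
```

The GCV score trades residual error against the effective degrees of freedom (the trace of the hat matrix), which is the same principle behind the Craven-Wahba criterion cited in the abstract.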

954 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined weekly and intradaily patterns in common stock prices using transaction data and found that negative Monday close-to-close returns accrue between the Friday close and the Monday open; for smaller firms they accrue primarily during the Monday trading day.

855 citations


Journal ArticleDOI
TL;DR: This paper presents a model for analyzing the performance of transmission strategies in a multihop packet radio network where each station has adjustable transmission radius and shows that the network can achieve better performance by suitably controlling the transmission range.
Abstract: This paper presents a model for analyzing the performance of transmission strategies in a multihop packet radio network where each station has an adjustable transmission radius. A larger transmission radius will increase the probability of finding a receiver in the desired direction and contributes greater progress if the transmission is successful, but it also has a higher probability of collision with other transmissions. The converse is true for a shorter transmission range. We illustrate our model by comparing three transmission strategies. Our results show that the network can achieve better performance by suitably controlling the transmission range. One of the transmission strategies, namely transmitting to the nearest forward neighbor by using adjustable transmission power, has desirable features in a high terminal density environment.
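The tradeoff described above can be caricatured with a one-line formula: expected one-hop forward progress is the hop length discounted by the probability of escaping interference, which peaks at an intermediate transmission range. This is a stylized sketch, not the paper's model; the density and transmit-probability values are illustrative.

```python
import numpy as np

def expected_progress(r, density=1.0, p_tx=0.1):
    """Stylized expected one-hop forward progress: hop length r times the
    probability that no interfering transmitter lies within range r
    (Poisson field of simultaneous transmitters, intensity density*p_tx)."""
    return r * np.exp(-density * p_tx * np.pi * r ** 2)

radii = np.linspace(0.1, 4.0, 200)
z = expected_progress(radii)
r_best = radii[int(np.argmax(z))]   # interior optimum: a moderate range wins
```

Under this caricature the optimum is at r = 1/sqrt(2*pi*density*p_tx): too short a range wastes hops, too long a range drowns in collisions, which is the qualitative conclusion of the abstract.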

730 citations


Journal ArticleDOI
TL;DR: The concepts of efficacy and effectiveness are examined from the viewpoints of the traditions and philosophies of health-care research and social program evaluation, and eight phases of research are suggested for the development of health promotion programs.

729 citations


Journal ArticleDOI
TL;DR: This 'minimal modelling approach' fits two mathematical models to FSIGT glucose and insulin data: one of glucose disappearance and one of insulin kinetics, and MINMOD is the computer program which identifies the model parameters for each individual.
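The glucose-disappearance half of the approach can be illustrated with a Bergman-style minimal model. The equations below are the standard simplified form; the parameter values and the insulin input are illustrative, not fitted MINMOD output.

```python
import numpy as np

# Bergman-style minimal model of glucose disappearance (standard simplified
# form; all parameter values are illustrative, not fitted):
#   dG/dt = -(p1 + X) * G + p1 * Gb      glucose (mg/dl)
#   dX/dt = -p2 * X + p3 * (I - Ib)      remote insulin action
p1, p2, p3 = 0.03, 0.02, 1.0e-5      # per-minute rate constants
Gb, Ib = 90.0, 10.0                  # basal glucose (mg/dl), insulin (uU/ml)

dt, minutes = 0.5, 180
G, X = 280.0, 0.0                    # post-bolus glucose, insulin action
Gs = []
for i in range(int(minutes / dt)):
    t = i * dt
    I = Ib + 80.0 * np.exp(-t / 30.0)    # illustrative decaying insulin input
    dG = -(p1 + X) * G + p1 * Gb
    dX = -p2 * X + p3 * (I - Ib)
    G += dt * dG
    X += dt * dX
    Gs.append(G)
```

Fitting p1-p3 (and hence insulin sensitivity) to measured FSIGT data is the identification step that MINMOD automates.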

725 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a direct adaptive control algorithm which is robust with respect to additive and multiplicative plant unmodeled dynamics, which guarantees boundedness of all signals in the adaptive loop and small residual tracking errors for any bounded initial conditions.
Abstract: This paper proposes a new direct adaptive control algorithm which is robust with respect to additive and multiplicative plant unmodeled dynamics. The algorithm is designed based on the reduced-order plant, which is assumed to be minimum phase and of known order and relative degree, but is analyzed with respect to the overall plant which, due to the unmodeled dynamics, may be nonminimum phase and of unknown order and relative degree. It is shown that if the unmodeled dynamics are sufficiently small in the low-frequency range, then the algorithm guarantees boundedness of all signals in the adaptive loop and "small" residual tracking errors for any bounded initial conditions. In the absence of unmodeled dynamics, the residual tracking error is shown to be zero.
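For readers unfamiliar with direct adaptive control, the scalar gain-adaptation example below (the classic MIT rule) shows the basic mechanism. It is a toy sketch, far simpler than and not equivalent to the paper's robust algorithm; every constant is illustrative.

```python
import math

# Classic MIT-rule feedforward-gain adaptation (a textbook scalar example).
# Plant: 1/(s+1) with unknown gain k; reference model: 1/(s+1) with gain k0.
# The adjustable gain theta should converge to k0 / k.
dt, T = 0.01, 100.0
gamma = 0.5                 # adaptation gain (illustrative)
k, k0 = 2.0, 1.0            # unknown plant gain / reference-model gain
y = ym = theta = 0.0
errs = []
t = 0.0
while t < T:
    uc = 1.0 if math.sin(0.2 * t) >= 0 else -1.0  # square-wave command
    u = theta * uc                  # adjustable feedforward gain
    y += dt * (-y + k * u)          # plant with unknown gain k
    ym += dt * (-ym + k0 * uc)      # reference model
    e = y - ym
    theta += dt * (-gamma * e * ym)  # MIT rule: gradient descent on e**2
    errs.append(abs(e))
    t += dt
# theta should approach k0 / k = 0.5 and the tracking error should shrink
```

The paper's contribution is precisely what this toy omits: guaranteeing bounded signals and small residual tracking error when the true plant contains unmodeled dynamics.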

Journal ArticleDOI
TL;DR: In this paper, the authors consider a linear dynamic model with moving average errors, and consider a heteroscedastic model which represents an extension of the ARCH model introduced by Engle.
Abstract: In the context of a linear dynamic model with moving average errors, we consider a heteroscedastic model which represents an extension of the ARCH model introduced by Engle [4]. We discuss the properties of maximum likelihood and least squares estimates of the parameters of both the regression and ARCH equations, and also the properties of various tests of the model that are available. We do not assume that the errors are normally distributed.
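A minimal sketch of the ARCH idea, simulating an ARCH(1) process and recovering its parameters by Gaussian maximum likelihood, is below. It omits the paper's moving-average errors and non-normal setting; all names and values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Simulate ARCH(1):  y_t = sigma_t * eps_t,  sigma_t^2 = a0 + a1 * y_{t-1}^2
a0_true, a1_true, n = 0.5, 0.4, 4000
y = np.empty(n)
prev = 0.0
for t in range(n):
    sig2 = a0_true + a1_true * prev ** 2
    y[t] = np.sqrt(sig2) * rng.standard_normal()
    prev = y[t]

def neg_loglik(params):
    """Gaussian negative log-likelihood of the ARCH(1) model."""
    a0, a1 = params
    if a0 <= 0 or a1 < 0 or a1 >= 1:
        return np.inf                      # stay in the stationary region
    sig2 = a0 + a1 * np.r_[0.0, y[:-1]] ** 2
    return 0.5 * np.sum(np.log(sig2) + y ** 2 / sig2)

res = minimize(neg_loglik, x0=[1.0, 0.2], method="Nelder-Mead")
a0_hat, a1_hat = res.x
```

The paper's point is that the Gaussian likelihood above is only one option: it studies ML and least-squares estimates, and tests, without assuming normal errors.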

Journal ArticleDOI
TL;DR: An approach is presented for the estimation of object motion parameters based on a sequence of noisy images that may be of use in situations where it is difficult to resolve large numbers of object match points, but relatively long sequences of images are available.
Abstract: An approach is presented for the estimation of object motion parameters based on a sequence of noisy images. The problem considered is that of a rigid body undergoing unknown rotational and translational motion. The measurement data consists of a sequence of noisy image coordinates of two or more object correspondence points. By modeling the object dynamics as a function of time, estimates of the model parameters (including motion parameters) can be extracted from the data using recursive and/or batch techniques. This permits a desired degree of smoothing to be achieved through the use of an arbitrarily large number of images. Some assumptions regarding object structure are presently made. Results are presented for a recursive estimation procedure: the case considered here is that of a sequence of one dimensional images of a two dimensional object. Thus, the object moves in one transverse dimension, and in depth, preserving the fundamental ambiguity of the central projection image model (loss of depth information). An iterated extended Kalman filter is used for the recursive solution. Noise levels of 5-10 percent of the object image size are used. Approximate Cramer-Rao lower bounds are derived for the model parameter estimates as a function of object trajectory and noise level. This approach may be of use in situations where it is difficult to resolve large numbers of object match points, but relatively long sequences of images (10 to 20 or more) are available.
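The recursive estimation idea can be sketched with an extended Kalman filter on a much-reduced version of the problem: one point moving with constant velocity in the transverse and depth directions, observed through noisy 1-D central projection (u = x/z). This is an ordinary EKF rather than the paper's iterated EKF, and every constant below is illustrative. Note that, consistent with the depth ambiguity mentioned in the abstract, only the image-plane trajectory (the ratio x/z) is asserted to be tracked, not absolute depth.

```python
import numpy as np

rng = np.random.default_rng(7)

dt, steps = 0.1, 200
F = np.array([[1, dt, 0, 0],     # state: [x, vx, z, vz], constant velocity
              [0, 1,  0, 0],
              [0, 0,  1, dt],
              [0, 0,  0, 1.0]])
x_true = np.array([1.0, 0.2, 10.0, -0.1])
meas_sig = 0.005                 # image-coordinate noise (illustrative)

xh = np.array([0.6, 0.0, 8.0, 0.0])   # deliberately wrong initial guess
P = np.diag([1.0, 0.1, 9.0, 0.1])
Q = 1e-8 * np.eye(4)
R = meas_sig ** 2

errs = []
for _ in range(steps):
    x_true = F @ x_true
    u = x_true[0] / x_true[2] + rng.normal(scale=meas_sig)
    # EKF predict
    xh = F @ xh
    P = F @ P @ F.T + Q
    # EKF update with h(x) = x/z (1-D central projection)
    H = np.array([1.0 / xh[2], 0.0, -xh[0] / xh[2] ** 2, 0.0])
    S = H @ P @ H + R
    K = P @ H / S
    xh = xh + K * (u - xh[0] / xh[2])
    P = P - np.outer(K, H @ P)
    errs.append(abs(x_true[0] / x_true[2] - xh[0] / xh[2]))
```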

Journal ArticleDOI
TL;DR: A system that takes a gray level image as input, locates edges with subpixel accuracy, and links them into lines and notes that the zero-crossings obtained from the full resolution image using a space constant σ for the Gaussian, are very similar, but the processing times are very different.
Abstract: We present a system that takes a gray level image as input, locates edges with subpixel accuracy, and links them into lines. Edges are detected by finding zero-crossings in the convolution of the image with Laplacian-of-Gaussian (LoG) masks. The implementation differs markedly from M.I.T.'s as we decompose our masks exactly into a sum of two separable filters instead of the usual approximation by a difference of two Gaussians (DOG). Subpixel accuracy is obtained through the use of the facet model [1]. We also note that the zero-crossings obtained from the full resolution image using a space constant σ for the Gaussian, and those obtained from the 1/n resolution image with 1/n pixel accuracy and a space constant of σ/n for the Gaussian, are very similar, but the processing times are very different. Finally, these edges are grouped into lines using the technique described in [2].
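The exact separable decomposition mentioned above has a simple basis: since the 2-D Gaussian factors as G(x)G(y), the LoG satisfies ∇²G = G''(x)G(y) + G(x)G''(y), i.e. it is exactly a sum of two separable filters. The sketch below verifies this numerically; the image, size, and σ are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve, convolve1d

sigma = 2.0
r = int(4 * sigma)
x = np.arange(-r, r + 1, dtype=float)
g = np.exp(-x ** 2 / (2 * sigma ** 2))
g /= g.sum()
gpp = g * (x ** 2 - sigma ** 2) / sigma ** 4   # second derivative of Gaussian

rng = np.random.default_rng(0)
img = rng.random((64, 64))

# Separable route: two pairs of 1-D convolutions, then a sum.
lap1 = convolve1d(convolve1d(img, gpp, axis=0), g, axis=1)
lap2 = convolve1d(convolve1d(img, g, axis=0), gpp, axis=1)
log_sep = lap1 + lap2

# Direct route: one full 2-D LoG mask.
mask = np.outer(gpp, g) + np.outer(g, gpp)
log_full = convolve(img, mask)
```

The two results agree to machine precision, while the separable route costs O(k) per pixel per pass instead of O(k²) for the full 2-D mask, which is the efficiency argument made in the abstract.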

Journal ArticleDOI
TL;DR: In this paper, the authors propose a theory of damage mechanics for brittle porous solids in multiaxial compression, with the model applied to the case of glass.

Journal ArticleDOI
TL;DR: The focus of this review is on recent developments in topical ocular drug delivery systems relative to their success in overcoming the constraints imposed by the eye and to the improvements that have yet to be made.
Abstract: Existing ocular drug delivery systems are fairly primitive and inefficient, but the stage is set for the rational design of newer and significantly improved systems. The focus of this review is on recent developments in topical ocular drug delivery systems relative to their success in overcoming the constraints imposed by the eye and to the improvements that have yet to be made. In addition, this review attempts to place in perspective the importance of pharmacokinetic modeling, ocular drug pharmacokinetic and bioavailability studies, and choice of animal models in the design and evaluation of these delivery systems. Five future challenges are perceived to confront the field. These are: (a) the extent to which the protective mechanisms of the eye can be safely altered to facilitate drug absorption; (b) delivery of drugs to the posterior portion of the eye from topical dosing; (c) topical delivery of macromolecular drugs, including those derived from biotechnology; (d) improved technology which will ...

Journal ArticleDOI
23 May 1986-Cell
TL;DR: A transgenic mouse strain is constructed in which a mammary tumor virus LTR/c-myc fusion gene is anomalously expressed in a wide variety of tissues, and the deregulated c-myc transgene, now glucocorticoid inducible, contributes to an increased incidence of a variety of tumors.

Journal ArticleDOI
TL;DR: In this article, the adaptive control of a class of large-scale systems formed of an arbitrary interconnection of subsystems with unknown parameters, nonlinearities, and bounded disturbances is investigated.
Abstract: The adaptive control of a class of large-scale systems formed of an arbitrary interconnection of subsystems with unknown parameters, nonlinearities, and bounded disturbances is investigated. It is first shown that no matter how weak the interconnections are, a decentralized adaptive control scheme can become unstable. Approaches are then developed for stabilization and tracking using new decentralized adaptive controllers. In the case where the relative degree n* of the transfer function of each decoupled subsystem is less than or equal to two, sufficient conditions are established which guarantee boundedness and exponential convergence of the state and parameter errors to bounded residual sets. In the absence of disturbances and interconnections, the decentralized adaptive control schemes guarantee exact convergence of the tracking errors to zero. The effectiveness of the proposed adaptive schemes is demonstrated using a simple example.

Journal ArticleDOI
09 May 1986-Science
TL;DR: HIV-1 RNA load testing is sometimes requested to resolve equivocal serologic findings or to facilitate the diagnosis of HIV-1 infection during the acute phase or in a pediatric setting.
Abstract: The human immunodeficiency virus (HIV) is the etiologic agent of AIDS. HIVs are enveloped plus-stranded RNA viruses. The HIV genome is organized similarly to other retroviruses. It contains the gag, pol, and env genes which encode structural proteins, viral enzymes, and envelope glycoproteins, respectively. The major structural proteins which are encoded by the gag gene include p17, p24, p7, and p9. Replication begins with the attachment of virus to the target cell via the interaction of gp120 and the cellular receptor CD4. Both HIV-1 and HIV-2 have the same modes of transmission. The most common mode of HIV infection is sexual transmission at the genital mucosa through direct contact with infected body fluids, including blood, semen, and vaginal secretions. Serological testing for HIV antibody is used for various purposes, including primary diagnosis, screening of blood products, management of untested persons in labor and delivery, evaluation of occupational exposures to blood/body fluid, and epidemiological surveillance. The first generation of HIV antibody assays relied on the detection of antibody to HIV viral protein lysates. A test using a sandwich-capture format and significantly more blood than other methods was more sensitive in early seroconversion. HIV-1 RNA load testing is sometimes requested to resolve equivocal serologic findings or to facilitate the diagnosis of HIV-1 infection during the acute phase or in a pediatric setting.

Journal ArticleDOI
TL;DR: The use of 15N to measure the flux of nitrogen compounds has become increasingly popular as the techniques and instrumentation for stable isotope analysis have become more widely available as discussed by the authors, especially for the research conducted in oligotrophic regions.
Abstract: The use of 15N to measure the flux of nitrogen compounds has become increasingly popular as the techniques and instrumentation for stable isotope analysis have become more widely available. Questions concerning equations for calculating uptake, effect of isotope dilution (in the case of ammonium), duration of incubation, and relationship between disappearance of a nitrogen compound and the 15N uptake measurement have arisen, especially for the research conducted in oligotrophic regions. Fewer problems seem to have occurred in eutrophic areas. However, sufficient literature now exists to allow some generally accepted experimental procedures for 15N studies in eutrophic regions to be laid down. Incubation periods of 2-6 h appear to avoid problems related to isotope dilution and to overcome the bias introduced in some cases by initial high rate or surge uptake. During such incubation periods, assimilation is measured rather than uptake or transport into the cell. Incorporation of 15N into the particulate fraction is usually linear with time over the periods currently used. The 15N method provides a better estimate of incorporation into phytoplankton than 14N disappearance, but a small fraction appears to be lost. Although most workers suggest the loss to be a result of dissolved organic nitrogen production, direct evidence is lacking. If the considerations discussed here are applied with the 15N techniques currently available, reliable estimates of phytoplankton nitrogen flux in eutrophic areas can be obtained.
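For concreteness, the uptake calculation at issue can be written as a small function using the commonly cited Dugdale-Goering formulation, which may differ in detail from any one study discussed here; the input values in the example are made up.

```python
NAT_ABUNDANCE = 0.366   # natural 15N atom percent

def n15_uptake(atpct_pn_final, atpct_dissolved, pn_conc, hours):
    """Specific uptake V (per hour) and transport rate rho (ug-at N per
    liter per hour) from a 15N tracer incubation, in the commonly used
    Dugdale-Goering form: V = (atom% excess in the particulate fraction)
    / (atom% excess of the dissolved substrate * incubation time)."""
    v = (atpct_pn_final - NAT_ABUNDANCE) / (
        (atpct_dissolved - NAT_ABUNDANCE) * hours)
    return v, v * pn_conc

# Illustrative numbers: PN enriched to 1.366 at% after a 4 h incubation
# with the dissolved pool labeled to 10.366 at%, PN = 2.0 ug-at N/liter.
v, rho = n15_uptake(atpct_pn_final=1.366, atpct_dissolved=10.366,
                    pn_conc=2.0, hours=4.0)
```

The incubation-length caveats in the abstract map directly onto the `hours` term: isotope dilution of the dissolved pool during long incubations makes the denominator, and hence V, unreliable.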

Journal ArticleDOI
TL;DR: In this article, a field study of the automobile distribution channel is used to examine how dealers' perceptions of their manufacturers' power are related to the latter's use of coercive and non-coercive power.
Abstract: Data from a field study of the automobile distribution channel are used to examine how dealers’ perceptions of their manufacturers’ power are related to the latter's use of coercive and noncoercive...

Journal ArticleDOI
TL;DR: Criteria for the classification of juvenile rheumatoid arthritis were analyzed in a detailed database of 250 children in order to assess the accuracy of diagnosis and validity of onset types and course subtypes.
Abstract: Criteria for the classification of juvenile rheumatoid arthritis were analyzed in a detailed database of 250 children in order to assess the accuracy of diagnosis and validity of onset types and course subtypes. A number of conclusions have been derived from this study: All definitions of the 1973 criteria for classification of juvenile rheumatoid arthritis should be retained. The addition of onset types to the 1976 revision of the criteria has been validated. The course of the disease after the onset period of 6 months is as important to the outcome of a group of children as is the onset type. The current classification should be broadened to include the course subtypes.

Posted Content
TL;DR: In this paper, the authors examined the impact of underpricing on investor uncertainty and on the investment bankers who take the firms public and found that the greater investor uncertainty in the value of the stock, the greater the underprice is expected to be.
Abstract: Examines the underpricing of initial public offerings (IPOs) and the impact of this underpricing on investor uncertainty and on the investment bankers who take the firms public. The firms going public lack the credibility to assert that the offering price is below the expected market price because they only go public once. As a result, these firms seek the help of investment bankers who, through their underwriting process, take many firms public. Data used in the analysis were collected from 1,028 firms that went public between 1977 and 1982. Support is shown for the proposition that the greater the investor uncertainty in the value of the stock, the greater the underpricing is expected to be. The results further show that investment bankers who cheat on underpricing equilibrium by underpricing too much or too little are penalized by the market. Three conditions must be met for investment bankers to be willing to strive for underpricing equilibrium. These are: (1) uncertainty as to the market price of the stock when it begins trading, (2) reputation capital of the investment banker that cannot be repaired, and (3) decline in return on reputation capital if investment banker cheats on underpricing. Given these results, it becomes evident that investment bankers enforce the underpricing equilibrium.

Journal ArticleDOI
TL;DR: In this paper, a typology of moderators based on the mechanisms by which moderators operate is proposed and applied to several contingency theories of leadership and implications for future leadership research and practice are discussed.
Abstract: Much recent research on leadership has concerned moderator (contingency) variables. This research has yielded equivocal and/or conflicting results. Conceptually distinct variables have been treated as if they operate in the same fashion. This paper suggests a typology of moderators based on the mechanisms by which moderators operate. Moderators are classified as neutralizers/enhancers, substitutes/supplements, or mediators depending on how they affect leader behavior-criterion relationships. The typology is applied to several contingency theories of leadership and implications for future leadership research and practice are discussed.

Journal ArticleDOI
TL;DR: This article proposes an integrated product classification scheme that adds “preference” products to the conventional convenience, shopping, and specialty categories in terms of the effort and risk dimensions of price.
Abstract: This article proposes an integrated product classification scheme. It is argued that, in view of the 1985 definition of marketing, one classification for all products—goods, services, and ideas—is ...

Journal ArticleDOI
01 Jan 1986-Geology
TL;DR: In this paper, a trace-fossil tiering model was proposed to estimate changes in the degree of paleo-oxygenation of bottom waters recorded in fine-grained pelagic strata.
Abstract: Recognition of fluctuations in the degree of paleo-oxygenation of bottom waters recorded in fine-grained pelagic strata is important for interpretation of paleoceanographic and paleoclimatologic conditions. General sedimentary fabric, composition of trace-fossil assemblages, and burrow size and crosscutting relationships have been incorporated into a trace-fossil tiering model that permits detailed reconstruction of changes in paleo-oxygenation of bottom waters. Applications of this model to the Miocene Monterey Formation (California) and the Cretaceous Niobrara Formation (Colorado) indicate that the ichnologic approach is more sensitive to both magnitude and rates of change in oxygenation levels compared to macrobenthic body-fossil information.

Journal ArticleDOI
TL;DR: Examining two natural texts, it is seen that the relational propositions involve every clause, and that they occur in a pattern of propositions which connects all of the clauses together.
Abstract: In addition to the propositions represented explicitly by independent clauses in a text, there are almost as many propositions, here called relational propositions, which arise (often implicitly) out of combinations of these clauses. The predicates of these propositions are members of a small set of general, highly recurrent relational predicates, such as “cause,” “justification,” and “solutionhood.” Often unsignalled, these relational propositions can be shown to be the basis for various kinds of inferences and to function as elements of communicative acts. Examining two natural texts, we see that the relational propositions involve every clause, and that they occur in a pattern of propositions which connects all of the clauses together. This examination also shows how the relational propositions are essential to the functioning of the text.

Journal ArticleDOI
TL;DR: In this article, the authors compute the dynamic response of a rigid foundation to spatially varying ground motion characterized by a particular spatial coherence function, and show that spatially random ground motion produces effects similar to deterministic wave-passage effects.
Abstract: A method is presented for obtaining the dynamic response of an extended rigid foundation resting on an elastic half-space and subjected to spatially varying ground motion that includes both deterministic and random effects. Numerical results are described for a rigid square foundation and for ground motion characterized by a particular spatial coherence function. The results indicate that spatially random ground motion produces effects similar to the deterministic effects of wave passage, including a reduction of the translational components of the response at high frequencies and the generation of rocking.

Journal ArticleDOI
TL;DR: The distribution of population and employment in metropolitan Los Angeles in 1970 and 1980 is examined in this article, where the geographical distribution of employment combined job clustering around a few major employment centers with a high degree of general job dispersion.
Abstract: The distribution of population and employment in metropolitan Los Angeles in 1970 and 1980 is examined in this paper. Population continued to disperse in the 1970s, whereas the geographical distribution of employment combined job clustering around a few major employment centers with a high degree of general job dispersion. In Los Angeles polycentrism has been associated with shorter work trips, particularly intracounty trips in the more peripheral counties.

Journal ArticleDOI
TL;DR: In this paper, the authors examine possible connections between the two technologies and discuss some issues related to their integration, and propose a method to integrate expert systems with decision support systems, which may enhance the quality and efficiency of both computerized systems.
Abstract: Expert systems are emerging as a powerful tool for decision making. Integrating expert systems with decision support systems may enhance the quality and efficiency of both computerized systems. This article examines possible connections between the two technologies and discusses some issues related to their integration.