
Showing papers by "Stanford University" published in 1980


Journal ArticleDOI
TL;DR: An efficient and intuitive algorithm is presented for the design of vector quantizers based either on a known probabilistic model or on a long training sequence of data.
Abstract: An efficient and intuitive algorithm is presented for the design of vector quantizers based either on a known probabilistic model or on a long training sequence of data. The basic properties of the algorithm are discussed and demonstrated by examples. Quite general distortion measures and long blocklengths are allowed, as exemplified by the design of parameter vector quantizers of ten-dimensional vectors arising in Linear Predictive Coded (LPC) speech compression with a complicated distortion measure arising in LPC analysis that does not depend only on the error vector.
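
As a rough illustration of the kind of iterative design described in the abstract, the sketch below implements the familiar two-step iteration (nearest-neighbor partition, then centroid update) for the squared-error case on a training sequence. The paper itself allows far more general distortion measures; the initialization and parameter values here are arbitrary placeholders.

import numpy as np

def design_vq(training, codebook_size, iters=50, seed=0):
    # Alternate nearest-neighbor partitioning and centroid updates
    # (generalized Lloyd iteration, squared-error distortion).
    rng = np.random.default_rng(seed)
    codebook = training[rng.choice(len(training), size=codebook_size, replace=False)].copy()
    for _ in range(iters):
        # Partition: assign each training vector to its nearest codeword.
        d2 = ((training[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        assign = d2.argmin(axis=1)
        # Update: replace each codeword by the centroid of its cell.
        for j in range(codebook_size):
            cell = training[assign == j]
            if len(cell):
                codebook[j] = cell.mean(axis=0)
    return codebook

# Example: a 16-entry codebook for 10-dimensional training vectors.
train = np.random.default_rng(1).normal(size=(2000, 10))
codebook = design_vq(train, codebook_size=16)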

7,935 citations


Journal ArticleDOI
TL;DR: A structure for representation of patient outcome is presented, together with a method for outcome measurement and validation of the technique in rheumatoid arthritis, and these techniques appear extremely useful for evaluation of long term outcome of patients with rheumatic diseases.
Abstract: A structure for representation of patient outcome is presented, together with a method for outcome measurement and validation of the technique in rheumatoid arthritis. The paradigm represents outcome by five separate dimensions: death, discomfort, disability, drug (therapeutic) toxicity, and dollar cost. Each dimension represents an outcome directly related to patient welfare. Quantitation of these outcome dimensions may be performed at interview or by patient questionnaire. With standardized, validated questions, similar scores are achieved by both methods. The questionnaire technique is preferred since it is inexpensive and does not require interobserver validation. These techniques appear extremely useful for evaluation of long term outcome of patients with rheumatic diseases.

4,253 citations


Journal ArticleDOI
TL;DR: The average age at first infirmity can be raised, thereby making the morbidity curve more rectangular, and present data allow calculation of the ideal average life span, approximately 85 years.
Abstract: The average length of life has risen from 47 to 73 years in this century, but the maximum life span has not increased. Therefore, survival curves have assumed an ever more rectangular form. Eighty per cent of the years of life lost to nontraumatic, premature death have been eliminated, and most premature deaths are now due to the chronic diseases of the later years. Present data allow calculation of the ideal average life span, approximately 85 years. Chronic illness may presumably be postponed by changes in life style, and it has been shown that the physiologic and psychologic markers of aging may be modified. Thus, the average age at first infirmity can be raised, thereby making the morbidity curve more rectangular. Extension of adult vigor far into a fixed life span compresses the period of senescence near the end of life. Health-research strategies to improve the quality of life require careful study of the variability of the phenomena of aging and how they may be modified.

3,007 citations


Journal ArticleDOI
TL;DR: Plasmid cloning vectors that enable insertion of DNA fragments between the inducible ara (arabinose) promoter and the lac (lactose) structural genes have been constructed and used for the detection and analysis of signals that control gene transcription.

2,442 citations


Journal ArticleDOI
TL;DR: In this article, a systematic analysis in perturbative quantum chromodynamics (QCD) of large-momentum-transfer exclusive processes is presented, where the scaling behavior, angular dependence, helicity structure, and normalization of elastic and inelastic form factors and large-angle exclusive scattering amplitudes for hadrons and photons are given.
Abstract: We present a systematic analysis in perturbative quantum chromodynamics (QCD) of large-momentum-transfer exclusive processes. Predictions are given for the scaling behavior, angular dependence, helicity structure, and normalization of elastic and inelastic form factors and large-angle exclusive scattering amplitudes for hadrons and photons. We prove that these reactions are dominated by quark and gluon subprocesses at short distances, and thus that the dimensional-counting rules for the power-law falloff of these amplitudes with momentum transfer are rigorous predictions of QCD, modulo calculable logarithmic corrections from the behavior of the hadronic wave functions at short distances. These anomalous-dimension corrections are determined by evolution equations for process-independent meson and baryon "distribution amplitudes" $\varphi(x_i, Q)$ which control the valence-quark distributions in high-momentum-transfer exclusive reactions. The analysis can be carried out systematically in powers of $\alpha_s(Q^2)$, the QCD running coupling constant. Although the calculations are most conveniently carried out using light-cone perturbation theory and the light-cone gauge, we also present a gauge-independent analysis and relate the distribution amplitude to a gauge-invariant Bethe-Salpeter amplitude.
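
For orientation, the dimensional-counting behavior referred to in the abstract can be written schematically as below; this is the standard textbook form, with the logarithmic (anomalous-dimension) corrections suppressed, not a transcription of the paper's detailed results.

% Dimensional-counting power law for an exclusive form factor of a hadron H
% with n_H valence constituents (logarithmic corrections omitted):
F_H(Q^2) \;\sim\; \left(\frac{1}{Q^2}\right)^{n_H - 1}
% e.g. n_\pi = 2 gives F_\pi \sim 1/Q^2, and n_N = 3 gives a nucleon form
% factor falling like 1/Q^4.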

2,239 citations


Journal ArticleDOI
TL;DR: The authors formalize such conjectural reasoning and show that the objects they can determine to have certain properties or relations are the only objects that do, an assumption commonly made both by humans and by intelligent computer programs.
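
The one-sentence summary describes a minimization rule of the kind formalized by circumscription; a standard way of writing such a schema is sketched below for orientation (illustrative notation, not copied from the paper): for a sentence A(P) about a predicate P and any predicate expression \Phi,

A(\Phi) \;\wedge\; \forall \bar{x}\,\bigl(\Phi(\bar{x}) \rightarrow P(\bar{x})\bigr)
\;\rightarrow\; \forall \bar{x}\,\bigl(P(\bar{x}) \rightarrow \Phi(\bar{x})\bigr)
% i.e. the objects that can be shown to satisfy P are taken to be the only
% objects that satisfy P.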

1,892 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined elements of an efficiency-based theory of the multiproduct firm and extended the theoretical framework developed by Williamson to explain vertical integration to explain diversification.
Abstract: This paper examines elements of an efficiency-based theory of the multiproduct firm. The theoretical framework developed by Williamson to explain vertical integration is extended to explain diversification. The proposition is advanced that a cost function displaying economies of scope has no direct implications for the scope of the business enterprise. However, if economies of scope are based upon the common and recurrent use of proprietary knowhow or the common and recurrent use of a specialized and indivisible physical asset, then multiproduct enterprise (diversification) is an efficient way of organizing economic activity. These propositions are first developed in a general context and then examined in the context of diversification in the U.S. petroleum industry.
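
For readers unfamiliar with the term, the textbook definition of economies of scope that the abstract's proposition turns on can be written as follows (two-output case, illustrative notation):

% A cost function C displays economies of scope over outputs q_1 and q_2 when
C(q_1, q_2) \;<\; C(q_1, 0) + C(0, q_2)
% i.e. joint production is cheaper than splitting the outputs across separate
% firms; the paper argues this alone does not determine the firm's scope.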

1,780 citations


Journal ArticleDOI
TL;DR: (Summary not extracted; the listing captured only author-affiliation text: Enrico Fermi Institute and Department of Physics, The University of Chicago, Chicago, Illinois, USA; Peter B. Gilkey, Fine Hall, Box 37.)

1,514 citations


Journal ArticleDOI
TL;DR: In this article, the ground state of higher energy is modeled as a false vacuum rendered unstable by barrier penetration, and the effect of gravitation on the decay process is considered.
Abstract: It is possible for a classical field theory to have two stable homogeneous ground states, only one of which is an absolute energy minimum. In the quantum version of the theory, the ground state of higher energy is a false vacuum, rendered unstable by barrier penetration. There exists a well-established semiclassical theory of the decay of such false vacuums. In this paper, we extend this theory to include the effects of gravitation. Contrary to naive expectation, these are not always negligible, and may sometimes be of critical importance, especially in the late stages of the decay process.
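
The "well-established semiclassical theory" the abstract refers to expresses the decay rate per unit volume in the standard form below; the paper's contribution is the gravitational correction to the exponent B, which is not reproduced here.

% Semiclassical false-vacuum decay rate per unit volume (flat-space form):
\frac{\Gamma}{V} \;=\; A\, e^{-B/\hbar}\,\bigl[1 + O(\hbar)\bigr],
\qquad
B \;=\; S_E[\text{bounce}] - S_E[\text{false vacuum}]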

1,467 citations


Proceedings ArticleDOI
J. Salisbury
01 Dec 1980
TL;DR: In this article, a method of actively controlling the apparent stiffness of a manipulator end effector is presented, which allows the programmer to specify the three translational and three rotational stiffness properties of a frame located arbitrarily in hand coordinates.
Abstract: A method of actively controlling the apparent stiffness of a manipulator end effector is presented. The approach allows the programmer to specify the three translational and three rotational stiffnesses of a frame located arbitrarily in hand coordinates. Control of the nominal position of the hand then permits simultaneous position and force control. Stiffness may be changed under program control to match varying task requirements. A rapid servo algorithm is made possible by transformation of the problem into joint space at run time. Application examples are given.
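
A minimal sketch of the general idea (not Salisbury's servo implementation): a commanded 6x6 Cartesian stiffness acting on a small hand-frame displacement is mapped into joint torques, or into a joint-space stiffness, through the manipulator Jacobian. The Jacobian and all numerical values below are placeholders.

import numpy as np

def joint_space_stiffness(J, Kx):
    # Map a 6x6 Cartesian stiffness (3 translational, 3 rotational terms)
    # into joint space through the manipulator Jacobian: K_theta = J^T K_x J.
    return J.T @ Kx @ J

def stiffness_torques(J, Kx, dx):
    # Joint torques realizing the commanded Cartesian stiffness for a small
    # hand-frame displacement dx = (d_position, d_orientation).
    return J.T @ (Kx @ dx)

# Illustrative numbers only: stiff in translation, compliant in rotation.
J = np.eye(6)                                          # placeholder Jacobian
Kx = np.diag([2000.0, 2000.0, 2000.0, 5.0, 5.0, 5.0])  # N/m and N*m/rad
dx = np.array([0.001, 0.0, 0.0, 0.0, 0.0, 0.02])       # m and rad
print(joint_space_stiffness(J, Kx))
print(stiffness_torques(J, Kx, dx))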

1,212 citations


Journal ArticleDOI
TL;DR: In this paper, the small-signal a-c impedance of the cell Li | LiAsF6 (0.75M) in propylene carbonate | LiyWO3 thin film on tin-oxide-covered glass substrate was measured at room temperature as a function of frequency from 5 × 10^-4 Hz to 5 × 10^5 Hz at various open-circuit voltages.
Abstract: The small-signal a-c impedance of the cell Li | LiAsF6 (0.75M) in propylene carbonate | LiyWO3 thin film on tin-oxide-covered glass substrate has been measured at room temperature as a function of frequency from 5 × 10^-4 Hz to 5 × 10^5 Hz at various open-circuit voltages. The diffusion equations have been solved for the appropriate finite boundary conditions, and analysis of the impedance data by the complex-plane method yields values for the chemical diffusion coefficient, the component diffusion coefficient, the partial ionic conductivity of lithium, and the thermodynamic enhancement factor for LiyWO3 as a function of y. The films of WO3 were prepared by vacuum evaporation and were largely amorphous to x-rays. The chemical diffusion coefficient has a value of 2.4 × 10^-12 cm2/sec at y = 0.1, increasing to 2.8 × 10^-11 cm2/sec at y = 0.26. At short times (t ~ 0.5 sec) the interfacial charge-transfer reaction is important, but at longer times the rate of lithium injection is determined by the diffusion kinetics.
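
A sketch of the kind of finite-boundary diffusion impedance used in this sort of complex-plane analysis is given below. The blocked-boundary (finite-space Warburg) expression is the standard one; the film thickness, characteristic time, and the order-of-magnitude estimate of D are illustrative assumptions, not the paper's fitted values.

import numpy as np

def finite_space_warburg(omega, R_d, tau):
    # Diffusion into a film with a blocking back contact:
    #   Z(omega) = R_d * coth(sqrt(j*omega*tau)) / sqrt(j*omega*tau), tau = L^2 / D.
    # High frequency: 45-degree Warburg line; low frequency: capacitive limit.
    s = np.sqrt(1j * omega * tau)
    return R_d / (s * np.tanh(s))

omega = np.logspace(-4, 5, 400) * 2 * np.pi            # rad/s, spanning the measured range
Z = finite_space_warburg(omega, R_d=50.0, tau=30.0)    # illustrative R_d and tau

# Order-of-magnitude estimate of the chemical diffusion coefficient from the
# frequency where the response turns capacitive (omega* ~ D / L^2):
L = 1.0e-5              # film thickness in cm (assumed value)
omega_star = 1.0 / 30.0 # rad/s (assumed transition frequency)
D_estimate = omega_star * L ** 2   # cm^2/sec, order of magnitude only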

Journal ArticleDOI
TL;DR: In this paper, a linear polarizer or Brewster plate is placed inside the reference cavity, so that the reflected light acquires a frequency-dependent elliptical polarization, which can provide the error signal for electronic frequency stabilization without any need for modulation techniques.

Journal ArticleDOI
TL;DR: As discussed by the authors, a status organizing process is any process in which evaluations of and beliefs about the characteristics of actors become the basis of observable inequalities in face-to-face social interaction; the key concept is the status characteristic, any characteristic of actors around which such evaluations and beliefs come to be organized, such as age, sex, race, ethnicity, education, occupation, physical attractiveness, or intelligence quotient.
Abstract: This chapter reviews theory and research on status organizing processes. A status organizing process is any process in which evaluations of and beliefs about the characteristics of actors become the basis of observable inequalities in face-to-face social interaction. The key concept in the study of status organizing processes is the status characteristic, any characteristic of actors around which evaluations of and beliefs about them come to be organized. Examples include age, sex, race, ethnicity, education, occupation, physical attractiveness, intelligence quotients, and reading ability, but there are many others. In the present article we review (a) the current state of the theory of such processes; (b) relevant theoretical research as of September 1979; (c) a selection of the relevant applied research, with particular reference to sex, race, and physical attractiveness; and (d) some of the interventions that have been developed to reduce undesired consequences of the process. The phenomenon with which a theory of status organizing processes is concerned is most commonly observed in the study of problem-solving groups whose members differ in status characteristics significant in the larger society. Such groups do not create a social organization de novo, out of the interaction of their members, but instead maintain external status differences inside the group. That informal problem-solving groups evolve inequalities in participation, evaluation, and influence was shown by Bales in the early fifties.

Journal ArticleDOI
12 Dec 1980-Science
TL;DR: Two- to threefold variations in sleep length were observed in 12 subjects living on self-selected schedules in an environment free of time cues and the duration of polygraphically recorded sleep episodes was highly correlated with the circadian phase of the body temperature rhythm at bedtime.
Abstract: Two- to threefold variations in sleep length were observed in 12 subjects living on self-selected schedules in an environment free of time cues. The duration of polygraphically recorded sleep episodes was highly correlated with the circadian phase of the body temperature rhythm at bedtime and not with the length of prior wakefulness. Furthermore, the rate of REM (rapid eye movement) sleep accumulation, REM latency, bedtime selection, and self-rated alertness assessments were also correlated with the body temperature rhythm.

Journal ArticleDOI
TL;DR: This paper presents an approach to hierarchical planning, termed constraint posting, that uses constraints to represent the interactions between subproblems and is illustrated with a computer program that plans gene-cloning experiments in molecular genetics.

Journal ArticleDOI
TL;DR: This article derived a relationship between price changes and earnings changes by expanding the information upon which earnings expectations are conditioned to include data other than prior earnings history, and used price as a surrogate for additional information available to market participants.

Book ChapterDOI
01 Jan 1980
TL;DR: The problem of "solving" equations, the problem of proving termination of sets of rewrite rules, and the decidability and complexity of word problems and of combinations of equational theories are discussed.
Abstract: Equations occur frequently in mathematics, logic and computer science. In this paper, we survey the main results concerning equations, and the methods available for reasoning about them and computing with them. The survey is self-contained and unified, using traditional abstract algebra. Reasoning about equations may involve deciding if an equation follows from a given set of equations (axioms), or if an equation is true in a given theory. When used in this manner, equations state properties that hold between objects. Equations may also be used as definitions; this use is well known in computer science: programs written in applicative languages, abstract interpreter definitions, and algebraic data type definitions are clearly of this nature. When these equations are regarded as oriented "rewrite rules," we may actually use them to compute. In addition to covering these topics, we discuss the problem of "solving" equations (the "unification" problem), the problem of proving termination of sets of rewrite rules, and the decidability and complexity of word problems and of combinations of equational theories. We restrict ourselves to first-order equations, and do not treat equations which define non-terminating computations or recent work on rewrite rules applied to equational congruence classes.
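
To make the "oriented rewrite rules" idea concrete, here is a small sketch of innermost rewriting to normal form over terms encoded as nested tuples. The rule set and term syntax are invented for illustration and are far simpler than the systems the survey treats.

# Terms are nested tuples whose first element is the operator, e.g. ('+', 'a', '0');
# rule variables are strings starting with '?'.

def match(pattern, term, subst=None):
    # Return a substitution making pattern equal to term, or None.
    subst = dict(subst or {})
    if isinstance(pattern, str) and pattern.startswith('?'):
        if pattern in subst and subst[pattern] != term:
            return None
        subst[pattern] = term
        return subst
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term) and pattern[0] == term[0]:
        for p, t in zip(pattern[1:], term[1:]):
            subst = match(p, t, subst)
            if subst is None:
                return None
        return subst
    return subst if pattern == term else None

def substitute(term, subst):
    if isinstance(term, str):
        return subst.get(term, term)
    return (term[0],) + tuple(substitute(a, subst) for a in term[1:])

def rewrite(term, rules):
    # Apply oriented rules innermost-first until a normal form is reached.
    if isinstance(term, tuple):
        term = (term[0],) + tuple(rewrite(a, rules) for a in term[1:])
    for lhs, rhs in rules:
        s = match(lhs, term)
        if s is not None:
            return rewrite(substitute(rhs, s), rules)
    return term

# Monoid-style identity rules: 0 + x -> x and x + 0 -> x.
rules = [(('+', '0', '?x'), '?x'), (('+', '?x', '0'), '?x')]
print(rewrite(('+', '0', ('+', ('+', 'a', '0'), '0')), rules))  # -> 'a'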


Journal ArticleDOI
TL;DR: A probabilistic method is presented which cryptanalyzes any N key cryptosystem in N^{2/3} operations with N^{2/3} words of memory after a precomputation which requires N operations; the method works in a chosen plaintext attack and can also be used in a ciphertext-only attack.
Abstract: A probabilistic method is presented which cryptanalyzes any N key cryptosystem in N^{2/3} operations with N^{2/3} words of memory (average values) after a precomputation which requires N operations. If the precomputation can be performed in a reasonable time period (e.g., several years), the additional computation required to recover each key compares very favorably with the N operations required by an exhaustive search and the N words of memory required by table lookup. When applied to the Data Encryption Standard (DES) used in block mode, it indicates that solutions should cost between $1 and $100 each. The method works in a chosen plaintext attack and, if cipher block chaining is not used, can also be used in a ciphertext-only attack.
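
A toy sketch of the chain-based precomputation behind such a time-memory trade-off is shown below (a single table, a hash standing in for the block cipher, no distinguished points or multiple reduction functions); the analysis in the paper balances the number of chains m and the chain length t so that online time and memory each scale roughly like N^(2/3). All parameter values here are illustrative only.

import hashlib

def f(key):
    # Toy one-way step standing in for "encrypt a fixed plaintext under key,
    # then reduce the ciphertext back to the key space".
    return hashlib.sha256(key + b'fixed-plaintext').digest()[:4]

def build_table(starts, t):
    # Precomputation: iterate f for t steps from each start key and store only
    # endpoint -> start.  Memory is ~m entries while m*t keys are covered.
    table = {}
    for s in starts:
        k = s
        for _ in range(t):
            k = f(k)
        table[k] = s
    return table

def recover(table, target, t):
    # Online phase: walk the target forward; when an endpoint is hit, replay
    # that chain from its start to look for a preimage of the target.
    k = target
    for _ in range(t + 1):
        if k in table:
            c = table[k]
            for _ in range(t):
                if f(c) == target:
                    return c        # candidate key (false alarms are possible)
                c = f(c)
        k = f(k)
    return None

starts = [bytes([i, 0, 0, 0]) for i in range(64)]
table = build_table(starts, t=64)
secret = bytes([7, 0, 0, 0])
print(recover(table, f(secret), t=64))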

Journal ArticleDOI
19 Sep 1980-Science
TL;DR: Transfection of cultured monkey kidney cells with recombinant DNA constructed with a cloned Escherichia coli gene that codes for xanthine-guanine phosphoribosyltransferase and several different SV40 DNA-based vectors results in the synthesis of readily measurable quantities of the bacterial enzyme.
Abstract: Transfection of cultured monkey kidney cells with recombinant DNA constructed with a cloned Escherichia coli gene that codes for xanthine-guanine phosphoribosyltransferase and several different SV40 DNA-based vectors results in the synthesis of readily measurable quantities of the bacterial enzyme. Moreover, the physiological defect in purine nucleotide synthesis characteristic of human Lesch-Nyhan cells can be overcome by the introduction of the bacterial gene into these cells.

Journal ArticleDOI
24 Oct 1980-Science
TL;DR: Applications to perceptual and semantic data illustrate how complementary aspects of the underlying psychological structure are revealed by different types of representations, including multidimensional spatial configurations and nondimensional tree-structures or clusterings.
Abstract: American mathematical psychologists have developed computer-based methods for constructing representations of the psychological structure of a set of stimuli on the basis of pairwise measures of similarity or confusability. Applications to perceptual and semantic data illustrate how complementary aspects of the underlying psychological structure are revealed by different types of representations, including multidimensional spatial configurations and nondimensional tree-structures or clusterings.
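
As one concrete instance of such a representation-construction method, the sketch below performs classical (metric) multidimensional scaling by double-centering a dissimilarity matrix and taking its leading eigenvectors. The paper surveys a broader family, including nonmetric scaling and tree/cluster representations, and the data here are made up.

import numpy as np

def classical_mds(D, dims=2):
    # Embed n objects so that pairwise distances approximate the
    # dissimilarities in D (symmetric n x n matrix with zero diagonal).
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims]      # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Four stimuli with hand-made dissimilarities (illustrative data only).
D = np.array([[0, 1, 4, 5],
              [1, 0, 3, 4],
              [4, 3, 0, 1],
              [5, 4, 1, 0]], dtype=float)
coords = classical_mds(D)
print(coords)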

Journal ArticleDOI
TL;DR: In this article, a set of studies tested the explanatory and predictive generality of self-efficacy theory across additional treatment modalities and behavioral domains, including symbolic modeling, for the treatment of agoraphobia.
Abstract: The present set of studies tested the explanatory and predictive generality of self-efficacy theory across additional treatment modalities and behavioral domains. Microanalysis of changes accompanying symbolic modeling indicates that this mode of treatment enhances coping behavior partly through its effects on perceived efficacy. Cognizing modeled mastery of threats increased phobics' self-percepts of efficacy, which, in turn, predicted their specific performance attainments on tasks of varying threat value. Examination of efficacy probes revealed that making efficacy judgments has no effect on subsequent avoidance behavior or on fear arousal. The close congruence found between changes in self-efficacy and different forms of coping behavior in the treatment of agoraphobia provides some evidence for the generality of efficacy theory across different areas of functioning. Microanalysis of anticipatory and performance fear arousal accompanying varying strengths of self-efficacy also lends support for the social learning conception of fear arousal in terms of perceived coping inefficacy.

Journal ArticleDOI
TL;DR: In this article, a deductive approach to program synthesis is presented for the construction of recursive programs, which regards program synthesis as a theorem-proving task and relies on a theorem-proving method that combines the features of transformation rules, unification and mathematical induction within a single framework.
Abstract: Program synthesis is the systematic derivation of a program from a given specification. A deductive approach to program synthesis is presented for the construction of recursive programs. This approach regards program synthesis as a theorem-proving task and relies on a theorem-proving method that combines the features of transformation rules, unification, and mathematical induction within a single framework.
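
Of the three ingredients named in the abstract, unification is the most compact to illustrate. The sketch below implements ordinary first-order unification over tuple-encoded terms; transformation rules and induction are not shown, and the term syntax is invented for the example.

# Variables are strings starting with '?'; compound terms are tuples whose
# first element is the function symbol.

def occurs(var, term):
    if term == var:
        return True
    return isinstance(term, tuple) and any(occurs(var, a) for a in term[1:])

def walk(term, subst):
    # Resolve a variable through the substitution chain.
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst=None):
    # Return a most general unifier of a and b, or None if none exists.
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith('?'):
        return None if occurs(a, b) else {**subst, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return None if occurs(b, a) else {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# unify rev(cons(?x, nil)) with rev(cons(a, ?y))  ->  {'?x': 'a', '?y': 'nil'}
print(unify(('rev', ('cons', '?x', 'nil')), ('rev', ('cons', 'a', '?y'))))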

Posted Content
TL;DR: In this paper, a structural life cycle model of labor supply is proposed to predict the response of hours of work to life cycle wage growth and shifts in the lifetime wage path, using theoretical characterizations derived from an economic model of life cycle behavior.
Abstract: This paper formulates and estimates a structural life cycle model of labor supply. Using theoretical characterizations derived from an economic model of life cycle behavior, a two-stage empirical analysis yields estimates of intertemporal and uncompensated substitution effects which provide the information needed to predict the response of hours of work to life cycle wage growth and shifts in the lifetime wage path. The empirical model developed here provides a natural framework for interpreting estimates found in other work on this topic. It also indicates how cross section specifications of hours of work can be modified to estimate parameters relevant for describing labor supply behavior in a lifetime setting.
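
For orientation, a stylized version of the kind of lambda-constant (Frisch) labor-supply equation estimated in this literature is written below; the notation is illustrative and is not the paper's exact specification.

% h_{it}: hours of work, w_{it}: wage; F_i is an individual effect absorbing the
% marginal utility of lifetime wealth; \delta is the intertemporal substitution
% elasticity governing the response of hours to life cycle wage growth.
\ln h_{it} \;=\; F_i \;+\; b\,t \;+\; \delta \,\ln w_{it} \;+\; \varepsilon_{it}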

Journal ArticleDOI
David Rogosa
TL;DR: This paper showed that cross-lagged correlation is not a useful procedure for the analysis of longitudinal panel data and that the difference between CLCs is not a sound basis for causal inference.
Abstract: Comments that cross-lagged correlation (CLC) is not a useful procedure for the analysis of longitudinal panel data. In particular, the difference between CLCs is not a sound basis for causal inference. Demonstrations of the failure of CLC are based mainly on results for the 2-wave, 2-variable longitudinal design.
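
The quantity at issue is the difference of the two cross-lagged correlations for variables X and Y measured at waves 1 and 2; the paper argues that inferring causal priority from the sign of this difference is unsound.

% Cross-lagged correlation comparison in the 2-wave, 2-variable design:
\rho_{X_1 Y_2} \;-\; \rho_{Y_1 X_2}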

Journal ArticleDOI
TL;DR: It is proposed that the classificatory scheme is still useful in the assessment of contemporary emphases in psychology, such as the present prominence of cognitive psychology to the relative neglect of affection and conation.
Abstract: The tripartite classification of mental activities into cognition, affection, and conation originated in the German faculty psychology of the eighteenth century, but was adopted by the association psychologists of the nineteenth century of Scotland, England, and America. Its influence extended into the twentieth century through the writings of William McDougall. It is proposed that the classificatory scheme is still useful in the assessment of contemporary emphases in psychology, such as the present prominence of cognitive psychology to the relative neglect of affection and conation.


Journal ArticleDOI
TL;DR: In this article, a function f(x) defined on X = X_1 × X_2 × ⋯ × X_n, where each X_i is totally ordered, satisfying f(x ∨ y) f(x ∧ y) ≥ f(x) f(y), where the lattice operations ∨ and ∧ refer to the usual ordering on X, is said to be multivariate totally positive of order 2 (MTP2).
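
Written out, the defining inequality and its two-variable specialization are as follows (standard notation; the bivariate remark is the usual TP2 condition, added here for orientation).

% MTP_2 condition, with \vee and \wedge the componentwise maximum and minimum
% on X = X_1 \times \dots \times X_n:
f(x \vee y)\, f(x \wedge y) \;\ge\; f(x)\, f(y) \qquad \text{for all } x, y \in X
% For n = 2 this reduces to ordinary TP_2: whenever x_1 \le y_1 and x_2 \le y_2,
% f(x_1, x_2)\, f(y_1, y_2) \ge f(x_1, y_2)\, f(y_1, x_2).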

Journal ArticleDOI
TL;DR: In this paper, an extensive set of experimental results on the behavior of electron surface mobility in thermally oxidized silicon structures are presented, which allow the calculation of electron mobility under a wide variety of substrate, process, and electrical conditions.
Abstract: Accurate modeling of MOS devices requires quantitative knowledge of carrier mobilities in surface inversion and accumulation layers. Optimization of device structures and accurate circuit simulation, particularly as technologies push toward fundamental limits, necessitate an understanding of how impurity doping levels, oxide charge densities, process techniques, and applied electric fields affect carrier surface mobilities. It is the purpose of this paper to present an extensive set of experimental results on the behavior of electron surface mobility in thermally oxidized silicon structures. Empirical equations are developed which allow the calculation of electron mobility under a wide variety of substrate, process, and electrical conditions. The experimental results are interpreted in terms of the dominant physical mechanisms responsible for mobility degradation at the Si/SiO2 interface. From the observed effects of process parameters on mobility roll-off under high vertical fields, conclusions are drawn about optimum process conditions for maximizing mobility. The implications of this work for performance limits of several types of MOS devices are described.
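
As a rough illustration of the kind of empirical mobility equation the abstract refers to, the sketch below uses a generic effective-field roll-off; the functional form, constants, and charge values are placeholders, not the paper's fitted equations.

import numpy as np

def effective_mobility(E_eff, mu0=670.0, E0=6.7e5, nu=1.6):
    # Generic empirical roll-off of electron surface mobility with the
    # effective vertical field in an inverted Si surface (cm^2/V*s, V/cm).
    # The form and constants are illustrative placeholders only.
    return mu0 / (1.0 + (E_eff / E0) ** nu)

# Effective field from depletion and inversion charge (a common definition):
#   E_eff = (Q_dep + Q_inv / 2) / eps_si
eps_si = 1.04e-12               # F/cm
Q_dep, Q_inv = 1.0e-7, 5.0e-7   # C/cm^2, illustrative values
E_eff = (Q_dep + 0.5 * Q_inv) / eps_si
print(effective_mobility(E_eff))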

Journal ArticleDOI
01 Oct 1980-Genetics
TL;DR: The association of alleles among different loci was studied in natural populations of Hordeum spontaneum, the evolutionary progenitor of cultivated barley, and the variance of the number of heterozygous loci in two randomly chosen gametes affords a useful measure of such association.
Abstract: The association of alleles among different loci was studied in natural populations of Hordeum spontaneum, the evolutionary progenitor of cultivated barley. The variance of the number of heterozygous loci in two randomly chosen gametes affords a useful measure of such association. The behavior of this statistic in several particular models is described. Generally, linkage (gametic phase) disequilibrium tends to increase the variance above the value expected under complete independence. This increase is greatest when disequilibria are such as to maximize the sum of squares of the two-locus gametic frequencies.-When data on several loci per individual are available, the observed variance may be tested for its agreement with that expected under the hypothesis of complete interlocus independence, using the sampling theory of this model. When applied to allozyme data from 26 polymorphic populations of wild barley, this test demonstrated the presence of geographically widespread multilocus organization. On average, the variance was 80% higher than expected under random association. Gametic frequencies for four esterase loci in both of these populations of wild barley and two composite crosses of cultivated barley were analyzed. Most generations of the composites showed less multilocus structure, as measured by the indices of association, than the wild populations.