
Showing papers by "University of Illinois at Urbana–Champaign published in 1990"


Journal ArticleDOI
TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under the other assumption and gives the same results, although perhaps not in the same time.

14,937 citations


Journal ArticleDOI
TL;DR: The newly proposed criteria for the classification of fibromyalgia are 1) widespread pain in combination with 2) tenderness at 11 or more of the 18 specific tender point sites; no exclusions are made for the presence of concomitant radiographic or laboratory abnormalities.
Abstract: To develop criteria for the classification of fibromyalgia, we studied 558 consecutive patients: 293 patients with fibromyalgia and 265 control patients. Interviews and examinations were performed by trained, blinded assessors. Control patients for the group with primary fibromyalgia were matched for age and sex, and limited to patients with disorders that could be confused with primary fibromyalgia. Control patients for the group with secondary-concomitant fibromyalgia were matched for age, sex, and concomitant rheumatic disorders. Widespread pain (axial plus upper and lower segment plus left- and right-sided pain) was found in 97.6% of all patients with fibromyalgia and in 69.1% of all control patients. The combination of widespread pain and mild or greater tenderness in greater than or equal to 11 of 18 tender point sites yielded a sensitivity of 88.4% and a specificity of 81.1%. Primary fibromyalgia patients and secondary-concomitant fibromyalgia patients did not differ statistically in any major study variable, and the criteria performed equally well in patients with and those without concomitant rheumatic conditions. The newly proposed criteria for the classification of fibromyalgia are 1) widespread pain in combination with 2) tenderness at 11 or more of the 18 specific tender point sites. No exclusions are made for the presence of concomitant radiographic or laboratory abnormalities. At the diagnostic or classification level, the distinction between primary fibromyalgia and secondary-concomitant fibromyalgia (as defined in the text) is abandoned.

9,289 citations


Journal ArticleDOI
TL;DR: It is proposed that a formal system of organisms be established in which above the level of kingdom there exists a new taxon called a "domain." Life on this planet would be seen as comprising three domains, the Bacteria, the Archaea, and the Eucarya, each containing two or more kingdoms.
Abstract: Molecular structures and sequences are generally more revealing of evolutionary relationships than are classical phenotypes (particularly so among microorganisms). Consequently, the basis for the definition of taxa has progressively shifted from the organismal to the cellular to the molecular level. Molecular comparisons show that life on this planet divides into three primary groupings, commonly known as the eubacteria, the archaebacteria, and the eukaryotes. The three are very dissimilar, the differences that separate them being of a more profound nature than the differences that separate typical kingdoms, such as animals and plants. Unfortunately, neither of the conventionally accepted views of the natural relationships among living systems--i.e., the five-kingdom taxonomy or the eukaryote-prokaryote dichotomy--reflects this primary tripartite division of the living world. To remedy this situation we propose that a formal system of organisms be established in which above the level of kingdom there exists a new taxon called a "domain." Life on this planet would then be seen as comprising three domains, the Bacteria, the Archaea, and the Eucarya, each containing two or more kingdoms. (The Eucarya, for example, contain Animalia, Plantae, Fungi, and a number of others yet to be defined). Although taxonomic structure within the Bacteria and Eucarya is not treated herein, Archaea is formally subdivided into the two kingdoms Euryarchaeota (encompassing the methanogens and their phenotypically diverse relatives) and Crenarchaeota (comprising the relatively tight clustering of extremely thermophilic archaebacteria, whose general phenotype appears to resemble most the ancestral phenotype of the Archaea).

5,689 citations


Journal ArticleDOI
TL;DR: Fluorescent oligonucleotide hybridization probes were used to label bacterial cells for analysis by flow cytometry and the intensity of fluorescence was increased additively by the combined use of two or three fluorescent probes complementary to different regions of the same 16S rRNA.
Abstract: Fluorescent oligonucleotide hybridization probes were used to label bacterial cells for analysis by flow cytometry. The probes, complementary to short sequence elements within the 16S rRNA common to phylogenetically coherent assemblages of microorganisms, were labeled with tetramethylrhodamine and hybridized to suspensions of fixed cells. Flow cytometry was used to resolve individual target and nontarget bacteria (1 to 5 microns) via probe-conferred fluorescence. Target cells were quantified in an excess of nontarget cells. The intensity of fluorescence was increased additively by the combined use of two or three fluorescent probes complementary to different regions of the same 16S rRNA.

4,110 citations


Journal ArticleDOI
TL;DR: Fluorescent-dye-conjugated oligonucleotides were used to classify 14 Fibrobacter strains by fluorescence microscopy and the direct detection of F. intestinalis in mouse cecum samples demonstrated the application of this technique to the characterization of complex natural samples.
Abstract: Fluorescent-dye-conjugated oligonucleotides were used to classify 14 Fibrobacter strains by fluorescence microscopy. On the basis of partial 16S rRNA sequences of six Fibrobacter strains, four hybridization probes were designed to discriminate between the species Fibrobacter succinogenes and Fibrobacter intestinalis and to identify F. succinogenes subsp. succinogenes. After in situ hybridization to whole cells of the six sequenced strains, epifluorescence microscopy confirmed probe specificity. The four probes were then used to make presumptive species and subspecies assignments of eight additional Fibrobacter strains not previously characterized by comparative sequencing. These assignments were confirmed by comparative sequencing of the 16S rRNA target regions from the additional organisms. Single-mismatch discrimination between certain probe and nontarget sequences was demonstrated, and fluorescent intensity was shown to be enhanced by hybridization to multiple probes of the same specificity. The direct detection of F. intestinalis in mouse cecum samples demonstrated the application of this technique to the characterization of complex natural samples.

2,272 citations


Book ChapterDOI
01 Jan 1990
TL;DR: In this paper, the authors focus on rewrite systems, which are directed equations used to compute by repeatedly replacing sub-terms of a given formula with equal terms until the simplest form possible is obtained.
Abstract: Publisher Summary This chapter focuses on rewrite systems, which are directed equations used to compute by repeatedly replacing sub-terms of a given formula with equal terms until the simplest form possible is obtained. As a formalism, rewrite systems have the full power of Turing machines and may be thought of as nondeterministic Markov algorithms over terms rather than strings. The theory of rewriting is in essence a theory of normal forms. To some extent, it is an outgrowth of the study of A. Church's Lambda Calculus and H. B. Curry's Combinatory Logic. The chapter discusses the syntax and semantics of equations from the algebraic, logical, and operational points of view. To use a rewrite system as a decision procedure, it must be convergent. The chapter describes this fundamental concept as an abstract property of binary relations. To use a rewrite system for computation or as a decision procedure for validity of identities, the termination property is crucial. The chapter presents the basic methods for proving termination. The chapter discusses the question of satisfiability of equations and the convergence property applied to rewriting.
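The chapter is expository, but the core loop it describes (repeatedly replacing subterms that match a rule's left-hand side until no rule applies) is easy to sketch. Below is a minimal, hypothetical Python illustration using two Peano-addition rules; the term encoding, rule set, and function names are my own assumptions for illustration, not the chapter's formalism, and the rules are chosen so that rewriting terminates.

```python
# Minimal first-order term rewriting sketch (illustrative only; not the chapter's notation).
# Terms are tuples: ('add', t1, t2), ('s', t), ('0',); variables are plain strings.

RULES = [
    (('add', ('0',), 'x'), 'x'),                              # add(0, x) -> x
    (('add', ('s', 'x'), 'y'), ('s', ('add', 'x', 'y'))),     # add(s(x), y) -> s(add(x, y))
]

def is_var(t):
    return isinstance(t, str)

def match(pattern, term, subst):
    """Try to extend subst so that pattern matches term; return the substitution or None."""
    if is_var(pattern):
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if is_var(term) or pattern[0] != term[0] or len(pattern) != len(term):
        return None
    for p, t in zip(pattern[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def substitute(term, subst):
    if is_var(term):
        return subst.get(term, term)
    return (term[0],) + tuple(substitute(t, subst) for t in term[1:])

def rewrite_once(term):
    """Apply the first applicable rule at the outermost position, else recurse into subterms."""
    for lhs, rhs in RULES:
        subst = match(lhs, term, {})
        if subst is not None:
            return substitute(rhs, subst)
    if not is_var(term):
        for i, sub in enumerate(term[1:], start=1):
            new_sub = rewrite_once(sub)
            if new_sub is not None:
                return term[:i] + (new_sub,) + term[i + 1:]
    return None

def normal_form(term):
    """Rewrite repeatedly until no rule applies (terminates for this rule set)."""
    while True:
        nxt = rewrite_once(term)
        if nxt is None:
            return term
        term = nxt

# 2 + 1 in Peano notation: add(s(s(0)), s(0)) normalizes to s(s(s(0))).
two_plus_one = ('add', ('s', ('s', ('0',))), ('s', ('0',)))
print(normal_form(two_plus_one))   # ('s', ('s', ('s', ('0',))))
```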

1,551 citations


Journal ArticleDOI
TL;DR: In this article, criteria are presented for identifying diagenetic alteration of the carbon and nitrogen isotope ratios of bone and tooth collagen prepared by a widely used method, as a basis for assessing sample quality.

1,449 citations


Journal ArticleDOI
TL;DR: The literature on tropical secondary forests, defined as those resulting from human disturbance (e.g., logged forests and forest fallows), is reviewed to address questions related to their extent, rates of formation, ecological characteristics, values and uses to humans, and potential for management as discussed by the authors.
Abstract: The literature on tropical secondary forests, defined as those resulting from human disturbance (e.g. logged forests and forest fallows), is reviewed to address questions related to their extent, rates of formation, ecological characteristics, values and uses to humans, and potential for management. Secondary forests are extensive in the tropics, accounting for about 40% of the total forest area, and their rates of formation are about 9 million ha yr⁻¹. Geographical differences in the extent, rates of formation and types of forest being converted exist. Secondary forests appear to accumulate woody plant species at a relatively rapid rate but the mechanisms involved are complex and no clear pattern emerged. Compared to mature forests, the structure of secondary forest vegetation is simple, although age, climate and soil type are modifying factors. Biomass accumulates rapidly in secondary forests, up to 100 t ha⁻¹ during the first 15 yr or so, but history of disturbance may modify this trend. Like biomass, high rates of litter production are established relatively quickly, up to 12–13 t ha⁻¹ yr⁻¹ by age 12–15 yr. And, in younger secondary forests (< 20 yr), litter production is a higher fraction of the net primary productivity than stemwood biomass production. More organic matter is produced and transferred to the soil in younger secondary forests than is stored in above-ground vegetation. The impact of this on soil organic matter is significant and explains why the recovery of organic matter in the soil under secondary forests is relatively fast (50 yr or so). Nutrients are accumulated rapidly in secondary vegetation, and are returned quickly by litterfall and decomposition for uptake by roots. We propose a model of the gains and losses, yields and costs, and benefits and tradeoffs to people from the current land-use changes occurring in the tropics. When the conversion of forest lands to secondary forests and agriculture is too fast or land-use stages are skipped, society loses goods and services. To avoid such a loss, we advocate management of tropical forest lands within a landscape perspective, a possibility in the tropics because land tenures and development projects are often large.

1,257 citations


Journal ArticleDOI
TL;DR: The role of the cerebellar cortex in motor learning was investigated by comparing the paramedian lobule of adult rats given difficult acrobatic training to that of rats that had been given extensive physical exercise or had been inactive.
Abstract: The role of the cerebellar cortex in motor learning was investigated by comparing the paramedian lobule of adult rats given difficult acrobatic training to that of rats that had been given extensive physical exercise or had been inactive. The paramedian lobule is activated during limb movements used in both acrobatic training and physical exercise. Acrobatic animals had greater numbers of synapses per Purkinje cell than animals from the exercise or inactive groups. No significant difference in synapse number or size between the exercised and inactive groups was found. This indicates that motor learning required of the acrobatic animals, and not repetitive use of synapses during physical exercise, generates new synapses in cerebellar cortex. In contrast, exercise animals had a greater density of blood vessels in the molecular layer than did either the acrobatic or inactive animals, suggesting that increased synaptic activity elicited compensatory angiogenesis.

1,144 citations


Journal ArticleDOI
TL;DR: The distorted Born iterative method (DBIM) is used to solve two-dimensional inverse scattering problems, thereby providing another general method to solve the two- dimensional imaging problem when the Born and the Rytov approximations break down.
Abstract: The distorted Born iterative method (DBIM) is used to solve two-dimensional inverse scattering problems, thereby providing another general method to solve the two-dimensional imaging problem when the Born and the Rytov approximations break down. Numerical simulations are performed using the DBIM and the method proposed previously by the authors (Int. J. Imaging Syst. Technol., vol.1, no.1, p.100-8, 1989), called the Born iterative method (BIM), for several cases in which the conditions for the first-order Born approximation are not satisfied. The results show that each method has its advantages; the DBIM shows a faster convergence rate compared to the BIM, while the BIM is more robust to noise contamination compared to the DBIM.

1,026 citations


Journal ArticleDOI
TL;DR: Making new friends in the classroom was associated with gains in school performance, and early peer rejection forecasted less favorable school perceptions, higher levels of school avoidance, and lower performance levels over the school year.
Abstract: The potential role that children's classroom peer relations play in their school adjustment was investigated during the first 2 months of kindergarten and the remainder of the school year. Measures of 125 children's classroom peer relationships were obtained on 3 occasions: at school entrance, after 2 months of school, and at the end of the school year. Measures of school adjustment, including children's school perceptions, anxiety, avoidance, and performance, were obtained during the second and third assessment occasions. After controlling mental age, sex, and preschool experience, measures of children's classroom peer relationships were used to forecast later school adjustment. Results indicated that children with a larger number of classroom friends during school entrance developed more favorable school perceptions by the second month, and those who maintained these relationships liked school better as the year progressed. Making new friends in the classroom was associated with gains in school performance, and early peer rejection forecasted less favorable school perceptions, higher levels of school avoidance, and lower performance levels over the school year.


Journal ArticleDOI
TL;DR: In this paper, the contributions that economists have made to understanding standards-setting processes and their consequences for industry structure and economic welfare are surveyed, and major trajectories along which research has been moving are described and related to both the positive and the normative issues concerning compatibility standards that remain to be studied.
Abstract: This paper surveys the contributions that economists have made to understanding standards-setting processes and their consequences for industry structure and economic welfare. Standardization processes of four kinds are examined, namely: (1) market competition involving products embodying unsponsored standards, (2) market competition among sponsored (proprietary) standards, (3) agreements within voluntary standards-writing organizations, and (4) direct governmental promulgation. The major trajectories along which research has been moving are described and related to both the positive and the normative issues concerning compatibility standards that remain to be studied.

Journal ArticleDOI
TL;DR: A review indicates that dissatisfied spouses, compared with satisfied spouses, make attributions for the partner's behavior that cast it in a negative light, and experimental, clinical outcome, and longitudinal data suggest that attributions may influence marital satisfaction.
Abstract: The prevailing behavioral account of marriage must be expanded to include covert processes. This article therefore examines the attributions or explanations that spouses make for marital events. A review indicates that dissatisfied spouses, compared with satisfied spouses, make attributions for the partner's behavior that cast it in a negative light. Experimental, clinical outcome, and longitudinal data suggest further that attributions may influence marital satisfaction. Rival hypotheses for these findings are examined. Because continued empirical development in this domain depends on conceptual progress, a framework is presented that integrates attributions, behavior, and marital satisfaction. This framework points to several topics that require systematic study, and specific hypotheses are offered for research on these topics. It is concluded that the promising start made toward understanding marital attributions holds considerable potential for enriching behavioral conceptions of marriage.

Journal ArticleDOI
TL;DR: The committee's revised version of a nomenclature for protein-coding loci in fish closely parallels the one used for human genetics, but improves on it in several respects.
Abstract: The Fish Genetics Section of the American Fisheries Society established its Nomenclature Committee to develop and promote standardized genetic nomenclatures. Here, following public comments on previously published draft guidelines, we present the committee's revised version of a nomenclature for protein-coding loci in fish. This nomenclature closely parallels the one used for human genetics, but improves on it in several respects. The fish system (1) includes standardized abbreviations for commonly analyzed proteins, and provides formal symbols for gene loci encoding these proteins; (2) specifies typographic conventions for distinguishing between genes and proteins and for identifying alleles; (3) provides for multilocus isozyme systems, isoloci, regulatory loci, and pseudogenes; (4) allows important basic information (such as subcellular distributions of gene products, active substrate isomers, recent gene duplicates, and orthologous relationships among loci) to be specified in gene symbols via ...

Journal ArticleDOI
TL;DR: In this paper, a set of six non-dimensional parameters that are the most significant in optimizing particle image velocimeter performance is identified: the data validation criterion, the particle image density, the relative in-plane image displacement, the relative out-of-plane displacement, a velocity gradient parameter, and the ratio of the mean image diameter to the interrogation spot diameter.
Abstract: The spatial resolution, detection rate, accuracy and reliability of a particle image velocimeter (PIV) depend critically upon the careful selection of a number of parameters of the PIV system and the fluid motion. An analytical model and a Monte Carlo computer simulation have been developed to analyse the effects of experimental parameters and to optimize the system parameters. A set of six nondimensional parameters that are the most significant in optimizing PIV performance are identified. They are the data validation criterion, the particle image density, the relative in-plane image displacement, the relative out-of-plane displacement, a velocity gradient parameter, and the ratio of the mean image diameter to the interrogation spot diameter. These parameters are studied for the case of interrogation by autocorrelation analysis. By a single transformation, these results can be applied to interrogation by two-dimensional Fourier transform analysis of the Young's fringes.
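As a rough illustration of how such parameters are formed, the sketch below assembles nondimensional ratios of this kind from dimensional experiment quantities. The parameter names and the exact normalizations are illustrative assumptions, not the paper's definitions, and the numbers in the example are made up.

```python
# Hedged sketch: assembling the kinds of nondimensional PIV interrogation parameters the
# paper analyzes from dimensional quantities. Names and normalizations are illustrative
# assumptions; consult the paper for the exact definitions.

def piv_parameters(n_images_per_spot,    # mean particle images per interrogation spot
                   dx_in_plane,          # mean in-plane displacement in the fluid (m)
                   dz_out_of_plane,      # mean out-of-plane displacement (m)
                   spot_diameter_fluid,  # interrogation spot diameter projected into the flow (m)
                   sheet_thickness,      # light-sheet thickness (m)
                   velocity_gradient,    # characteristic velocity gradient across the spot (1/s)
                   dt,                   # time between exposures (s)
                   image_diameter,       # mean particle image diameter on the image plane (m)
                   spot_diameter_image): # interrogation spot diameter on the image plane (m)
    # Velocity variation across the spot, expressed as a displacement and compared with the
    # mean displacement (one simple way to form a gradient parameter).
    delta_u = velocity_gradient * spot_diameter_fluid
    return {
        "image_density": n_images_per_spot,
        "in_plane_displacement": dx_in_plane / spot_diameter_fluid,
        "out_of_plane_displacement": dz_out_of_plane / sheet_thickness,
        "gradient_parameter": delta_u * dt / dx_in_plane,
        "image_to_spot_diameter_ratio": image_diameter / spot_diameter_image,
        # The sixth quantity, the data-validation (detectability) criterion, is a user-chosen
        # threshold on the correlation peak rather than a ratio derived from these inputs.
    }

# Example with made-up numbers.
print(piv_parameters(12, 2.5e-4, 1.0e-4, 1.0e-3, 1.0e-3, 50.0, 1.0e-3, 2.0e-5, 5.0e-4))
```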

Journal ArticleDOI
TL;DR: In this article, a cognitive approach to the problem of competitor definition is outlined, which begins with a discussion of the information-processing demands implied by current models of competitive strategy, and how decision makers simplify the competitive environment by using a mental model of competitive groups.
Abstract: In this article, a cognitive approach to the problem of competitor definition is outlined, which begins with a discussion of the information-processing demands implied by current models of competitive strategy. How decision makers simplify the competitive environment by using a mental model of competitive groups is then discussed. Finally, the implications of a cognitive approach for the classification of organizations and for organizational adaptation are commented on.

Journal ArticleDOI
TL;DR: This article examines differences in knowledge, motives, and demographic characteristics of people who have the opportunity to recycle voluntarily, and finds that nonrecyclers were more concerned than recyclers with financial incentives to recycle, rewards for recycling, and matters of personal convenience.
Abstract: Knowledge and motivational factors represent important but neglected topics in the study of recycling behavior. This article examines differences in knowledge, motives, and demographic characteristics of people who have the opportunity to recycle voluntarily. Information on these variables was obtained for 197 households in Illinois. The results indicated that recyclers in general were more aware of publicity about recycling and more knowledgeable about materials that were recyclable in the local area and the means for recycling these materials than were nonrecyclers. While both recyclers and nonrecyclers were motivated by concerns for the environment, non-recyclers were more concerned with financial incentives to recycle, rewards for recycling, and with matters of personal convenience. Few demographic characteristics distinguished recyclers from nonrecyclers.

Journal ArticleDOI
13 Apr 1990-Science
TL;DR: The organization of the visual cortex has been considered to be highly stable in adult mammals; however, 5 degrees to 10 degrees lesions of the retina in the contralateral eye markedly altered the systematic representations of the retina in primary and secondary visual cortex when matched inputs from the ipsilateral eye were also removed.
Abstract: The organization of the visual cortex has been considered to be highly stable in adult mammals. However, 5 degrees to 10 degrees lesions of the retina in the contralateral eye markedly altered the systematic representations of the retina in primary and secondary visual cortex when matched inputs from the ipsilateral eye were also removed. Cortical neurons that normally have receptive fields in the lesioned region of the retina acquired new receptive fields in portions of the retina surrounding the lesions. The capacity for such changes may be important for normal adjustments of sensory systems to environmental contingencies and for recoveries from brain damage.

Journal ArticleDOI
TL;DR: A phenomenological model of a system of antiferromagnetically correlated spins is shown to give a good quantitative description of NMR, nuclear-quadrupole-resonance, and Knight-shift measurements on yttrium, planar copper, and planar oxygen sites in YBa2Cu3O7.
Abstract: A phenomenological model of a system of antiferromagnetically correlated spins is shown to give a good quantitative description of NMR, nuclear-quadrupole-resonance, and Knight-shift measurements on yttrium, planar copper, and planar oxygen sites in YBa2Cu3O7. The antiferromagnetic correlation length is estimated to be ~2.5 lattice constants at T = 100 K. The temperature dependence of the correlation length ceases at T_x ≃ 100 K. The enhancement of the observed relaxation rates over what is expected for weakly interacting electrons is calculated and shown to be large. Extension of the calculation to other cuprate superconductors is discussed.

Journal ArticleDOI
TL;DR: This paper describes a general-purpose programming technique, called Simulation of Simplicity, that can be used to cope with degenerate input data for geometric algorithms and it is believed that this technique will become a standard tool in writing geometric software.
Abstract: This paper describes a general-purpose programming technique, called Simulation of Simplicity, that can be used to cope with degenerate input data for geometric algorithms. It relieves the programmer from the task of providing a consistent treatment for every single special case that can occur. The programs that use the technique tend to be considerably smaller and more robust than those that do not use it. We believe that this technique will become a standard tool in writing geometric software.
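The general idea behind symbolic perturbation schemes of this kind is to displace each input coordinate by a distinct power of an infinitesimal ε and to take the sign of the lowest-order nonvanishing term, so that no predicate ever evaluates to zero. The sketch below illustrates that idea for a 2D orientation test using sympy; the exponent scheme is an arbitrary choice for demonstration, not the specific scheme of the paper.

```python
# Illustrative sketch of symbolic perturbation for a 2D orientation predicate.
# The exponent scheme (a distinct power of eps per coordinate) is an arbitrary choice
# for demonstration, not the particular scheme developed in the paper.
import sympy as sp

eps = sp.symbols('eps', positive=True)

def orientation_sos(points, i, j, k):
    """Sign of the orientation determinant of points i, j, k after symbolic perturbation;
    never returns 0, so degenerate (collinear) inputs get a consistent, index-dependent answer."""
    def coord(idx, axis):
        x = sp.Rational(points[idx][axis])
        # Perturb each coordinate by its own power of eps (larger index, smaller perturbation);
        # any injective exponent assignment works for this illustration.
        return x + eps**(2 * idx + axis + 1)

    det = sp.expand(
        (coord(j, 0) - coord(i, 0)) * (coord(k, 1) - coord(i, 1))
        - (coord(j, 1) - coord(i, 1)) * (coord(k, 0) - coord(i, 0))
    )
    poly = sp.Poly(det, eps)
    # As eps -> 0+, the sign is that of the nonzero coefficient of lowest degree.
    for coeff in reversed(poly.all_coeffs()):   # all_coeffs() lists highest degree first
        if coeff != 0:
            return 1 if coeff > 0 else -1
    return 0  # unreachable: the perturbation makes the determinant nonzero

# Three collinear points: the unperturbed determinant is 0, yet the perturbed predicate
# still returns a definite sign, and all queries see the same perturbed point set.
pts = [(0, 0), (1, 1), (2, 2)]
print(orientation_sos(pts, 0, 1, 2), orientation_sos(pts, 2, 1, 0))
```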

Journal ArticleDOI
TL;DR: The actor model is discussed as a framework for concurrent systems, along with some concepts which are useful in building actor systems, and some common patterns of concurrent problem solving are outlined.
Abstract: Three significant trends have underscored the central role of concurrency in computing. First, there is increased use of interacting processes by individual users, for example, application programs running on X windows. Second, workstation networks have become a cost-effective mechanism for resource sharing and distributed problem solving. For example, loosely coupled problems, such as finding all the factors of large prime numbers, have been solved by utilizing idle cycles on networks of hundreds of workstations. A loosely coupled problem is one which can be easily partitioned into many smaller subproblems so that interactions between the subproblems are quite limited. Finally, multiprocessor technology has advanced to the point of providing supercomputing power at a fraction of the traditional cost.

At the same time, software engineering considerations such as the need for data abstraction to promote program modularity underlie the rapid acceptance of object-oriented programming methodology. By separating the specification of what is done (the abstraction) from how it is done (the implementation), the concept of objects provides modularity necessary for programming in the large. It turns out that concurrency is a natural consequence of the concept of objects. In fact Simula, the first object-oriented language, simulated a simple form of concurrency using coroutines on conventional architectures. Current development of concurrent object-oriented programming (COOP) is providing a solid software foundation for concurrent computing on multiprocessors. Future generation computing systems are likely to be based on the foundations being developed by this emerging software technology.

The goal of this article is to discuss the foundations and methodology of COOP. Concurrency refers to the potentially parallel execution of parts of a computation. In a concurrent computation, the components of a program may be executed sequentially, or they may be executed in parallel. Concurrency provides us with the flexibility to interleave the execution of components of a program on a single processor, or to distribute it among several processors. Concurrency abstracts away some of the details in an execution, allowing us to concentrate on conceptual issues without having to be concerned with a particular order of execution which may result from the quirks of a given system.

Objects can be defined as entities which encapsulate data and operations into a single computational unit. Object models differ in how the internal behavior of objects is specified. Further, models of concurrent computation based on objects must specify how the objects interact, and different design concerns have led to different models of communication between objects. Object-oriented programming builds on the concept of objects by supporting patterns of reuse and classification, for example, through the use of inheritance, which allows all instances of a particular class to share the same method.

In the following section, we outline some common patterns of concurrent problem solving. These patterns can be easily expressed in terms of the rich variety of structures provided by COOP. In particular, we discuss the actor model as a framework for concurrent systems [1] and some concepts which are useful in building actor systems. We will then describe some other models of objects and their relation to the actor model, along with novel techniques for supporting reusability and modularity in concurrent object-oriented programming. The last section briefly outlines some major on-going projects in COOP.

It is important to note that the actor languages give special emphasis to developing flexible program structures which simplify reasoning about programs. By reasoning we do not narrowly restrict ourselves to the problem of program verification, an important program of research whose direct practical utility has yet to be established. Rather, our interest is in the ability to understand the properties of software because of clarity in the structure of the code. Such an understanding may be gained by reasoning either informally or formally about programs. The ease with which we can carry out such reasoning is aided by two factors: by modularity in code, which is the result of the ability to separate design concerns, and by the ability to abstract program structures which occur repeatedly. In particular, because of their flexible structure, actor languages are particularly well-suited to rapid prototyping applications.
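As a concrete, hedged illustration of the actor style described above (each actor encapsulates state, owns a mailbox, and reacts to messages asynchronously), here is a minimal Python sketch. It is not the article's notation or a full actor semantics; names such as Actor and Counter are mine.

```python
# Minimal actor-style sketch: each actor owns a mailbox and processes messages one at a
# time on its own thread. Illustrative only; real actor languages add actor creation,
# behavior replacement ("become"), and fair scheduling semantics not modeled here.
import threading
import queue

class Actor:
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        """Asynchronous message send: never blocks on the receiver's processing."""
        self._mailbox.put(message)

    def _run(self):
        while True:
            message = self._mailbox.get()
            if message is None:          # sentinel used here to stop the actor
                break
            self.receive(message)

    def receive(self, message):
        raise NotImplementedError

    def stop(self):
        self._mailbox.put(None)
        self._thread.join()

class Counter(Actor):
    """Encapsulated state, changed only by processing messages from the mailbox."""
    def __init__(self):
        self.count = 0
        super().__init__()

    def receive(self, message):
        kind, reply_to = message
        if kind == "inc":
            self.count += 1
        elif kind == "get":
            reply_to.put(self.count)     # reply via a queue standing in for a customer

counter = Counter()
for _ in range(5):
    counter.send(("inc", None))
reply = queue.Queue()
counter.send(("get", reply))
print(reply.get())    # 5
counter.stop()
```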

Journal ArticleDOI
TL;DR: This study explored the question of whether or not learners can consciously attend to both form and meaning when processing input and found that early-stage learners have great difficulty in attending to both form and content.
Abstract: This study explores the question of whether or not learners can consciously attend to both form and meaning when processing input. An experimental procedure is presented in which three levels of learners in four groups were asked to process information under four different conditions: attention to meaning alone; simultaneous attention to meaning and an important lexical item; simultaneous attention to meaning and a grammatical functor; and simultaneous attention to meaning and a verb form. Results suggest that learners, in particular early stage learners, have great difficulty in attending to both form and content. These results raise important questions for current discussions of the role of consciousness in input processing.

Journal ArticleDOI
TL;DR: A recent proposal by Sorella et al. for dealing with the sign problem leads to an uncontrolled approximation; in the cases studied it is nevertheless a good approximation for the ground-state energy, and a method for calculating the correction needed to make it exact is presented.
Abstract: We discuss the problems that arise in the numerical simulation of many-electron systems when the measure of the functional integrals is not positive definite. We present theoretical arguments and numerical data which indicate that the expectation value of the sign of the measure decreases exponentially as the inverse temperature β increases, unless the measure is forced to be positive by an explicit symmetry. We therefore conclude that a recent proposal for dealing with the sign problem due to Sorella et al. leads to an uncontrolled approximation. In the cases we have studied it is a good approximation for the ground-state energy, and we present a method for calculating the correction needed to make it exact. However, for some physical quantities, such as the d-wave pair field susceptibility, the neglect of signs can yield misleading results.
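To see why a vanishing average sign is damaging in general, note that observables in sign-afflicted simulations are typically estimated by reweighting, ⟨O⟩ = ⟨O·s⟩ / ⟨s⟩, and the statistical error of such a ratio grows rapidly as ⟨s⟩ shrinks. The toy below uses synthetic random draws, not the paper's functional-integral simulation, to show the spread of the reweighted estimate growing as the mean sign decreases.

```python
# Toy illustration (not the paper's model): when averages are reweighted by a fluctuating
# sign, <O> = <O*s> / <s>, the scatter of the estimate blows up as the mean sign <s> shrinks.
# The "configurations" here are synthetic, independent random draws.
import random
import statistics

def estimate_spread(mean_sign, n_samples, n_repeats=200, seed=0):
    """Spread (std dev) over repeats of the sign-reweighted estimate of an observable
    whose true value is 1.0."""
    rng = random.Random(seed)
    p_plus = (1.0 + mean_sign) / 2.0        # P(s = +1) giving the requested <s>
    estimates = []
    for _ in range(n_repeats):
        num = 0.0
        den = 0.0
        for _ in range(n_samples):
            s = 1.0 if rng.random() < p_plus else -1.0
            o = 1.0 + rng.gauss(0.0, 0.1)    # observable fluctuating around 1.0
            num += o * s
            den += s
        estimates.append(num / den if den != 0 else float("nan"))
    return statistics.pstdev(e for e in estimates if e == e)   # ignore NaNs

for mean_sign in (0.8, 0.4, 0.1, 0.02):
    print(f"<s> = {mean_sign:5.2f}   spread of estimate = {estimate_spread(mean_sign, 2000):.3f}")
```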

Journal ArticleDOI
TL;DR: In this article, the selective conversion of high-composition (AlAs)x(GaAs)1−x layers into dense transparent native oxide by reaction with H2O vapor (N2 carrier gas) at elevated temperatures (400 °C) is presented.
Abstract: Data are presented on the conversion (selective conversion) of high-composition (AlAs)x(GaAs)1−x layers, e.g., in AlxGa1−xAs-AlAs-GaAs quantum well heterostructures and superlattices (SLs), into dense transparent native oxide by reaction with H2O vapor (N2 carrier gas) at elevated temperatures (400 °C). Hydrolyzation oxidation of a fine-scale AlAs(LB)-GaAs(Lz) SL (LB + Lz ≲ 100 Å), or random alloy AlxGa1−xAs (x ≳ 0.7), is observed to proceed more slowly and uniformly than a coarse-scale "alloy" such as an AlAs-GaAs superlattice with LB + Lz ≳ 200 Å.

Posted Content
TL;DR: In this paper, the authors test for the presence of implicit contractual features of bank loan sales contracts that could explain why bank loans, traditionally not resold, are now sold without explicit guarantee or recourse, and they consider the effect of technological progress on the reduction of information asymmetries between loan buyers and loan sellers.
Abstract: A defining characteristic of bank loans is that they are not resold once created. Yet, in 1989 about $240 billion of commercial and industrial loans were sold, compared to trivial amounts five years earlier. Selling loans without explicit guarantee or recourse is inconsistent with theories of the existence of financial intermediation. What has changed to make bank loans marketable? In this paper we test for the presence of implicit contractual features of bank loan sales contracts that could explain this inconsistency. In addition, the effect of technological progress on the reduction of information asymmetries between loan buyers and loan sellers is considered. The paper tests for the presence of these features and effects using a sample of over 800 recent loan sales.

Journal ArticleDOI
02 Mar 1990-Science
TL;DR: The effects of high-intensity ultrasound on solid-liquid slurries were examined and Turbulent flow and shock waves produced by acoustic cavitation were found to drive metal particles together at sufficiently high velocities to induce melting upon collision.
Abstract: Ultrasound has become an important synthetic tool in liquid-solid chemical reactions, but the origins of the observed enhancements remained unknown. The effects of high-intensity ultrasound on solid-liquid slurries were examined. Turbulent flow and shock waves produced by acoustic cavitation were found to drive metal particles together at sufficiently high velocities to induce melting upon collision. A series of transition-metal powders were used to probe the maximum temperatures and speeds reached during such interparticle collisions. Metal particles that were irradiated in hydrocarbon liquids with ultrasound underwent collisions at roughly half the speed of sound and generated localized effective temperatures between 2600 degrees C and 3400 degrees C at the point of impact for particles with an average diameter of approximately 10 microns.

Journal ArticleDOI
TL;DR: The authors examine the relative entropy distance D_n between the true density and the Bayesian density and show that the asymptotic distance is (d/2)(log n) + c, where d is the dimension of the parameter vector.
Abstract: In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior. The authors examine the relative entropy distance D_n between the true density and the Bayesian density and show that the asymptotic distance is (d/2)(log n) + c, where d is the dimension of the parameter vector. Therefore, the relative entropy rate D_n/n converges to zero at rate (log n)/n. The constant c, which the authors explicitly identify, depends only on the prior density function and the Fisher information matrix evaluated at the true parameter value. Consequences are given for density estimation, universal data compression, composite hypothesis testing, and stock-market portfolio selection.
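A quick numerical check of the (d/2) log n growth is possible in a one-dimensional example where the Bayes mixture has a closed form: i.i.d. Bernoulli(θ) data under a uniform prior. The example and code below are mine, not the paper's; with d = 1, the quantity D_n − (1/2) log n should settle toward a constant as n grows.

```python
# Hedged numerical illustration (my example, not the paper's): for i.i.d. Bernoulli(theta)
# data with a uniform prior, the Bayes mixture m(x^n) has a closed form, so the relative
# entropy D_n = E[log p_theta(X^n) - log m(X^n)] can be computed exactly. With d = 1,
# D_n - (1/2) log n should approach a constant as n grows.
import math

def relative_entropy_bernoulli(theta, n):
    """D(P_theta^n || Bayes mixture under a uniform prior), summing over k successes."""
    total = 0.0
    for k in range(n + 1):
        # log probability of observing k successes under the true parameter
        log_p_k = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                   + k * math.log(theta) + (n - k) * math.log(1.0 - theta))
        # mixture probability of any particular sequence with k successes:
        # integral of theta^k (1-theta)^(n-k) dtheta = Beta(k+1, n-k+1) = k!(n-k)!/(n+1)!
        log_m_seq = math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)
        # true probability of that same sequence
        log_p_seq = k * math.log(theta) + (n - k) * math.log(1.0 - theta)
        total += math.exp(log_p_k) * (log_p_seq - log_m_seq)
    return total

theta = 0.3
for n in (10, 100, 1000, 10000):
    d_n = relative_entropy_bernoulli(theta, n)
    print(f"n = {n:6d}   D_n = {d_n:7.4f}   D_n - 0.5*ln(n) = {d_n - 0.5 * math.log(n):7.4f}")
```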

Journal ArticleDOI
11 Oct 1990-Nature
TL;DR: An in vivo function for a phospholipid transfer protein is established for the first time in yeast, namely a role in the compartment-specific stimulation of protein secretion.
Abstract: Progression of proteins through the secretory pathway of eukaryotic cells involves a continuous rearrangement of macromolecular structures made up of proteins and phospholipids. The protein SEC14p is essential for transport of proteins from the yeast Golgi complex [1]. Independent characterization of the SEC14 gene [2] and the PIT1 gene [3], which encodes a phosphatidylinositol/phosphatidylcholine transfer protein in yeast, indicated that these two genes are identical. Phospholipid transfer proteins are a class of cytosolic proteins that are ubiquitous among eukaryotic cells and are distinguished by their ability to catalyse the exchange of phospholipids between membranes in vitro [4]. We show here that the SEC14 and PIT1 genes are indeed identical and that the growth phenotype of a sec14-1ts mutant extends to the inability of its transfer protein to effect phospholipid transfer in vitro [4,5]. These results therefore establish for the first time an in vivo function for a phospholipid transfer protein, namely a role in the compartment-specific stimulation of protein secretion.

Journal ArticleDOI
TL;DR: Control processes underlying response inhibition were examined, and evidence was obtained for two inhibitory mechanisms: inhibition of central activation processes and inhibition of the transmission of motor commands from central to peripheral structures.
Abstract: Control processes underlying response inhibition were examined. Six Ss performed a visual choice reaction task and were occasionally presented with a tone that told them to withhold the response. Reaction time results were in agreement with a model that assumes a race between response activation and response inhibition processes. Event-related brain potentials, electromyogram, and continuous response measures showed that responses could be interrupted at any time. Evidence was obtained for two inhibitory mechanisms: inhibition of central activation processes and inhibition of transmission of motor commands from central to peripheral structures. Results have implications for the distinction between controlled and ballistic processes.