
Showing papers by "University of Illinois at Urbana–Champaign" published in 1992


Journal ArticleDOI
TL;DR: The resource-based approach as discussed by the authors is an emerging framework that has stimulated dialogue among scholars from three research perspectives: traditional strategy, organizational economics, and industrial organization.
Abstract: The resource-based approach is an emerging framework that has stimulated discussion between scholars from three research perspectives. First, the resource-based theory incorporates traditional strategy insights concerning a firm's distinctive competencies and heterogeneous capabilities. The resource-based approach also provides value-added theoretical propositions that are testable within the diversification strategy literature. Second, the resource-based view fits comfortably within the organizational economics paradigm. Third, the resource-based view is complementary to industrial organization research. The resource-based view provides a framework for increasing dialogue between scholars from these important research areas within the conversation of strategic management. Resource-based studies that give simultaneous attention to each of these research programs are suggested.

3,329 citations


Book ChapterDOI
07 Jul 1992
TL;DR: Comparison with other feature selection algorithms shows Relief's advantages in terms of learning time and the accuracy of the learned concept, suggesting Relief's practicality.
Abstract: In real-world concept learning problems, the representation of data often uses many features, only a few of which may be related to the target concept. In this situation, feature selection is important both to speed up learning and to improve concept quality. A new feature selection algorithm, Relief, uses a statistical method and avoids heuristic search. Relief requires linear time in the number of given features and the number of training instances regardless of the target concept to be learned. Although the algorithm does not necessarily find the smallest subset of features, the size tends to be small because only statistically relevant features are selected. This paper focuses on empirical test results in two artificial domains: the LED Display domain and the Parity domain with and without noise. Comparison with other feature selection algorithms shows Relief's advantages in terms of learning time and the accuracy of the learned concept, suggesting Relief's practicality.
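
As a concrete illustration of the idea described above, here is a minimal Relief-style sketch in Python (reconstructed from the abstract's description, not the authors' code; feature scaling, the relevance threshold, and tie-breaking are simplified): each sampled instance raises the weight of features on which its nearest miss differs and lowers the weight of features on which its nearest hit differs.

```python
# Minimal Relief-style feature weighting, sketched from the description
# above (not the authors' implementation). Assumes numeric features in
# [0, 1] and two classes.
import numpy as np

def relief(X, y, n_samples=100, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_samples):
        i = rng.integers(n)
        dists = np.abs(X - X[i]).sum(axis=1)
        dists[i] = np.inf                        # never pick the instance itself
        hit = np.argmin(np.where(y == y[i], dists, np.inf))   # nearest same-class
        miss = np.argmin(np.where(y != y[i], dists, np.inf))  # nearest other-class
        # Relevant features should differ across classes, not within a class.
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_samples

X = np.random.default_rng(1).random((200, 5))
y = (X[:, 0] > 0.5).astype(int)      # only feature 0 determines the class
print(relief(X, y).round(2))         # feature 0 should get the largest weight
```

Note that each update touches every feature once, which is where the linear-time behavior claimed in the abstract comes from.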

2,908 citations


Journal ArticleDOI
TL;DR: These measurements support the claim that the lattice vibrations of these disordered crystals are essentially the same as those of an amorphous solid: as disorder increases, the data approach a calculated minimum conductivity based on a model originally due to Einstein.
Abstract: Measurements of the thermal conductivity above 30 K of mixed crystals with controlled disorder, $(\mathrm{KBr})_{1-x}(\mathrm{KCN})_x$, $(\mathrm{NaCl})_{1-x}(\mathrm{NaCN})_x$, $\mathrm{Zr}_{1-x}\mathrm{Y}_x\mathrm{O}_{2-x/2}$, and $\mathrm{Ba}_{1-x}\mathrm{La}_x\mathrm{F}_{2+x}$, support the idea of a lower limit to the thermal conductivity of disordered solids. In each case, as $x$ is increased, the data approach the calculated minimum conductivity based on a model originally due to Einstein. These measurements support the claim that the lattice vibrations of these disordered crystals are essentially the same as those of an amorphous solid.
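
For reference, the calculated minimum conductivity mentioned here is conventionally written in the following Einstein-like form (quoted from general knowledge of the Cahill–Pohl model rather than transcribed from this paper; the sum runs over one longitudinal and two transverse sound modes):

```latex
% Minimum thermal conductivity in the Einstein-like form; n is the atomic
% number density, v_i the speed of sound of mode i, and
% \Theta_i = v_i (\hbar / k_B) (6 \pi^2 n)^{1/3} the mode's cutoff temperature.
\[
  \kappa_{\min}
  = \left(\frac{\pi}{6}\right)^{1/3} k_B\, n^{2/3}
    \sum_i v_i \left(\frac{T}{\Theta_i}\right)^{2}
    \int_0^{\Theta_i/T} \frac{x^{3} e^{x}}{\left(e^{x}-1\right)^{2}}\, dx .
\]
```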

1,947 citations


Proceedings Article
12 Jul 1992
TL;DR: A new algorithm, Relief, is introduced which selects relevant features using a statistical method; it is accurate even if features interact and is noise-tolerant, suggesting a practical approach to feature selection for real-world problems.
Abstract: For real-world concept learning problems, feature selection is important to speed up learning and to improve concept quality. We review and analyze past approaches to feature selection and note their strengths and weaknesses. We then introduce and theoretically examine a new algorithm, Relief, which selects relevant features using a statistical method. Relief does not depend on heuristics, is accurate even if features interact, and is noise-tolerant. It requires only linear time in the number of given features and the number of training instances, regardless of the target concept complexity. The algorithm also has certain limitations, such as nonoptimal feature set size; ways to overcome the limitations are suggested. We also report the results of comparisons between Relief and other feature selection algorithms. The empirical results support the theoretical analysis, suggesting a practical approach to feature selection for real-world problems.

1,910 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used new data on the holdings of 769 tax-exempt (predominantly pension) funds to evaluate the potential effect of their trading on stock prices.

1,700 citations


Journal ArticleDOI
TL;DR: The Ribosomal Database Project (RDP) is a curated database that offers ribosome data along with related programs and services that include phylogenetically ordered alignments of ribosomal RNA (rRNA) sequences, derived phylogenetic trees, rRNA secondary structure diagrams and various software packages for handling, analyzing and displaying alignments and trees.
Abstract: The Ribosomal Database Project (RDP) is a curated database that offers ribosome-related data, analysis services, and associated computer programs. The offerings include phylogenetically ordered alignments of ribosomal RNA (rRNA) sequences, derived phylogenetic trees, rRNA secondary structure diagrams, and various software packages for handling, analyzing, and displaying alignments and trees. The data are available via anonymous ftp (rdp.life.uiuc.edu), electronic mail (server@rdp.life.uiuc.edu), and gopher (rdpgopher.life.uiuc.edu). The electronic mail server also provides ribosomal probe checking, approximate phylogenetic placement of user-submitted sequences, screening for the chimeric nature of newly sequenced rRNAs, and automated alignment.

1,660 citations


Journal ArticleDOI
TL;DR: The results suggest that expectancies about the relative utility of the information extracted during the parallel and focused phases determine which phase is used to activate responses.
Abstract: Recent studies indicate that subjects may respond to visual information during either an early parallel phase or a later focused phase and that the selection of the relevant phase is data driven. Using the noise-compatibility paradigm, we tested the hypothesis that this selection may also be strategic and context driven. At least part of the interference effect observed in this paradigm is due to response activation during the parallel-processing phase. We manipulated subjects' expectancies for compatible and incompatible noise in 4 experiments and effectively modulated the interference effect. The results suggest that expectancies about the relative utility of the information extracted during the parallel and focused phases determine which phase is used to activate responses.

1,608 citations


Journal ArticleDOI
TL;DR: This article investigated elementary school children's conceptual knowledge about the earth and identified five alternative mental models of the earth: the rectangular earth, the disc earth, the dual earth, the hollow sphere, and the flattened sphere.

1,590 citations



Journal ArticleDOI
TL;DR: An experiment is reported to characterize the changes in operators' trust during an interaction with a semi-automatic pasteurization plant, and a regression model identifies the causes of changes in trust and a 'trust transfer function' is developed using time series analysis to describe the dynamics of trust.
Abstract: As automated controllers supplant human intervention in controlling complex systems, the operators' role often changes from that of an active controller to that of a supervisory controller. Acting as supervisors, operators can choose between automatic and manual control. Improperly allocating function between automatic and manual control can have negative consequences for the performance of a system. Previous research suggests that the decision to perform the job manually or automatically depends, in part, upon the trust the operators invest in the automatic controllers. This paper reports an experiment to characterize the changes in operators' trust during an interaction with a semi-automatic pasteurization plant, and investigates the relationship between changes in operators' control strategies and trust. A regression model identifies the causes of changes in trust, and a ‘trust transfer function’ is developed using time series analysis to describe the dynamics of trust. Based on a detailed analysis of ...

1,280 citations



Proceedings ArticleDOI
01 Dec 1992
TL;DR: This article introduces the formal notion of the family of α-shapes of a finite point set in R³, each a well-defined polytope derived from the Delaunay triangulation of the point set, with a parameter α ∈ R controlling the desired level of detail.
Abstract: Frequently, data in scientific computing is in its abstract form a finite point set in space, and it is sometimes useful or required to compute what one might call the “shape” of the set. For that purpose, this article introduces the formal notion of the family of α-shapes of a finite point set in R³. Each shape is a well-defined polytope, derived from the Delaunay triangulation of the point set, with a parameter α ∈ R controlling the desired level of detail. An algorithm is presented that constructs the entire family of shapes for a given set of size n in time O(n²), worst case. A robust implementation of the algorithm is discussed, and several applications in the area of scientific computing are mentioned.
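
To make the construction concrete, the sketch below (a simplification written for this summary, not the authors' O(n²) algorithm or their robust implementation) keeps exactly those Delaunay tetrahedra whose circumradius is at most α, which is the underlying filtration idea; the helper names are illustrative.

```python
# Alpha-complex sketch in 3-D: keep Delaunay tetrahedra with circumradius
# <= alpha. A simplification for illustration, not the paper's algorithm.
import numpy as np
from scipy.spatial import Delaunay

def circumradius(p):
    """Circumradius of a tetrahedron given as a 4x3 array of vertices."""
    a, b, c, d = p
    # The circumcenter x satisfies |x-a| = |x-b| = |x-c| = |x-d|, which
    # reduces to a 3x3 linear system after squaring and subtracting.
    A = 2.0 * np.array([b - a, c - a, d - a])
    rhs = np.array([b @ b - a @ a, c @ c - a @ a, d @ d - a @ a])
    return np.linalg.norm(np.linalg.solve(A, rhs) - a)

def alpha_tetrahedra(points, alpha):
    """Delaunay tetrahedra (as vertex-index rows) that survive the alpha test."""
    tri = Delaunay(points)
    keep = [s for s in tri.simplices if circumradius(points[s]) <= alpha]
    return np.array(keep)

pts = np.random.default_rng(0).random((300, 3))
print(len(alpha_tetrahedra(pts, alpha=0.15)), "tetrahedra kept")
```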

Journal ArticleDOI
TL;DR: Initial screening of insects indicates that cytoplasmic incompatibility may be a more general phenomenon in insects than is currently recognized, and lack of congruence between the phylogeny of the symbionts and that of their insect hosts suggests that horizontal transfer of symbionts between insect species may occur.
Abstract: Bacterial endosymbionts of insects have long been implicated in the phenomenon of cytoplasmic incompatibility, in which certain crosses between symbiont-infected individuals lead to embryonic death or sex ratio distortion. The taxonomic position of these bacteria has, however, not been known with any certainty. Similarly, the relatedness of the bacteria infecting various insect hosts has been unclear. The inability to grow these bacteria on defined cell-free medium has been the major factor underlying these uncertainties. We circumvented this problem by selective PCR amplification and subsequent sequencing of the symbiont 16S rRNA genes directly from infected insect tissue. Maximum parsimony analysis of these sequences indicates that the symbionts belong in the alpha-subdivision of the Proteobacteria, where they are most closely related to the Rickettsia and their relatives. They are all closely related to each other and are assigned to the type species Wolbachia pipientis. Lack of congruence between the phylogeny of the symbionts and that of their insect hosts suggests that horizontal transfer of symbionts between insect species may occur. Comparison of the sequences for W. pipientis and for Wolbachia persica, an endosymbiont of ticks, shows that the genus Wolbachia is polyphyletic. A PCR assay based on 16S primers was designed for the detection of W. pipientis in insect tissue, and initial screening of insects indicates that cytoplasmic incompatibility may be a more general phenomenon in insects than is currently recognized.

Journal ArticleDOI
TL;DR: In this paper, cross-correlation methods of interrogation of successive single-exposure frames can be used to measure the separation of pairs of particle images between successive frames, which can be optimized in terms of spatial resolution, detection rate, accuracy and reliability.
Abstract: To improve the performance of particle image velocimetry in measuring instantaneous velocity fields, direct cross-correlation of image fields can be used in place of auto-correlation methods of interrogation of double- or multiple-exposure recordings. With improved speed of photographic recording and increased resolution of video array detectors, cross-correlation methods of interrogation of successive single-exposure frames can be used to measure the separation of pairs of particle images between successive frames. By knowing the extent of image shifting used in a multiple-exposure and by a priori knowledge of the mean flow-field, the cross-correlation of different sized interrogation spots with known separation can be optimized in terms of spatial resolution, detection rate, accuracy and reliability.
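The core numerical step is easy to sketch (a toy version assuming ideal single-exposure frames; it is not the photographic or video apparatus discussed in the paper): cross-correlate two interrogation windows via the FFT and read the mean particle-image displacement off the correlation peak.

```python
# Toy PIV interrogation: estimate the mean particle-image displacement
# between two single-exposure windows from the cross-correlation peak.
import numpy as np

def displacement(win_a, win_b):
    """Signed (row, col) shift that best maps win_a onto win_b."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Circular cross-correlation computed via the FFT.
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert wrapped peak indices to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))                   # synthetic "particle" window
frame2 = np.roll(frame1, (3, -2), axis=(0, 1))  # displaced second exposure
print(displacement(frame1, frame2))             # expect (3, -2)
```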

Journal ArticleDOI
TL;DR: This paper surveys the work on gross-motion planning, including motion planners for point robots, rigid robots, and manipulators in stationary, time-varying, constrained, and movable-object environments.
Abstract: Motion planning is one of the most important areas of robotics research. The complexity of the motion-planning problem has hindered the development of practical algorithms. This paper surveys the work on gross-motion planning, including motion planners for point robots, rigid robots, and manipulators in stationary, time-varying, constrained, and movable-object environments. The general issues in motion planning are explained. Recent approaches and their performances are briefly described, and possible future research directions are discussed.

Journal ArticleDOI
TL;DR: ZEUS-2D as mentioned in this paper is a numerical code for the simulation of fluid dynamical flows in astrophysics including a self-consistent treatment of the effects of magnetic fields and radiation transfer.
Abstract: In this, the second of a series of three papers, we continue a detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows in astrophysics including a self-consistent treatment of the effects of magnetic fields and radiation transfer. Here, a detailed description of the magnetohydrodynamical (MHD) algorithms in ZEUS-2D is given. The recently developed constrained transport (CT) algorithm is implemented for the numerical evolution of the components of the magnetic field for MHD simulations. This formalism guarantees the numerically evolved field components will satisfy the divergence-free constraint at all times.
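
The divergence-free guarantee comes from a structural property of the scheme rather than from any cleanup step: constrained transport advances the induction equation with a discrete curl, and the divergence of a curl vanishes identically. Schematically (the general CT identity, not the paper's specific finite-difference stencil):

```latex
% Induction equation advanced by constrained transport: because the
% discrete update is a curl, the discrete divergence of B is conserved.
\[
  \frac{\partial \mathbf{B}}{\partial t} = -\,\nabla \times \mathbf{E}
  \quad\Longrightarrow\quad
  \frac{\partial}{\partial t}\,(\nabla \cdot \mathbf{B})
  = -\,\nabla \cdot (\nabla \times \mathbf{E}) = 0 ,
\]
% so a field that starts divergence-free stays divergence-free.
```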

Journal ArticleDOI
TL;DR: The Children's Perception of Interparental Conflict Scale (CPIC) appears to be a promising instrument for assessing perceived marital conflict, and several issues regarding its interpretation are discussed.
Abstract: Guided by Grych and Fincham's theoretical framework for investigating the relation between interparental conflict and child adjustment, a questionnaire was developed to assess children's views of several aspects of marital conflict. The Children's Perception of Interparental Conflict Scale (CPIC) was initially examined in a sample of 222 children aged 9-12, and results were cross-validated in a second sample of 144 similarly aged children. Three factor-analytically derived subscales (Conflict Properties, Threat, Self-Blame) demonstrated acceptable levels of internal consistency and test-retest reliability. The validity of the Conflict Properties scale was supported by significant relations with parent reports of conflict and indices of child adjustment; the Threat and Self-Blame scales correlated with children's responses to specific conflict vignettes. The CPIC thus appears to be a promising instrument for assessing perceived marital conflict, and several issues regarding its interpretation are discussed.


Journal ArticleDOI
TL;DR: The results show that depression reaches its lowest level in the middle aged, at about age 45, and the fall of depression in early adulthood and rise in late life mostly reflects life-cycle gains and losses in marriage, employment, and economic well-being.
Abstract: In this study, the relationship between age and depression is analyzed, looking for effects of maturity, decline, life-cycle stage, survival, and historical trend. The data are from a 1990 sample of 2,031 U.S. adults and a 1985 sample of 809 Illinois adults. The results show that depression reaches its lowest level in the middle aged, at about age 45. The fall of depression in early adulthood and rise in late life mostly reflects life-cycle gains and losses in marriage, employment, and economic well-being. Depression reaches its highest level in adults 80 years old or older, because physical dysfunction and low personal control add to personal and status losses. Malaise from poor health does not create a spurious rise of measured depression in late adulthood. However, some of the differences among age groups in depression reflect higher education in younger generations, and some reflect different rates of survival across demographic groups that also vary in their levels of depression.

Journal ArticleDOI
TL;DR: Sickness behavior refers to the nonspecific symptoms (anorexia, depressed activity, loss of interest in usual activities, disappearance of body-care activities) that accompany the response to infection.

Journal ArticleDOI
TL;DR: In this paper, the authors find that extreme prior losers outperform extreme prior winners by 5-10% per year during the subsequent five years, and that the overreaction effect is substantially stronger for smaller firms than for larger firms.

Journal ArticleDOI
TL;DR: Additional research with cows consuming large amounts of feed is needed to identify combinations of feed ingredients that synchronize availabilities of energy and N for optimizing ruminal digestion, microbial protein synthesis, nutrient flow to the small intestine, and milk production and composition.

Journal ArticleDOI
TL;DR: In this article, a methodology for the design of reliable centralized and decentralized control systems is developed, and the resulting control systems are reliable in that they provide guaranteed stability and H∞ performance not only when all control components are operational, but also for sensor or actuator outages in the centralized case, or for control-channel outages in the decentralized case.
Abstract: A methodology for the design of reliable centralized and decentralized control systems is developed. The resulting control systems are reliable in that they provide guaranteed stability and H∞ performance not only when all control components are operational, but also for sensor or actuator outages in the centralized case, or for control-channel outages in the decentralized case. Reliability is guaranteed provided these outages occur only within a prespecified subset of control components. Strongly stabilizing designs are also developed, for both centralized and decentralized systems.

Journal ArticleDOI
TL;DR: A lifting technique is used to describe the continuous-time (i.e., intersample) behavior of sampled-data systems, and to obtain a complete solution to the problem of parameterizing all controllers that constrain the L²-induced norm of a sampled-data system to within a certain bound.
Abstract: The authors present a framework for dealing with continuous-time periodic systems. The main tool is a lifting technique which provides a strong correspondence between continuous-time periodic systems and certain types of discrete-time time-invariant systems with infinite-dimensional input and output spaces. Despite the infinite dimensionality of the input and output spaces, a lifted system has a finite-dimensional state space if the original system does. This fact permits rather constructive methods for analyzing these systems. As a demonstration of the utility of this framework, the authors use it to describe the continuous-time (i.e., intersample) behavior of sampled-data systems, and to obtain a complete solution to the problem of parameterizing all controllers that constrain the L²-induced norm of a sampled-data system to within a certain bound.
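
For orientation, the lifting construction itself can be stated in one line (from general knowledge of the technique, with h denoting the sampling period): a continuous-time signal is cut into a sequence of function-valued samples, and the map preserves the L² norm, which is why induced norms transfer between the two descriptions.

```latex
% Lifting: u in L^2[0, \infty) becomes a sequence of functions on [0, h).
\[
  (W u)_k(\tau) = u(kh + \tau), \qquad \tau \in [0, h), \quad k = 0, 1, 2, \dots
\]
% W is an isometry onto \ell^2 of L^2[0,h)-valued sequences, so the lifted
% (discrete-time, time-invariant) system has the same L^2-induced norm.
```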

Journal ArticleDOI
TL;DR: In this paper, a general introduction to x-ray diffraction and its application to the study of surfaces and interfaces is presented, illustrated through five different techniques: crystal truncation rod analysis, two-dimensional crystallography, three-dimensional structure analysis, the evanescent wave method and lineshape analysis.
Abstract: A general introduction to x-ray diffraction and its application to the study of surfaces and interfaces is presented. The application of x-ray diffraction to various problems in surface and interface science is illustrated through five different techniques: crystal truncation rod analysis, two-dimensional crystallography, three-dimensional structure analysis, the evanescent wave method and lineshape analysis. These techniques are explained with numerous examples from recent experiments and with the aid of an extensive bibliography.

Journal ArticleDOI
01 Feb 1992
TL;DR: A path-planning algorithm for the classical mover's problem in three dimensions using a potential field representation of obstacles is presented and solves a much wider class of problems than other heuristic algorithms and at the same time runs much faster than exact algorithms.
Abstract: A path-planning algorithm for the classical mover's problem in three dimensions using a potential field representation of obstacles is presented. A potential function similar to the electrostatic potential is assigned to each obstacle, and the topological structure of the free space is derived in the form of minimum potential valleys. Path planning is done at two levels. First, a global planner selects a robot's path from the minimum potential valleys and its orientations along the path that minimize a heuristic estimate of the path length and the chance of collision. Then, a local planner modifies the path and orientations to derive the final collision-free path and orientations. If the local planner fails, a new path and orientations are selected by the global planner and subsequently examined by the local planner. This process is continued until a solution is found or there are no paths left to be examined. The algorithm solves a much wider class of problems than other heuristic algorithms and at the same time runs much faster than exact algorithms (typically 5 to 30 min on a Sun 3/260).
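
The flavor of potential-field planning is easy to convey in a few lines (a generic gradient-descent toy, far simpler than the paper's two-level planner with minimum potential valleys; the potential shape and constants here are illustrative assumptions):

```python
# Toy potential-field planner: descend an attractive-plus-repulsive
# potential. Illustrative only; real planners must cope with local
# minima, which is what the paper's valley-guided global planner addresses.
import numpy as np

def plan(start, goal, obstacles, steps=2000, lr=0.02, radius=1.0):
    pos = np.array(start, dtype=float)
    goal = np.array(goal, dtype=float)
    path = [pos.copy()]
    for _ in range(steps):
        grad = pos - goal                     # attractive: pull toward goal
        for ob in obstacles:
            d = pos - np.array(ob, dtype=float)
            r = np.linalg.norm(d)
            if r < radius:                    # repulsive inside influence radius
                grad -= (1.0 / r - 1.0 / radius) * d / r**3
        pos = pos - lr * grad                 # step down the potential
        path.append(pos.copy())
        if np.linalg.norm(pos - goal) < 0.05:
            break
    return np.array(path)

path = plan(start=(0.0, 0.0), goal=(3.0, 3.0), obstacles=[(1.5, 1.6)])
print(len(path), "steps; final position", path[-1].round(2))
```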

Journal ArticleDOI
TL;DR: Results indicated that nearly all children understood loneliness, that loneliness was reliably assessed in young children, and that poorly accepted children were more lonely than other children.
Abstract: Recent studies indicate that feelings of loneliness and social dissatisfaction can be reliably assessed with third- through sixth-grade children, and that children who are sociometrically rejected by their peers are significantly more lonely than other children. The present research was designed (a) to examine whether loneliness could be reliably assessed in a population younger than previously studied, (b) to learn whether young children who are poorly accepted by peers report elevated levels of loneliness and social dissatisfaction, (c) to assess whether young children understand the concept of loneliness, and (d) to examine the behavioral characteristics of lonely young children. Kindergarten and first-grade children (N = 440) responded to a questionnaire about feelings of loneliness and social dissatisfaction in school. A subset of children (N = 46) were individually interviewed to assess their understanding of loneliness. To assess sociometric status and behavior, peers were asked to respond to various sociometric measures and behavioral assessment items. Teachers also provided behavioral information about children using a newly developed instrument. Results indicated that nearly all children understood loneliness, that loneliness was reliably assessed in young children, and that poorly accepted children were more lonely than other children. In addition, children who reported the most loneliness were found to differ from others on several behavioral dimensions.

Journal ArticleDOI
TL;DR: Hearing tape-recorded birdsong rapidly induces the transcriptional regulator ZENK in the songbird forebrain, suggesting a role for genomic responses in neural processes linked to song pattern recognition, discrimination, or the formation of auditory associations.
Abstract: We investigated the participation of genomic regulatory events in the response of the songbird brain to a natural auditory stimulus of known physiological and behavioral relevance, birdsong. Using in situ hybridization, we detected a rapid increase in forebrain mRNA levels of an immediate-early gene encoding a transcriptional regulator (ZENK; also known as zif-268, egr-1, NGFI-A, or Krox-24) following presentation of tape-recorded songs to canaries (Serinus canaria) and zebra finches (Taeniopygia guttata). ZENK induction is most marked in a forebrain region believed to participate in auditory processing and is greatest when birds hear the song of their own species. A significantly lower level of induction occurs when birds hear the song of a different species and no induction is seen after exposure to tone bursts. Cellular analysis indicates that the level of induction reflects the proportion of neurons recruited to express the gene. These results suggest a role for genomic responses in neural processes linked to song pattern recognition, discrimination, or the formation of auditory associations.

Journal ArticleDOI
TL;DR: Some initial production priming explorations are reported that support the hypothesis that lemmas are buffered in longer utterances before they are phonologically specified, and a reconciliation of modular and interactive accounts of these stages is suggested.

Journal ArticleDOI
10 Dec 1992
TL;DR: In this paper, a new structure, referred to as the hyperblock, is proposed to combine speculative execution with predicated execution for both compile-time optimization and scheduling of conditional branches.
Abstract: Predicated execution is an effective technique for dealing with conditional branches in application programs. However, there are several problems associated with conventional compiler support for predicated execution. First, all paths of control are combined into a single path regardless of their execution frequency and size with conventional if-conversion techniques. Second, speculative execution is difficult to combine with predicated execution. In this paper, we propose the use of a new structure, referred to as the hyperblock, to overcome these problems. The hyperblock is an efficient structure to utilize predicated execution for both compile-time optimization and scheduling. Preliminary experimental results show that the hyperblock is highly effective for a wide range of superscalar and VLIW processors.
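
If-conversion, the transformation the hyperblock builds on, can be sketched schematically (a high-level Python analogy, not the authors' compiler or a real predicated ISA): the branch becomes a predicate, both paths execute unconditionally, and the predicate selects the live result.

```python
# Schematic if-conversion: control dependence becomes data dependence.

def branchy(a, b, y, z):
    # Original control flow: two paths separated by a conditional branch.
    if a > b:
        x = y * 2          # "then" path
    else:
        x = z + 1          # "else" path
    return x

def if_converted(a, b, y, z):
    p = a > b              # predicate (would live in a predicate register)
    t = y * 2              # both paths computed unconditionally...
    f = z + 1
    return t if p else f   # ...and the predicate selects the result

assert branchy(2, 1, 10, 20) == if_converted(2, 1, 10, 20) == 20
```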