Institution
University of Texas at Austin
Education · Austin, Texas, United States
About: The University of Texas at Austin is an education organization based in Austin, Texas, United States. It is known for research contributions in the topics of Population and Poison control. The organization has 94352 authors who have published 206297 publications, receiving 9070052 citations. The organization is also known as UT-Austin and UT Austin.
Topics: Population, Poison control, Galaxy, Stars, Finite element method
Papers published on a yearly basis
Papers
TL;DR: This paper considers deductive and, subsequently, inductive questions relating to a sample of genes from a selectively neutral locus, including a test of the hypothesis that the sampled alleles are indeed selectively neutral.
2,176 citations
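The neutral infinite-alleles model summarized above admits a simple check by simulation. The sketch below uses the Hoppe-urn construction of the model and Ewens' formula for the expected number of distinct alleles, E[K] = Σ θ/(θ+i); the parameter values (θ = 2, n = 50) are illustrative assumptions, not taken from the paper:

```python
import random

def expected_num_alleles(theta, n):
    """E[K], the expected number of distinct alleles in a sample of n
    genes under the neutral infinite-alleles model (Ewens 1972)."""
    return sum(theta / (theta + i) for i in range(n))

def sample_num_alleles(theta, n, rng):
    """Draw K once via the Hoppe urn: gene i+1 is a new allele with
    probability theta/(theta+i), otherwise it copies an existing gene."""
    counts = []
    for i in range(n):
        if rng.random() < theta / (theta + i):
            counts.append(1)          # mutation: a brand-new allele
        else:
            j = rng.choices(range(len(counts)), weights=counts)[0]
            counts[j] += 1            # copy an existing gene
    return len(counts)

rng = random.Random(42)
theta, n = 2.0, 50
sims = [sample_num_alleles(theta, n, rng) for _ in range(5000)]
print("theory:", round(expected_num_alleles(theta, n), 3),
      "simulation:", round(sum(sims) / len(sims), 3))
```

The simulated mean number of alleles should track the closed-form expectation closely, which is the deductive half of the sampling theory; the inductive half (estimating θ from an observed configuration) builds on the same formula.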
TL;DR: It is shown, using first principles calculations, that monolayer molybdenum disulphide is an ideal material for valleytronics, for which valley polarization is achievable via valley-selective circular dichroism arising from its unique symmetry.
Abstract: The monolayer transition-metal dichalcogenide molybdenum disulphide has recently attracted attention owing to its distinctive electronic properties. Cao and co-workers present numerical evidence suggesting that circularly polarized light can preferentially excite a single valley in the band structure of this system.
2,163 citations
16 Jun 2012 · TL;DR: This paper proposes a new kernel-based method that takes advantage of low-dimensional structures that are intrinsic to many vision datasets, and introduces a metric that reliably measures the adaptability between a pair of source and target domains.
Abstract: In real-world applications of visual recognition, many factors — such as pose, illumination, or image quality — can cause a significant mismatch between the source domain on which classifiers are trained and the target domain to which those classifiers are applied. As such, the classifiers often perform poorly on the target domain. Domain adaptation techniques aim to correct the mismatch. Existing approaches have concentrated on learning feature representations that are invariant across domains, and they often do not directly exploit low-dimensional structures that are intrinsic to many vision datasets. In this paper, we propose a new kernel-based method that takes advantage of such structures. Our geodesic flow kernel models domain shift by integrating an infinite number of subspaces that characterize changes in geometric and statistical properties from the source to the target domain. Our approach is computationally advantageous, automatically inferring important algorithmic parameters without requiring extensive cross-validation or labeled data from either domain. We also introduce a metric that reliably measures the adaptability between a pair of source and target domains. For a given target domain and several source domains, the metric can be used to automatically select the optimal source domain to adapt and avoid less desirable ones. Empirical studies on standard datasets demonstrate the advantages of our approach over competing methods.
2,154 citations
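The geodesic flow kernel described above has a closed form built from the principal angles between the source and target PCA subspaces. The sketch below computes only those principal angles as a rough measure of domain shift, not the full kernel; the synthetic data and the subspace dimension d are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def pca_basis(X, d):
    """Orthonormal basis (columns) of the top-d principal subspace of X."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:d].T                     # shape (D, d)

def principal_angles(Ps, Pt):
    """Principal angles between two subspaces given orthonormal bases."""
    s = np.linalg.svd(Ps.T @ Pt, compute_uv=False)
    return np.arccos(np.clip(s, 0.0, 1.0))

# Synthetic source/target data with an explicit distribution shift
rng = np.random.default_rng(0)
source = rng.normal(size=(200, 10))
target = source @ np.diag(np.linspace(1.0, 2.0, 10)) \
         + 0.1 * rng.normal(size=(200, 10))

d = 4
Ps, Pt = pca_basis(source, d), pca_basis(target, d)
angles = principal_angles(Ps, Pt)
# Small angles -> similar subspaces. The GFK goes further: it integrates
# over the whole geodesic between Ps and Pt on the Grassmann manifold
# rather than comparing only the two endpoint subspaces.
print(angles)
```

The adaptability metric mentioned in the abstract is in the same spirit: domains whose subspaces subtend small principal angles are easier to adapt between.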
TL;DR: In this paper, the authors examine the impact of acquisitions on the subsequent innovation performance of acquiring firms in the chemicals industry, distinguishing between technological acquisitions (in which technology is a component of the acquired firm's assets) and non-technological acquisitions (which do not involve a technological component).
Abstract: This paper examines the impact of acquisitions on the subsequent innovation performance of acquiring firms in the chemicals industry. We distinguish between technological acquisitions, acquisitions in which technology is a component of the acquired firm's assets, and nontechnological acquisitions: acquisitions that do not involve a technological component. We develop a framework relating acquisitions to firm innovation performance and develop a set of measures for quantifying the technological inputs a firm obtains through acquisitions. We find that within technological acquisitions absolute size of the acquired knowledge base enhances innovation performance, while relative size of the acquired knowledge base reduces innovation output. The relatedness of acquired and acquiring knowledge bases has a nonlinear impact on innovation output. Nontechnological acquisitions do not have a significant effect on subsequent innovation output. Copyright © 2001 John Wiley & Sons, Ltd.
2,147 citations
TL;DR: In this article, the authors identify three organizational pathologies that inhibit breakthrough inventions: the familiarity trap, the maturity trap and the propinquity trap, and argue that by experimenting with novel technologies in which the firm lacks prior experience, emerging technologies that are recent or newly developed in the industry, and pioneering technologies that do not build on any existing technologies firms can overcome these traps and create breakthrough inventions.
Abstract: We present a model that explains how established firms create breakthrough inventions. We identify three organizational pathologies that inhibit breakthrough inventions: the familiarity trap – favoring the familiar; the maturity trap – favoring the mature; and the propinquity trap – favoring search for solutions near to existing solutions. We argue that by experimenting with novel (i.e., technologies in which the firm lacks prior experience), emerging (technologies that are recent or newly developed in the industry), and pioneering (technologies that do not build on any existing technologies) technologies firms can overcome these traps and create breakthrough inventions. Empirical evidence from the chemicals industry supports our model. Copyright © 2001 John Wiley & Sons, Ltd.
2,140 citations
Authors
Name | H-index | Papers | Citations |
---|---|---|---|
George M. Whitesides | 240 | 1739 | 269833 |
Eugene Braunwald | 230 | 1711 | 264576 |
Yi Chen | 217 | 4342 | 293080 |
Robert J. Lefkowitz | 214 | 860 | 147995 |
Joseph L. Goldstein | 207 | 556 | 149527 |
Eric N. Olson | 206 | 814 | 144586 |
Hagop M. Kantarjian | 204 | 3708 | 210208 |
Rakesh K. Jain | 200 | 1467 | 177727 |
Francis S. Collins | 196 | 743 | 250787 |
Gordon B. Mills | 187 | 1273 | 186451 |
Scott M. Grundy | 187 | 841 | 231821 |
Michael S. Brown | 185 | 422 | 123723 |
Eric Boerwinkle | 183 | 1321 | 170971 |
Aaron R. Folsom | 181 | 1118 | 134044 |
Jiaguo Yu | 178 | 730 | 113300 |