scispace - formally typeset
Institution

University of Texas at Austin

Education · Austin, Texas, United States
About: The University of Texas at Austin is an education organization based in Austin, Texas, United States. It is known for its research contributions in the topics of Population and Poison control. The organization has 94352 authors who have published 206297 publications, receiving 9070052 citations. The organization is also known as UT-Austin and UT Austin.


Papers
Journal Article (DOI)
TL;DR: This paper considers deductive and, subsequently, inductive questions relating to a sample of genes from a selectively neutral locus, including a test of the hypothesis that the sampled alleles are indeed selectively neutral.

2,176 citations

Journal Article (DOI)
TL;DR: It is shown, using first principles calculations, that monolayer molybdenum disulphide is an ideal material for valleytronics, for which valley polarization is achievable via valley-selective circular dichroism arising from its unique symmetry.
Abstract: The monolayer transition-metal dichalcogenide molybdenum disulphide has recently attracted attention owing to its distinctive electronic properties. Cao and co-workers present numerical evidence suggesting that circularly polarized light can preferentially excite a single valley in the band structure of this system.

2,163 citations

Proceedings Article (DOI)
16 Jun 2012
TL;DR: This paper proposes a new kernel-based method that takes advantage of low-dimensional structures that are intrinsic to many vision datasets, and introduces a metric that reliably measures the adaptability between a pair of source and target domains.
Abstract: In real-world applications of visual recognition, many factors — such as pose, illumination, or image quality — can cause a significant mismatch between the source domain on which classifiers are trained and the target domain to which those classifiers are applied. As such, the classifiers often perform poorly on the target domain. Domain adaptation techniques aim to correct the mismatch. Existing approaches have concentrated on learning feature representations that are invariant across domains, and they often do not directly exploit low-dimensional structures that are intrinsic to many vision datasets. In this paper, we propose a new kernel-based method that takes advantage of such structures. Our geodesic flow kernel models domain shift by integrating an infinite number of subspaces that characterize changes in geometric and statistical properties from the source to the target domain. Our approach is computationally advantageous, automatically inferring important algorithmic parameters without requiring extensive cross-validation or labeled data from either domain. We also introduce a metric that reliably measures the adaptability between a pair of source and target domains. For a given target domain and several source domains, the metric can be used to automatically select the optimal source domain to adapt and avoid less desirable ones. Empirical studies on standard datasets demonstrate the advantages of our approach over competing methods.
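The paper's geodesic flow kernel has a closed form; as a rough illustration only (not the authors' implementation, and with all names below being illustrative), the idea of "integrating an infinite number of subspaces" between source and target can be approximated numerically by sampling orthonormal bases along the Grassmann geodesic connecting the two subspaces and averaging their projection matrices:

```python
import numpy as np

def grassmann_geodesic(A, B):
    """Return a function t -> orthonormal basis Phi(t) that interpolates
    from span(A) at t=0 to span(B) at t=1 (standard arctan construction).
    A and B are D x d matrices with orthonormal columns."""
    AtB = A.T @ B
    # Tangent direction: component of B orthogonal to span(A)
    T = (B - A @ AtB) @ np.linalg.inv(AtB)
    U, S, Vt = np.linalg.svd(T, full_matrices=False)
    theta = np.arctan(S)           # principal angles between the subspaces
    AV = A @ Vt.T

    def phi(t):
        return AV @ np.diag(np.cos(t * theta)) + U @ np.diag(np.sin(t * theta))

    return phi

def gfk_approx(A, B, n=50):
    """Approximate a geodesic-flow-style kernel by averaging the projection
    matrices Phi(t) Phi(t)^T over n points t sampled uniformly in [0, 1].
    (The paper computes this integral in closed form instead.)"""
    phi = grassmann_geodesic(A, B)
    D = A.shape[0]
    G = np.zeros((D, D))
    for t in np.linspace(0.0, 1.0, n):
        P = phi(t)
        G += P @ P.T
    return G / n
```

In practice A and B would be PCA subspaces of the source and target feature matrices; the resulting symmetric positive semidefinite G then defines a similarity x_i^T G x_j between points from either domain.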

2,154 citations

Journal Article (DOI)
TL;DR: In this paper, the impact of acquisitions on the subsequent innovation performance of acquiring firms in the chemicals industry is examined, and the authors distinguish between technological acquisitions, acquisitions in which technology is a component of the acquired firm's assets, and non-technological acquisitions: acquisitions that do not involve a technological component.
Abstract: This paper examines the impact of acquisitions on the subsequent innovation performance of acquiring firms in the chemicals industry. We distinguish between technological acquisitions, acquisitions in which technology is a component of the acquired firm's assets, and nontechnological acquisitions: acquisitions that do not involve a technological component. We develop a framework relating acquisitions to firm innovation performance and develop a set of measures for quantifying the technological inputs a firm obtains through acquisitions. We find that within technological acquisitions, absolute size of the acquired knowledge base enhances innovation performance, while relative size of the acquired knowledge base reduces innovation output. The relatedness of acquired and acquiring knowledge bases has a nonlinear impact on innovation output. Nontechnological acquisitions do not have a significant effect on subsequent innovation output. Copyright © 2001 John Wiley & Sons, Ltd.

2,147 citations

Journal Article (DOI)
TL;DR: In this article, the authors identify three organizational pathologies that inhibit breakthrough inventions: the familiarity trap, the maturity trap, and the propinquity trap. They argue that by experimenting with novel technologies (those in which the firm lacks prior experience), emerging technologies (those recent or newly developed in the industry), and pioneering technologies (those that do not build on any existing technologies), firms can overcome these traps and create breakthrough inventions.
Abstract: We present a model that explains how established firms create breakthrough inventions. We identify three organizational pathologies that inhibit breakthrough inventions: the familiarity trap – favoring the familiar; the maturity trap – favoring the mature; and the propinquity trap – favoring search for solutions near to existing solutions. We argue that by experimenting with novel technologies (those in which the firm lacks prior experience), emerging technologies (those recent or newly developed in the industry), and pioneering technologies (those that do not build on any existing technologies), firms can overcome these traps and create breakthrough inventions. Empirical evidence from the chemicals industry supports our model. Copyright © 2001 John Wiley & Sons, Ltd.

2,140 citations


Authors

Showing all 95138 results

Name | H-index | Papers | Citations
George M. Whitesides | 240 | 1739 | 269833
Eugene Braunwald | 230 | 1711 | 264576
Yi Chen | 217 | 4342 | 293080
Robert J. Lefkowitz | 214 | 860 | 147995
Joseph L. Goldstein | 207 | 556 | 149527
Eric N. Olson | 206 | 814 | 144586
Hagop M. Kantarjian | 204 | 3708 | 210208
Rakesh K. Jain | 200 | 1467 | 177727
Francis S. Collins | 196 | 743 | 250787
Gordon B. Mills | 187 | 1273 | 186451
Scott M. Grundy | 187 | 841 | 231821
Michael S. Brown | 185 | 422 | 123723
Eric Boerwinkle | 183 | 1321 | 170971
Aaron R. Folsom | 181 | 1118 | 134044
Jiaguo Yu | 178 | 730 | 113300
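The H-index column above is the standard author metric: the largest h such that the author has h papers with at least h citations each. A minimal sketch of how it can be computed from a list of per-paper citation counts (illustrative code, not SciSpace's implementation):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i          # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
```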
Network Information
Related Institutions (5)
Stanford University
320.3K papers, 21.8M citations

97% related

Columbia University
224K papers, 12.8M citations

96% related

University of California, San Diego
204.5K papers, 12.3M citations

96% related

University of Michigan
342.3K papers, 17.6M citations

96% related

University of Washington
305.5K papers, 17.7M citations

95% related

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 304
2022 | 1,209
2021 | 10,137
2020 | 10,331
2019 | 9,727
2018 | 8,973