Institution

Purdue University

Education · West Lafayette, Indiana, United States
About: Purdue University is an education organization based in West Lafayette, Indiana, United States. It is known for research contributions in the topics of Population and Heat transfer. The organization has 73,219 authors who have published 163,563 publications receiving 5,775,236 citations. The organization is also known as Purdue and Purdue-West Lafayette.


Papers
Journal Article
TL;DR: In this paper, the authors trace the 100-year research history of black phosphorus, from synthesis to material properties, and extend the topic from black phosphorus to phosphorene, aiming at further applications in electronic and optoelectronic devices.
Abstract: Phosphorus is one of the most abundant elements on Earth, making up roughly 0.1% of the Earth's crust. In general, phosphorus has several allotropes. The two most commonly seen allotropes, white and red phosphorus, are widely used in explosives and safety matches. In addition, black phosphorus, though rarely mentioned, is a layered semiconductor with great potential in optical and electronic applications. Remarkably, owing to its van der Waals structure, this layered material can be reduced to a single atomic layer in the vertical direction, known as phosphorene, whose physical properties can differ tremendously from those of its bulk counterpart. In this review article, we trace the 100-year research history of black phosphorus, from synthesis to material properties, and extend the topic from black phosphorus to phosphorene. The physical and transport properties are highlighted, aiming at further applications in electronic and optoelectronic devices.

766 citations

Journal Article
TL;DR: It is suggested that speed of processing should be viewed as a fundamental part of the architecture of the cognitive system as it develops across the entire lifespan.

766 citations

Journal Article
TL;DR: Transposable elements were first discovered in plants because they can have tremendous effects on genome structure and gene function; they may also be responsible for the rate at which incompatibility is generated in separated populations.
Abstract: Transposable elements were first discovered in plants because they can have tremendous effects on genome structure and gene function. Although only a few or no elements may be active within a genome at any time in any individual, the genomic alterations they cause can have major outcomes for a species. All major element types appear to be present in all plant species, but their quantitative and qualitative contributions are enormously variable even between closely related lineages. In some large-genome plants, mobile DNAs make up the majority of the nuclear genome. They can rearrange genomes and alter individual gene structure and regulation through any of the activities they promote: transposition, insertion, excision, chromosome breakage, and ectopic recombination. Many genes may have been assembled or amplified through the action of transposable elements, and it is likely that most plant genes contain legacies of multiple transposable element insertions into promoters. Because chromosomal rearrangements can lead to speciating infertility in heterozygous progeny, transposable elements may be responsible for the rate at which such incompatibility is generated in separated populations. For these reasons, understanding plant gene and genome evolution is only possible if we comprehend the contributions of transposable elements.

766 citations

Posted Content
TL;DR: This work formulates a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture, and shows state-of-the-art performance compared to existing neural network methodologies.
Abstract: The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation. The Fourier neural operator is the first ML-based method to successfully model turbulent flows with zero-shot super-resolution. It is up to three orders of magnitude faster compared to traditional PDE solvers. Additionally, it achieves superior accuracy compared to previous learning-based solvers under fixed resolution.
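To make the abstract's central idea concrete, here is a minimal sketch of a single spectral-convolution ("Fourier") layer: the learnable kernel is parameterized directly in Fourier space and acts only on the lowest retained modes. This is an illustrative NumPy toy, not the authors' implementation; the shapes, the random weights, and the plain skip connection standing in for the learned pointwise path are all placeholder assumptions.

```python
# Minimal sketch of one Fourier layer from a Fourier Neural Operator (FNO),
# assuming a 1-D problem (e.g. Burgers' equation) and NumPy only.
# Key idea from the abstract: the integral kernel is parameterized directly
# in Fourier space and applied to the lowest `n_modes` frequencies.
import numpy as np

rng = np.random.default_rng(0)

def fourier_layer(u, weights, n_modes):
    """Apply a spectral convolution to u (shape: [channels, grid_points]).

    weights: complex array of shape [in_channels, out_channels, n_modes],
             the kernel in Fourier space (randomly initialized here).
    """
    u_hat = np.fft.rfft(u, axis=-1)                 # to Fourier space
    out_hat = np.zeros_like(u_hat)
    # Multiply only the lowest n_modes frequencies by the kernel; higher
    # modes are truncated, which makes the layer resolution-independent.
    out_hat[:, :n_modes] = np.einsum("iok,ik->ok", weights, u_hat[:, :n_modes])
    return np.fft.irfft(out_hat, n=u.shape[-1], axis=-1)  # back to physical space

channels, grid, n_modes = 4, 256, 16
u = rng.standard_normal((channels, grid))           # input function sampled on a grid
w = rng.standard_normal((channels, channels, n_modes)) \
    + 1j * rng.standard_normal((channels, channels, n_modes))
# A full FNO block adds a learned pointwise linear transform of u instead of
# the plain skip connection used here, and uses GELU rather than ReLU.
v = np.maximum(fourier_layer(u, w, n_modes) + u, 0.0)
print(v.shape)  # (4, 256)
```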

762 citations

Journal Article
TL;DR: It is shown that in the state-feedback case one can come arbitrarily close to the optimal (even over full-information controllers) mixed H2/H∞ performance measure using constant-gain state feedback.
Abstract: The problem of finding an internally stabilizing controller that minimizes a mixed H2/H∞ performance measure subject to an inequality constraint on the H∞ norm of another closed-loop transfer function is considered. This problem can be interpreted and motivated as a problem of optimal nominal performance subject to a robust stability constraint. Both the state-feedback and output-feedback problems are considered. It is shown that in the state-feedback case one can come arbitrarily close to the optimal (even over full-information controllers) mixed H2/H∞ performance measure using constant-gain state feedback. Moreover, the state-feedback problem can be converted into a convex optimization problem over a bounded subset of n×n and n×q real matrices, where n and q are, respectively, the state and input dimensions. Using the central H∞ estimator, it is shown that the output-feedback problem can be reduced to a state-feedback problem. In this case, the dimension of the resulting controller does not exceed the dimension of the generalized plant.
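The abstract's key computational point is that the state-feedback problem reduces to a convex program. The sketch below illustrates one conventional way such mixed H2/H∞ problems are posed as linear matrix inequalities with a single common Lyapunov variable and solved with cvxpy. It is not the paper's exact construction, and the plant (a damped double integrator), the performance weights, and the H∞ bound gamma are made-up placeholders.

```python
# Sketch of mixed H2 / H-infinity state-feedback synthesis via a standard LMI
# relaxation with a common Lyapunov variable, solved with cvxpy (illustration
# only; not the paper's exact algorithm, and the plant/weights are placeholders).
import numpy as np
import cvxpy as cp

# Plant: x' = A x + B1 w + B2 u, performance output z = Cz x + Dz u
A  = np.array([[0.0, 1.0],
               [0.0, -0.1]])
B1 = np.array([[0.0], [1.0]])                   # disturbance input
B2 = np.array([[0.0], [1.0]])                   # control input
n, p = B1.shape                                  # states, disturbances
m = B2.shape[1]                                  # control inputs
Cz = np.vstack([np.eye(n), np.zeros((m, n))])    # z = [x; u]
Dz = np.vstack([np.zeros((n, m)), np.eye(m)])
nz = Cz.shape[0]
gamma = 10.0                                     # prescribed H-infinity bound (assumption)

X = cp.Variable((n, n), symmetric=True)          # X = P^{-1}
Y = cp.Variable((m, n))                          # Y = K X, so K = Y X^{-1}
Z = cp.Variable((nz, nz), symmetric=True)        # trace(Z) upper-bounds the H2 cost

AX = A @ X + B2 @ Y                              # (A + B2 K) X
CX = Cz @ X + Dz @ Y                             # (Cz + Dz K) X
eps = 1e-4

constraints = [
    X >> eps * np.eye(n),
    # H2 part: Gramian-type inequality plus a Schur complement for the trace bound
    AX + AX.T + B1 @ B1.T << -eps * np.eye(n),
    cp.bmat([[Z, CX], [CX.T, X]]) >> 0,
    # H-infinity part: bounded real lemma (zero direct feedthrough from w assumed)
    cp.bmat([
        [AX + AX.T, B1,                 CX.T],
        [B1.T,      -gamma * np.eye(p), np.zeros((p, nz))],
        [CX,        np.zeros((nz, p)),  -gamma * np.eye(nz)],
    ]) << 0,
]

prob = cp.Problem(cp.Minimize(cp.trace(Z)), constraints)
prob.solve(solver=cp.SCS)
K = Y.value @ np.linalg.inv(X.value)             # constant-gain state feedback u = K x
print("H2 upper bound:", np.sqrt(prob.value))
print("K =", K)
```

Using one Lyapunov variable X for both constraints is a conservative relaxation; the gain it returns satisfies the H∞ bound and minimizes only an upper bound on the H2 cost, whereas the paper characterizes how close constant-gain state feedback can come to the true optimum.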

762 citations


Authors

Showing all 73693 results

Name                 H-index   Papers   Citations
Yi Cui               220       1,015    199,725
Yi Chen              217       4,342    293,080
David Miller         203       2,573    204,840
Hongjie Dai          197       570      182,579
Chris Sander         178       713      233,287
Richard A. Gibbs     172       889      249,708
Richard H. Friend    169       1,182    140,032
Charles M. Lieber    165       521      132,811
Jian-Kang Zhu        161       550      105,551
David W. Johnson     160       2,714    140,778
Robert Stone         160       1,756    167,901
Tobin J. Marks       159       1,621    111,604
Joseph Wang          158       1,282    98,799
Ed Diener            153       401      186,491
Wei Zheng            151       1,929    120,209
Network Information
Related Institutions (5)
University of Illinois at Urbana–Champaign
225.1K papers, 10.1M citations

98% related

Pennsylvania State University
196.8K papers, 8.3M citations

96% related

University of Wisconsin-Madison
237.5K papers, 11.8M citations

94% related

University of Minnesota
257.9K papers, 11.9M citations

94% related

Cornell University
235.5K papers, 12.2M citations

94% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   194
2022   834
2021   7,499
2020   7,699
2019   7,294
2018   6,840