Institution
Purdue University
Education • West Lafayette, Indiana, United States
About: Purdue University is an education organization based in West Lafayette, Indiana, United States. It is known for its research contributions in the topics of Population & Heat transfer. The organization has 73219 authors who have published 163563 publications receiving 5775236 citations. The organization is also known as: Purdue & Purdue-West Lafayette.
Papers published on a yearly basis
Papers
TL;DR: In this paper, the authors trace the 100-year research history of black phosphorus, from synthesis to material properties, and extend the topic from black phosphorus to phosphorene, aiming at further applications in electronic and optoelectronic devices.
Abstract: Phosphorus is one of the most abundant elements on Earth, constituting roughly 0.1% of the Earth's crust. In general, phosphorus has several allotropes. The two most commonly seen allotropes, white and red phosphorus, are widely used in explosives and safety matches. In addition, black phosphorus, though rarely mentioned, is a layered semiconductor with great potential in optical and electronic applications. Remarkably, owing to its van der Waals structure, this layered material can be reduced to a single atomic layer, known as phosphorene, whose physical properties can differ tremendously from those of its bulk counterpart. In this review article, we trace the 100-year research history of black phosphorus, from synthesis to material properties, and extend the topic from black phosphorus to phosphorene. The physical and transport properties are highlighted, aiming at further applications in electronic and optoelectronic devices.
766 citations
TL;DR: It is suggested that speed of processing should be viewed as a fundamental part of the architecture of the cognitive system as it develops across the entire lifespan.
766 citations
TL;DR: Transposable elements were first discovered in plants, where they can have tremendous effects on genome structure and gene function, and they may be responsible for the rate at which incompatibility is generated in separated populations.
Abstract: Transposable elements were first discovered in plants because they can have tremendous effects on genome structure and gene function. Although only a few or no elements may be active within a genome at any time in any individual, the genomic alterations they cause can have major outcomes for a species. All major element types appear to be present in all plant species, but their quantitative and qualitative contributions are enormously variable even between closely related lineages. In some large-genome plants, mobile DNAs make up the majority of the nuclear genome. They can rearrange genomes and alter individual gene structure and regulation through any of the activities they promote: transposition, insertion, excision, chromosome breakage, and ectopic recombination. Many genes may have been assembled or amplified through the action of transposable elements, and it is likely that most plant genes contain legacies of multiple transposable element insertions into promoters. Because chromosomal rearrangements can lead to speciating infertility in heterozygous progeny, transposable elements may be responsible for the rate at which such incompatibility is generated in separated populations. For these reasons, understanding plant gene and genome evolution is only possible if we comprehend the contributions of transposable elements.
766 citations
TL;DR: This work formulates a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture, and shows state-of-the-art performance compared to existing neural network methodologies.
Abstract: The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation. The Fourier neural operator is the first ML-based method to successfully model turbulent flows with zero-shot super-resolution. It is up to three orders of magnitude faster compared to traditional PDE solvers. Additionally, it achieves superior accuracy compared to previous learning-based solvers under fixed resolution.
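The core idea the abstract describes, parameterizing the integral kernel directly in Fourier space, can be illustrated with a minimal 1-D spectral convolution. This is a hypothetical NumPy sketch (function and variable names are my own, not from the paper); a real Fourier neural operator would stack several such layers, each paired with a pointwise linear transform and a nonlinearity, with the complex mode weights learned by gradient descent.

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """One Fourier layer: FFT, keep the lowest n_modes frequencies,
    multiply each retained mode by a learned complex weight, inverse FFT.
    Truncating to low modes is what makes the layer resolution-independent."""
    u_hat = np.fft.rfft(u)                          # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights   # kernel acts mode-wise
    return np.fft.irfft(out_hat, n=len(u))          # back to physical space

# Usage: a random "function" sampled on 64 grid points, 16 retained modes.
rng = np.random.default_rng(0)
u = rng.standard_normal(64)
w = rng.standard_normal(16) + 1j * rng.standard_normal(16)
v = spectral_conv_1d(u, w, 16)
print(v.shape)  # (64,)
```

Because the weights live on Fourier modes rather than grid points, the same layer can be evaluated on a finer grid than it was trained on, which is what enables the zero-shot super-resolution mentioned in the abstract.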
762 citations
TL;DR: It is shown that in the state-feedback case one can come arbitrarily close to the optimal (even over full information controllers) mixed H₂/H∞ performance measure using constant gain state feedback.
Abstract: The problem of finding an internally stabilizing controller that minimizes a mixed H₂/H∞ performance measure subject to an inequality constraint on the H∞ norm of another closed-loop transfer function is considered. This problem can be interpreted and motivated as a problem of optimal nominal performance subject to a robust stability constraint. Both the state-feedback and output-feedback problems are considered. It is shown that in the state-feedback case one can come arbitrarily close to the optimal (even over full information controllers) mixed H₂/H∞ performance measure using constant gain state feedback. Moreover, the state-feedback problem can be converted into a convex optimization problem over a bounded subset of real matrices of dimensions n×n and n×q, where n and q are, respectively, the state and input dimensions. Using the central H∞ estimator, it is shown that the output-feedback problem can be reduced to a state-feedback problem. In this case, the dimension of the resulting controller does not exceed the dimension of the generalized plant.
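In symbols, the constrained problem the abstract describes can be sketched as follows (the notation is assumed for illustration, not taken from the paper): with T_{z₂w}(K) and T_{z∞w}(K) the two closed-loop transfer functions from disturbance w under controller K, and γ the given H∞ bound,

```latex
\min_{K \ \text{internally stabilizing}} \ \left\| T_{z_2 w}(K) \right\|_2
\quad \text{subject to} \quad \left\| T_{z_\infty w}(K) \right\|_\infty < \gamma .
```

The H₂ objective captures nominal performance, while the H∞ constraint enforces robust stability, which is the interpretation the abstract gives.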
762 citations
Authors
Showing all 73693 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yi Cui | 220 | 1015 | 199725 |
Yi Chen | 217 | 4342 | 293080 |
David Miller | 203 | 2573 | 204840 |
Hongjie Dai | 197 | 570 | 182579 |
Chris Sander | 178 | 713 | 233287 |
Richard A. Gibbs | 172 | 889 | 249708 |
Richard H. Friend | 169 | 1182 | 140032 |
Charles M. Lieber | 165 | 521 | 132811 |
Jian-Kang Zhu | 161 | 550 | 105551 |
David W. Johnson | 160 | 2714 | 140778 |
Robert Stone | 160 | 1756 | 167901 |
Tobin J. Marks | 159 | 1621 | 111604 |
Joseph Wang | 158 | 1282 | 98799 |
Ed Diener | 153 | 401 | 186491 |
Wei Zheng | 151 | 1929 | 120209 |