Institution

Cornell University

Education · Ithaca, New York, United States
About: Cornell University is an education organization based in Ithaca, New York, United States. It is known for its research contributions in the topics of Population & Gene. The organization has 102,246 authors who have published 235,546 publications receiving 12,283,673 citations. The organization is also known as Cornell and CUI.


Papers
Journal Article
Claude Amsler, Michael Doser, Mario Antonelli, D. M. Asner +173 more · Institutions (86)
TL;DR: This biennial Review summarizes much of particle physics, using data from previous editions.

12,798 citations

Book
31 Jan 1986
TL;DR: Numerical Recipes: The Art of Scientific Computing is a complete text and reference book on scientific computing, with over 100 new routines (now well over 300 in all) plus upgraded versions of many of the original routines, and with many new topics presented at the same accessible level (one of the listed topics, Monte Carlo integration, is sketched after this entry).
Abstract: From the Publisher: This is the revised and greatly expanded Second Edition of the hugely popular Numerical Recipes: The Art of Scientific Computing. The product of a unique collaboration among four leading scientists in academic research and industry, Numerical Recipes is a complete text and reference book on scientific computing. In a self-contained manner it proceeds from mathematical and theoretical considerations to actual practical computer routines. With over 100 new routines (now well over 300 in all), plus upgraded versions of many of the original routines, this book is more than ever the most practical, comprehensive handbook of scientific computing available today. The book retains the informal, easy-to-read style that made the first edition so popular, with many new topics presented at the same accessible level. In addition, some sections of more advanced material have been introduced, set off in small type from the main body of the text. Numerical Recipes is an ideal textbook for scientists and engineers and an indispensable reference for anyone who works in scientific computing. Highlights of the new material include a new chapter on integral equations and inverse methods; multigrid methods for solving partial differential equations; improved random number routines; wavelet transforms; the statistical bootstrap method; a new chapter on "less-numerical" algorithms including compression coding and arbitrary precision arithmetic; band diagonal linear systems; linear algebra on sparse matrices; Cholesky and QR decomposition; calculation of numerical derivatives; Padé approximants and rational Chebyshev approximation; new special functions; Monte Carlo integration in high-dimensional spaces; globally convergent methods for sets of nonlinear equations; an expanded chapter on fast Fourier methods; spectral analysis of unevenly sampled data; Savitzky-Golay smoothing filters; and two-dimensional Kolmogorov-Smirnov tests. All this is in addition to material on such basic top

12,662 citations
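The blurb above lists "Monte Carlo integration in high-dimensional spaces" among the book's topics. The short Python sketch below is a generic illustration of that idea under simple assumptions (an arbitrary integrand over the unit hypercube); it is not code from Numerical Recipes.

import random

def monte_carlo_integrate(f, dim, n_samples=100_000, seed=0):
    """Estimate the integral of f over the unit hypercube [0, 1]^dim
    by averaging f at uniformly drawn sample points."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.random() for _ in range(dim)]
        total += f(x)
    return total / n_samples  # the unit cube has volume 1

# Example: integrate sum(x_i^2) over [0, 1]^6; the exact value is 6 * 1/3 = 2.
estimate = monte_carlo_integrate(lambda x: sum(xi * xi for xi in x), dim=6)
print(estimate)  # close to 2.0

The appeal in high dimensions is that the averaging error shrinks roughly as 1/sqrt(n_samples) regardless of the dimension, whereas grid-based quadrature costs grow exponentially with the number of dimensions.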

Journal Article
Adam Auton, Gonçalo R. Abecasis, David Altshuler, Richard Durbin +514 more · Institutions (90)
01 Oct 2015-Nature
TL;DR: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations, and has reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping.
Abstract: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations. Here we report completion of the project, having reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping. We characterized a broad spectrum of genetic variation, in total over 88 million variants (84.7 million single nucleotide polymorphisms (SNPs), 3.6 million short insertions/deletions (indels), and 60,000 structural variants), all phased onto high-quality haplotypes. This resource includes >99% of SNP variants with a frequency of >1% for a variety of ancestries. We describe the distribution of genetic variation across the global sample, and discuss the implications for common disease studies.

12,661 citations

Proceedings Article
Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, Piotr Dollár
07 Aug 2017
TL;DR: This paper addresses the extreme foreground-background class imbalance encountered when training dense detectors by reshaping the standard cross entropy loss so that it down-weights the loss assigned to well-classified examples. The resulting Focal Loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training (a short illustrative sketch follows this entry).
Abstract: The highest accuracy object detectors to date are based on a two-stage approach popularized by R-CNN, where a classifier is applied to a sparse set of candidate object locations. In contrast, one-stage detectors that are applied over a regular, dense sampling of possible object locations have the potential to be faster and simpler, but have trailed the accuracy of two-stage detectors thus far. In this paper, we investigate why this is the case. We discover that the extreme foreground-background class imbalance encountered during training of dense detectors is the central cause. We propose to address this class imbalance by reshaping the standard cross entropy loss such that it down-weights the loss assigned to well-classified examples. Our novel Focal Loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training. To evaluate the effectiveness of our loss, we design and train a simple dense detector we call RetinaNet. Our results show that when trained with the focal loss, RetinaNet is able to match the speed of previous one-stage detectors while surpassing the accuracy of all existing state-of-the-art two-stage detectors.

12,161 citations
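The abstract above describes the focal loss only in words. Below is a minimal, framework-free Python sketch of the binary form of the idea; the defaults gamma = 2 and alpha = 0.25 are the values commonly associated with RetinaNet and are used here as assumptions rather than taken from this listing.

import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: cross entropy scaled by (1 - p_t)**gamma so that
    well-classified examples (p_t near 1) contribute very little loss."""
    p = np.clip(p, 1e-7, 1 - 1e-7)                # avoid log(0)
    p_t = np.where(y == 1, p, 1 - p)              # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)  # class-balancing weight
    return -np.mean(alpha_t * (1 - p_t) ** gamma * np.log(p_t))

# An easy, well-classified positive contributes far less loss than a hard one.
print(focal_loss(np.array([0.95]), np.array([1])))  # tiny
print(focal_loss(np.array([0.30]), np.array([1])))  # much larger

Setting gamma = 0 recovers ordinary (alpha-weighted) cross entropy, which is a convenient sanity check on the implementation.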

Journal Article
TL;DR: Members of the Chamber Quantification Writing Group are: Roberto M. Lang, MD, FASE, Michelle Bierig, MPH, RDCS, FASE, Richard B. Devereux, MD, Frank A. Flachskampf, MD, and Elyse Foster, MD.
Abstract: Members of the Chamber Quantification Writing Group are: Roberto M. Lang, MD, FASE, Michelle Bierig, MPH, RDCS, FASE, Richard B. Devereux, MD, Frank A. Flachskampf, MD, Elyse Foster, MD, Patricia A. Pellikka, MD, Michael H. Picard, MD, Mary J. Roman, MD, James Seward, MD, Jack S. Shanewise, MD, FASE, Scott D. Solomon, MD, Kirk T. Spencer, MD, FASE, Martin St John Sutton, MD, FASE, and William J. Stewart, MD

10,834 citations


Authors

Showing all 103,081 results

Name                   H-index   Papers   Citations
Eric S. Lander         301       826      525,976
David Miller           203       2,573    204,840
Lewis C. Cantley       196       748      169,037
Charles A. Dinarello   190       1,058    139,668
Scott M. Grundy        187       841      231,821
Paul G. Richardson     183       1,533    155,912
Chris Sander           178       713      233,287
David R. Williams      178       2,034    138,789
David L. Kaplan        177       1,944    146,082
Kari Alitalo           174       817      114,231
Richard K. Wilson      173       463      260,000
George F. Koob         171       935      112,521
Avshalom Caspi         170       524      113,583
Derek R. Lovley        168       582      95,315
Stephen B. Baylin      168       548      188,934
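For readers unfamiliar with the H-index column above: an author's h-index is the largest h such that at least h of their papers have at least h citations each. A minimal Python sketch, using a hypothetical citation list rather than data from this table:

def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # the top `rank` papers all have at least `rank` citations
        else:
            break
    return h

# Hypothetical example: four papers have at least 4 citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
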
Network Information
Related Institutions (5)
University of Pennsylvania
257.6K papers, 14.1M citations

96% related

Stanford University
320.3K papers, 21.8M citations

96% related

University of Washington
305.5K papers, 17.7M citations

96% related

Columbia University
224K papers, 12.8M citations

96% related

Yale University
220.6K papers, 12.8M citations

95% related

Performance
Metrics
No. of papers from the Institution in previous years
Year    Papers
2023    309
2022    1,362
2021    12,457
2020    12,139
2019    10,787
2018    9,905