Institution

Stanford University

EducationStanford, California, United States
About: Stanford University is an education organization based in Stanford, California, United States. It is known for research contributions in the topics of Population and Transplantation. The organization has 125,751 authors who have published 320,347 publications receiving 21,892,059 citations. The organization is also known as: Leland Stanford Junior University and University of Stanford.
Topics: Population, Transplantation, Cancer, Gene, Health care


Papers
Journal ArticleDOI
TL;DR: The open-source software package DADA2 for modeling and correcting Illumina-sequenced amplicon errors is presented, revealing a diversity of previously undetected Lactobacillus crispatus variants.
Abstract: We present the open-source software package DADA2 for modeling and correcting Illumina-sequenced amplicon errors (https://github.com/benjjneb/dada2). DADA2 infers sample sequences exactly and resolves differences of as little as 1 nucleotide. In several mock communities, DADA2 identified more real variants and output fewer spurious sequences than other methods. We applied DADA2 to vaginal samples from a cohort of pregnant women, revealing a diversity of previously undetected Lactobacillus crispatus variants.

14,505 citations
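DADA2 itself is an R package (distributed at the GitHub URL above), and its error model is considerably more sophisticated than what fits here. Purely as an illustration of the underlying idea of abundance-based denoising, assigning rare reads to a nearby abundant sequence instead of treating every 1-nucleotide variant as real, here is a toy Python sketch; every name and threshold in it is illustrative, not part of DADA2:

```python
from collections import Counter

def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(c1 != c2 for c1, c2 in zip(a, b))

def denoise(reads, max_dist=1):
    """Greedy toy denoiser: walk sequences from most to least abundant,
    keeping a sequence as a new 'true' variant only if no already-kept
    sequence lies within max_dist mismatches. Rare near-duplicates are
    thereby absorbed as presumed sequencing errors."""
    counts = Counter(reads)
    centers = []
    for seq, _ in counts.most_common():
        for c in centers:
            if len(c) == len(seq) and hamming(c, seq) <= max_dist:
                break  # absorbed into an existing abundant variant
        else:
            centers.append(seq)
    return centers

# ten clean reads, one 1-mismatch error read, and a distinct second variant
reads = ["ACGT"] * 10 + ["ACGA"] * 1 + ["TTTT"] * 5
centers = denoise(reads)
```

The real algorithm replaces the fixed distance cutoff with a statistical test of whether a variant's abundance is too high to be explained by the error rates of its closest abundant neighbor.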

Journal ArticleDOI
TL;DR: In this article, the authors discuss the problem of estimating the sampling distribution of a pre-specified random variable R(X, F) on the basis of the observed data x.
Abstract: We discuss the following problem: given a random sample X = (X1, X2, …, Xn) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F) on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case R(X, F) = \(\theta(\hat F) - \theta(F)\), where θ is some parameter of interest.) A general method, called the "bootstrap", is introduced and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.

14,483 citations
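The bootstrap idea above, resampling the observed data with replacement to approximate the sampling distribution of a statistic, can be sketched in a few lines of Python. This applies it to one of the paper's own examples, the variance of the sample median; the sample sizes and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=100)  # the observed sample

# Draw B bootstrap resamples of the same size, with replacement,
# and recompute the statistic R(X, F) = median on each one.
B = 2000
medians = np.array([
    np.median(rng.choice(x, size=x.size, replace=True))
    for _ in range(B)
])

# The spread of the bootstrap replicates estimates the sampling
# variability of the median around its true value.
se_boot = medians.std(ddof=1)
```

The empirical distribution of `medians` stands in for the unknown sampling distribution of the median, which has no simple closed-form variance, which is exactly the kind of problem the paper targets.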

Journal ArticleDOI
TL;DR: The 1971 preliminary criteria for the classification of systemic lupus erythematosus (SLE) were revised and updated to incorporate new immunologic knowledge and improve disease classification and showed gains in sensitivity and specificity.
Abstract: The 1971 preliminary criteria for the classification of systemic lupus erythematosus (SLE) were revised and updated to incorporate new immunologic knowledge and improve disease classification. The 1982 revised criteria include fluorescence antinuclear antibody and antibody to native DNA and Sm antigen. Some criteria involving the same organ systems were aggregated into single criteria. Raynaud's phenomenon and alopecia were not included in the 1982 revised criteria because of low sensitivity and specificity. The new criteria were 96% sensitive and 96% specific when tested with SLE and control patient data gathered from 18 participating clinics. When compared with the 1971 criteria, the 1982 revised criteria showed gains in sensitivity and specificity.

14,272 citations

Journal ArticleDOI
TL;DR: In comparative timings, the new algorithms are considerably faster than competing methods and can handle large problems and can also deal efficiently with sparse features.
Abstract: We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ1 (the lasso), ℓ2 (ridge regression), and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.

13,656 citations
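The core move in the abstract, cyclical coordinate descent, updates one coefficient at a time while holding the rest fixed; for the lasso the per-coordinate update is a soft-thresholding step. The following is a minimal illustrative sketch of that loop for the lasso case only, not the glmnet implementation (no regularization path, no standardization, fixed sweep count, dense features):

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator: the closed-form single-coordinate
    minimizer for squared error plus an l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclical coordinate descent for (1/2n)||y - X b||^2 + lam ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = y - X @ beta + X[:, j] * beta[j]
            zj = X[:, j] @ r / n
            beta[j] = soft_threshold(zj, lam) / col_sq[j]
    return beta

# synthetic check: a sparse true coefficient vector
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_beta = np.array([2.0, 0.0, 0.0, 1.0, 0.0])
y = X @ true_beta + 0.1 * rng.normal(size=200)
beta_hat = lasso_cd(X, y, lam=0.05)
```

Because each coordinate update is closed-form and touches only one column, sweeps are cheap, which is what makes the approach fast on large and sparse problems.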

Journal ArticleDOI
22 Dec 2000-Science
TL;DR: An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.
Abstract: Scientists working with large volumes of high-dimensional data, such as global climate patterns, stellar spectra, or human gene distributions, regularly confront the problem of dimensionality reduction: finding meaningful low-dimensional structures hidden in their high-dimensional observations. The human brain confronts the same problem in everyday perception, extracting from its high-dimensional sensory inputs (30,000 auditory nerve fibers or 10^6 optic nerve fibers) a manageably small number of perceptually relevant features. Here we describe an approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set. Unlike classical techniques such as principal component analysis (PCA) and multidimensional scaling (MDS), our approach is capable of discovering the nonlinear degrees of freedom that underlie complex natural observations, such as human handwriting or images of a face under different viewing conditions. In contrast to previous algorithms for nonlinear dimensionality reduction, ours efficiently computes a globally optimal solution, and, for an important class of data manifolds, is guaranteed to converge asymptotically to the true structure.

13,652 citations
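The algorithm this abstract describes is Isomap, which builds a neighborhood graph, approximates geodesic distances along the manifold with shortest paths, and then applies classical MDS. As a usage sketch (assuming scikit-learn is installed), its `Isomap` estimator can flatten a synthetic "swiss roll", a 2-D sheet rolled up in 3-D that PCA cannot unroll:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# 800 points on a 3-D swiss-roll manifold with 2 intrinsic dimensions
X, t = make_swiss_roll(n_samples=800, random_state=0)

# local metric information (10 nearest neighbors) is used to estimate
# geodesic distances, then classical MDS recovers a 2-D embedding
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
```

The choice of `n_neighbors` controls the locality assumption: too small and the neighborhood graph disconnects, too large and shortcuts across the roll destroy the geodesic structure.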


Authors

Showing all 127468 results

Name | H-index | Papers | Citations
Eric S. Lander | 301 | 826 | 525,976
George M. Whitesides | 240 | 1,739 | 269,833
Yi Cui | 220 | 1,015 | 199,725
Yi Chen | 217 | 4,342 | 293,080
David Miller | 203 | 2,573 | 204,840
David Baltimore | 203 | 876 | 162,955
Edward Witten | 202 | 602 | 204,199
Irving L. Weissman | 201 | 1,141 | 172,504
Hongjie Dai | 197 | 570 | 182,579
Robert M. Califf | 196 | 1,561 | 167,961
Frank E. Speizer | 193 | 636 | 135,891
Thomas C. Südhof | 191 | 653 | 118,007
Gad Getz | 189 | 520 | 247,560
Mark Hallett | 186 | 1,170 | 123,741
John P. A. Ioannidis | 185 | 1,311 | 193,612
Network Information

Related Institutions (5)
Columbia University: 224K papers, 12.8M citations (97% related)
University of Washington: 305.5K papers, 17.7M citations (97% related)
University of Pennsylvania: 257.6K papers, 14.1M citations (96% related)
Harvard University: 530.3K papers, 38.1M citations (96% related)
University of Michigan: 342.3K papers, 17.6M citations (96% related)

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 504
2022 | 2,786
2021 | 17,862
2020 | 18,235
2019 | 16,190
2018 | 14,684