SciSpace (formerly Typeset)
Institution

Stanford University

Education
Stanford, California, United States
About: Stanford University is an education organization based in Stanford, California, United States. It is known for research contributions in the topics of Population & Transplantation. The organization has 125751 authors who have published 320347 publications receiving 21892059 citations. The organization is also known as: Leland Stanford Junior University & University of Stanford.
Topics: Population, Transplantation, Cancer, Gene, Health care
Papers

Journal ArticleDOI
01 May 2000 - Nature Genetics
TL;DR: The goal of the Gene Ontology Consortium is to produce a dynamic, controlled vocabulary that can be applied to all eukaryotes even as knowledge of gene and protein roles in cells is accumulating and changing.
Abstract: Genomic sequencing has made it clear that a large fraction of the genes specifying the core biological functions are shared by all eukaryotes. Knowledge of the biological role of such shared proteins in one organism can often be transferred to other organisms. The goal of the Gene Ontology Consortium is to produce a dynamic, controlled vocabulary that can be applied to all eukaryotes even as knowledge of gene and protein roles in cells is accumulating and changing. To this end, three independent ontologies accessible on the World-Wide Web (http://www.geneontology.org) are being constructed: biological process, molecular function and cellular component.

30,473 citations


Journal ArticleDOI
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract: We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.

27,392 citations
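The three-level generative story in the abstract above (documents as finite mixtures over topics, topics as distributions over words) can be illustrated with a toy collapsed Gibbs sampler. Everything below — the corpus, hyperparameters, and function name — is an illustrative assumption on my part; the paper itself uses variational inference and EM, not Gibbs sampling.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, K, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed Gibbs sampler for LDA over tokenized documents."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})            # vocabulary size
    ndk = [[0] * K for _ in docs]                    # doc-topic counts
    nkw = [defaultdict(int) for _ in range(K)]       # topic-word counts
    nk = [0] * K                                     # tokens per topic
    z = []                                           # topic assignment per token
    for d, doc in enumerate(docs):                   # random initialization
        zs = []
        for w in doc:
            t = rng.randrange(K)
            zs.append(t)
            ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
        z.append(zs)
    for _ in range(iters):                           # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                # P(topic k) ∝ (doc-topic count + alpha) * (topic-word count + beta)
                weights = [(ndk[d][k] + alpha) * (nkw[k][w] + beta) / (nk[k] + V * beta)
                           for k in range(K)]
                t = rng.choices(range(K), weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return ndk, nkw

# Hypothetical four-document corpus with two latent themes.
docs = [["gene", "dna", "cell"], ["dna", "gene", "protein"],
        ["stock", "market", "trade"], ["market", "stock", "price"]]
ndk, nkw = lda_gibbs(docs, K=2)
```

After sampling, each row of `ndk` gives a document's topic mixture, which is the "explicit representation of a document" the abstract refers to.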


Journal ArticleDOI
Abstract: The ImageNet Large Scale Visual Recognition Challenge is a benchmark in object category classification and detection on hundreds of object categories and millions of images. The challenge has been run annually from 2010 to present, attracting participation from more than fifty institutions. This paper describes the creation of this benchmark dataset and the advances in object recognition that have been possible as a result. We discuss the challenges of collecting large-scale ground truth annotation, highlight key breakthroughs in categorical object recognition, provide a detailed analysis of the current state of the field of large-scale image classification and object detection, and compare the state-of-the-art computer vision accuracy with human accuracy. We conclude with lessons learned in the 5 years of the challenge, and propose future directions and improvements.

25,260 citations


Proceedings ArticleDOI
01 Oct 2014
TL;DR: A new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
Abstract: Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the origin of these regularities has remained opaque. We analyze and make explicit the model properties needed for such regularities to emerge in word vectors. The result is a new global log-bilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods. Our model efficiently leverages statistical information by training only on the nonzero elements in a word-word co-occurrence matrix, rather than on the entire sparse matrix or on individual context windows in a large corpus. The model produces a vector space with meaningful substructure, as evidenced by its performance of 75% on a recent word analogy task. It also outperforms related models on similarity tasks and named entity recognition.

23,307 citations
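The log-bilinear objective the abstract describes is a weighted least-squares fit of w_i · w̃_j + b_i + b̃_j to log X_ij over the nonzero co-occurrence counts. A minimal sketch of that objective follows; the toy counts, dimensions, and learning rate are assumptions for illustration, not the authors' released implementation.

```python
import math
import random

def train_glove(cooc, dim=5, epochs=100, lr=0.05, xmax=10, alpha=0.75, seed=0):
    """Tiny GloVe-style trainer: weighted least squares on log co-occurrences."""
    rng = random.Random(seed)
    words = sorted({w for pair in cooc for w in pair})
    W = {w: [rng.uniform(-0.5, 0.5) for _ in range(dim)] for w in words}   # word vectors
    Wc = {w: [rng.uniform(-0.5, 0.5) for _ in range(dim)] for w in words}  # context vectors
    b = {w: 0.0 for w in words}                                            # word biases
    bc = {w: 0.0 for w in words}                                           # context biases
    for _ in range(epochs):
        for (wi, wj), x in cooc.items():                 # nonzero entries only
            f = min(1.0, (x / xmax) ** alpha)            # weighting f(X_ij), capped at 1
            dot = sum(a * c for a, c in zip(W[wi], Wc[wj]))
            err = dot + b[wi] + bc[wj] - math.log(x)     # fit to log co-occurrence
            g = 2.0 * f * err
            for d in range(dim):                         # gradient step on both vectors
                wd = W[wi][d]
                W[wi][d] -= lr * g * Wc[wj][d]
                Wc[wj][d] -= lr * g * wd
            b[wi] -= lr * g
            bc[wj] -= lr * g
    return W, Wc, b, bc

# Hypothetical toy counts, loosely echoing the paper's ice/steam illustration.
cooc = {("ice", "cold"): 8, ("ice", "solid"): 6, ("ice", "water"): 5,
        ("steam", "hot"): 8, ("steam", "gas"): 6, ("steam", "water"): 5}
W, Wc, b, bc = train_glove(cooc)
```

Training only on the nonzero entries of `cooc` is the efficiency point the abstract makes: the full sparse matrix and individual context windows are never touched.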


Book
01 Jan 1957
Abstract: Cognitive dissonance theory links actions and attitudes. It holds that dissonance is experienced whenever one cognition that a person holds follows from the opposite of at least one other cognition that the person holds. The magnitude of dissonance is directly proportional to the number of discrepant cognitions and inversely proportional to the number of consonant cognitions that a person has. The relative weight of any discrepant or consonant element is a function of its importance.

22,512 citations
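The proportionality stated in the abstract is often rendered as a dissonance ratio. One conventional formulation, with importance weights I for each cognition (an assumed notation, not a formula quoted from the book), is:

```latex
D = \frac{\sum_{i \in \text{dissonant}} I_i}{\sum_{i \in \text{dissonant}} I_i \;+\; \sum_{j \in \text{consonant}} I_j}
```

D grows with the number and importance of discrepant cognitions and shrinks as consonant cognitions accumulate, matching both proportionality claims above.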


Authors

Showing all 125751 results

Name | H-index | Papers | Citations
Eric S. Lander | 301 | 826 | 525,976
George M. Whitesides | 240 | 1,739 | 269,833
Yi Cui | 220 | 1,015 | 199,725
Yi Chen | 217 | 4,342 | 293,080
David Miller | 203 | 2,573 | 204,840
David Baltimore | 203 | 876 | 162,955
Edward Witten | 202 | 602 | 204,199
Irving L. Weissman | 201 | 1,141 | 172,504
Hongjie Dai | 197 | 570 | 182,579
Robert M. Califf | 196 | 1,561 | 167,961
Frank E. Speizer | 193 | 636 | 135,891
Thomas C. Südhof | 191 | 653 | 118,007
Gad Getz | 189 | 520 | 247,560
Mark Hallett | 186 | 1,170 | 123,741
John P. A. Ioannidis | 185 | 1,311 | 193,612
Network Information
Related Institutions (5)
University of California, San Diego

204.5K papers, 12.3M citations

98% related

University of Southern California

169.9K papers, 7.8M citations

97% related

Columbia University

224K papers, 12.8M citations

97% related

University of California, Irvine

113.6K papers, 5.5M citations

97% related

Duke University

200.3K papers, 10.7M citations

97% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2022 | 177
2021 | 17,830
2020 | 18,226
2019 | 16,189
2018 | 14,684
2017 | 14,653

Top Attributes


Institution's top 5 most impactful journals

Social Science Research Network

6.8K papers, 333.2K citations

bioRxiv

3.6K papers, 19.5K citations

Science

2.5K papers, 719.5K citations

Nature

2.2K papers, 787.1K citations