scispace - formally typeset
Institution

Stanford University

EducationStanford, California, United States
About: Stanford University is an education organization based in Stanford, California, United States. It is known for research contributions in the topics: Population & Transplantation. The organization has 125,751 authors who have published 320,347 publications receiving 21,892,059 citations. The organization is also known as: Leland Stanford Junior University & University of Stanford.
Topics: Population, Transplantation, Medicine, Cancer, Gene


Papers
Journal ArticleDOI
S. Agostinelli1, John Allison2, K. Amako3, J. Apostolakis4, Henrique Araujo5, P. Arce4, Makoto Asai6, D. Axen4, S. Banerjee7, G. Barrand, F. Behner4, Lorenzo Bellagamba8, J. Boudreau9, L. Broglia10, A. Brunengo8, H. Burkhardt4, Stephane Chauvie, J. Chuma11, R. Chytracek4, Gene Cooperman12, G. Cosmo4, P. V. Degtyarenko13, Andrea Dell'Acqua4, G. Depaola14, D. Dietrich15, R. Enami, A. Feliciello, C. Ferguson16, H. Fesefeldt4, Gunter Folger4, Franca Foppiano, Alessandra Forti2, S. Garelli, S. Gianì4, R. Giannitrapani17, D. Gibin4, J. J. Gomez Y Cadenas4, I. González4, G. Gracia Abril4, G. Greeniaus18, Walter Greiner15, Vladimir Grichine, A. Grossheim4, Susanna Guatelli, P. Gumplinger11, R. Hamatsu19, K. Hashimoto, H. Hasui, A. Heikkinen20, A. S. Howard5, Vladimir Ivanchenko4, A. Johnson6, F.W. Jones11, J. Kallenbach, Naoko Kanaya4, M. Kawabata, Y. Kawabata, M. Kawaguti, S.R. Kelner21, Paul R. C. Kent22, A. Kimura23, T. Kodama24, R. P. Kokoulin21, M. Kossov13, Hisaya Kurashige25, E. Lamanna26, Tapio Lampén20, V. Lara4, Veronique Lefebure4, F. Lei16, M. Liendl4, W. S. Lockman, Francesco Longo27, S. Magni, M. Maire, E. Medernach4, K. Minamimoto24, P. Mora de Freitas, Yoshiyuki Morita3, K. Murakami3, M. Nagamatu24, R. Nartallo28, Petteri Nieminen28, T. Nishimura, K. Ohtsubo, M. Okamura, S. W. O'Neale29, Y. Oohata19, K. Paech15, J Perl6, Andreas Pfeiffer4, Maria Grazia Pia, F. Ranjard4, A.M. Rybin, S.S Sadilov4, E. Di Salvo8, Giovanni Santin27, Takashi Sasaki3, N. Savvas2, Y. Sawada, Stefan Scherer15, S. Sei24, V. Sirotenko4, David J. Smith6, N. Starkov, H. Stoecker15, J. Sulkimo20, M. Takahata23, Satoshi Tanaka30, E. Tcherniaev4, E. Safai Tehrani6, M. Tropeano1, P. Truscott31, H. Uno24, L. Urbán, P. Urban32, M. Verderi, A. Walkden2, W. Wander33, H. Weber15, J.P. Wellisch4, Torre Wenaus34, D.C. Williams, Douglas Wright6, T. Yamada24, H. Yoshida24, D. Zschiesche15 
TL;DR: The Geant4 toolkit, as discussed by the authors, is a toolkit for simulating the passage of particles through matter, providing a complete range of functionality including tracking, geometry, physics models and hits.
Abstract: Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.
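Geant4 itself is a large C++ framework, but the core Monte Carlo transport idea it implements can be illustrated in a few lines. The toy sketch below (not the Geant4 API; the attenuation coefficient and slab thickness are arbitrary assumptions for the demo) samples exponential free paths for photons crossing a slab and compares the transmitted fraction with the Beer-Lambert prediction:

```python
import numpy as np

# Toy Monte Carlo photon transport: NOT the Geant4 API, just the core
# sampling idea. mu and L are hypothetical material parameters.
rng = np.random.default_rng(42)
mu = 2.0      # attenuation coefficient, 1/cm (assumed)
L = 1.0       # slab thickness, cm (assumed)
n = 100_000   # number of simulated photons

# Distance to the first interaction is exponentially distributed, mean 1/mu.
depth = rng.exponential(scale=1.0 / mu, size=n)

# A photon is transmitted if it crosses the slab without interacting.
transmitted = np.mean(depth > L)
expected = np.exp(-mu * L)  # Beer-Lambert law: exp(-2) ~ 0.135

print(f"simulated {transmitted:.4f} vs analytic {expected:.4f}")
```

A real Geant4 application layers geometry, materials, and competing physics processes on top of exactly this kind of free-path sampling, picking the process with the shortest sampled distance at each step.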

18,904 citations

Book
D.L. Donoho1
01 Jan 2004
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program, Basis Pursuit in signal processing.
Abstract: Suppose x is an unknown vector in ℝ^m (a digital image or signal); we plan to measure n general linear functionals of x and then reconstruct. If x is known to be compressible by transform coding with a known transform, and we reconstruct via the nonlinear procedure defined here, the number of measurements n can be dramatically smaller than the size m. Thus, certain natural classes of images with m pixels need only n = O(m^{1/4} log^{5/2}(m)) nonadaptive nonpixel samples for faithful recovery, as opposed to the usual m pixel samples. More specifically, suppose x has a sparse representation in some orthonormal basis (e.g., wavelet, Fourier) or tight frame (e.g., curvelet, Gabor), so the coefficients belong to an ℓ^p ball for 0 < p ≤ 1.
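The recovery procedure the abstract describes is Basis Pursuit, an ℓ1-minimization solved as a linear program. As a self-contained numerical illustration of the same phenomenon (recovering a sparse vector from far fewer measurements than its length), the sketch below uses greedy orthogonal matching pursuit as a simpler stand-in; the problem sizes, sparsity pattern, and random measurement matrix are all assumptions chosen for the demo:

```python
import numpy as np

# Sparse recovery demo: an m-dimensional signal measured with n << m
# random linear functionals. The paper analyzes Basis Pursuit (an l1
# linear program); this sketch uses greedy orthogonal matching pursuit
# (OMP) as a simpler stand-in for the recovery step.
rng = np.random.default_rng(0)
m, n, k = 100, 40, 3                           # signal length, measurements, sparsity
A = rng.standard_normal((n, m)) / np.sqrt(n)   # random measurement matrix
x_true = np.zeros(m)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]         # k-sparse ground truth (arbitrary)
b = A @ x_true                                 # n nonadaptive linear measurements

def omp(A, b, k):
    """Greedily pick the column most correlated with the residual, then
    refit by least squares over the selected support."""
    residual, support = b.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

x_hat = omp(A, b, k)
err = np.linalg.norm(x_hat - x_true)
```

With 40 measurements of a 100-dimensional, 3-sparse signal, the greedy recovery is essentially exact here, matching the paper's theme that n can be far smaller than m when x is sparse.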

18,609 citations

Journal ArticleDOI
TL;DR: A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
Abstract: Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient descent “boosting” paradigm is developed for additive expansions based on any fitting criterion. Specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification. Special enhancements are derived for the particular case where the individual additive components are regression trees, and tools for interpreting such “TreeBoost” models are presented. Gradient boosting of regression trees produces competitive, highly robust, interpretable procedures for both regression and classification, especially appropriate for mining less than clean data. Connections between this approach and the boosting methods of Freund and Schapire, and of Friedman, Hastie and Tibshirani, are discussed.
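For the least-squares loss, the paradigm in the abstract reduces to repeatedly fitting a weak learner to the current residuals (the negative gradient of squared error) and adding a shrunken copy to the ensemble. A minimal sketch with one-feature regression stumps, assuming a toy sine-curve dataset chosen for the demo:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-threshold split on a 1-D feature minimizing squared error."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = ((r - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    return best[1:]  # (threshold, left value, right value)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Least-squares gradient boosting: each round fits a stump to the
    residuals (negative gradient) and adds a shrunken copy to the model."""
    f = np.full_like(y, y.mean(), dtype=float)
    for _ in range(n_rounds):
        r = y - f                              # negative gradient of 0.5*(y-f)^2
        t, lv, rv = fit_stump(x, r)
        f += lr * np.where(x <= t, lv, rv)     # shrinkage = learning rate
    return f

# Toy 1-D regression problem (assumed for illustration)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x)
f = gradient_boost(x, y)
mse = np.mean((f - y) ** 2)
```

The shrinkage factor `lr` is the same device the paper credits for the method's robustness: many small steps along the functional gradient rather than one full fit per round.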

17,764 citations

Journal ArticleDOI
19 Apr 2000-JAMA
TL;DR: A checklist contains specifications for reporting meta-analyses of observational studies in epidemiology (background, search strategy, methods, results, discussion, and conclusion); its use should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers.
Abstract:
Objective: Because of the pressure for timely, informed decisions in public health and clinical practice and the explosion of information in the scientific literature, research results must be synthesized. Meta-analyses are increasingly used to address this problem, and they often evaluate observational studies. A workshop was held in Atlanta, Ga, in April 1997, to examine the reporting of meta-analyses of observational studies and to make recommendations to aid authors, reviewers, editors, and readers.
Participants: Twenty-seven participants were selected by a steering committee, based on expertise in clinical practice, trials, statistics, epidemiology, social sciences, and biomedical editing. Deliberations of the workshop were open to other interested scientists. Funding for this activity was provided by the Centers for Disease Control and Prevention.
Evidence: We conducted a systematic review of the published literature on the conduct and reporting of meta-analyses in observational studies using MEDLINE, Educational Research Information Center (ERIC), PsycLIT, and the Current Index to Statistics. We also examined reference lists of the 32 studies retrieved and contacted experts in the field. Participants were assigned to small-group discussions on the subjects of bias, searching and abstracting, heterogeneity, study categorization, and statistical methods.
Consensus Process: From the material presented at the workshop, the authors developed a checklist summarizing recommendations for reporting meta-analyses of observational studies. The checklist and supporting evidence were circulated to all conference attendees and additional experts. All suggestions for revisions were addressed.
Conclusions: The proposed checklist contains specifications for reporting of meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion. Use of the checklist should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers. An evaluation plan is suggested and research areas are explored.

17,663 citations

Book
23 May 2011
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Abstract: Many problems of recent interest in statistics and machine learning can be posed in the framework of convex optimization. Due to the explosion in size and complexity of modern datasets, it is increasingly important to be able to solve problems with a very large number of features or training examples. As a result, both the decentralized collection or storage of these datasets as well as accompanying distributed solution methods are either necessary or at least highly desirable. In this review, we argue that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas. The method was developed in the 1970s, with roots in the 1950s, and is equivalent or closely related to many other algorithms, such as dual decomposition, the method of multipliers, Douglas–Rachford splitting, Spingarn's method of partial inverses, Dykstra's alternating projections, Bregman iterative algorithms for l1 problems, proximal methods, and others. After briefly surveying the theory and history of the algorithm, we discuss applications to a wide variety of statistical and machine learning problems of recent interest, including the lasso, sparse logistic regression, basis pursuit, covariance selection, support vector machines, and many others. We also discuss general distributed optimization, extensions to the nonconvex setting, and efficient implementation, including some details on distributed MPI and Hadoop MapReduce implementations.
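Among the applications the abstract lists, the lasso gives the most compact concrete instance of ADMM: split the objective as min 0.5||Ax - b||^2 + λ||z||_1 subject to x = z, then alternate a quadratic x-update, a soft-thresholding z-update, and a dual update. A minimal sketch, with problem sizes and the sparsity pattern assumed for the demo:

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise soft-thresholding: the proximal operator of k*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via ADMM with x/z splitting."""
    n = A.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)                            # scaled dual variable
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))   # cached: x-update is one solve
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))          # quadratic x-update
        z = soft_threshold(x + u, lam / rho)   # l1 z-update (exactly sparse)
        u += x - z                             # dual ascent on the constraint x = z
    return z

# Toy sparse-regression problem (sizes and support are assumptions)
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.1)
```

The x-update touches only the smooth quadratic term and the z-update only the ℓ1 term, which is exactly the decomposability that makes ADMM attractive for the distributed settings the review discusses: each subproblem can be solved locally while the dual variable coordinates consensus.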

17,433 citations


Authors

Showing all 127,468 results

Name | H-index | Papers | Citations
Eric S. Lander | 301 | 826 | 525,976
George M. Whitesides | 240 | 1,739 | 269,833
Yi Cui | 220 | 1,015 | 199,725
Yi Chen | 217 | 4,342 | 293,080
David Miller | 203 | 2,573 | 204,840
David Baltimore | 203 | 876 | 162,955
Edward Witten | 202 | 602 | 204,199
Irving L. Weissman | 201 | 1,141 | 172,504
Hongjie Dai | 197 | 570 | 182,579
Robert M. Califf | 196 | 1,561 | 167,961
Frank E. Speizer | 193 | 636 | 135,891
Thomas C. Südhof | 191 | 653 | 118,007
Gad Getz | 189 | 520 | 247,560
Mark Hallett | 186 | 1,170 | 123,741
John P. A. Ioannidis | 185 | 1,311 | 193,612
Network Information
Related Institutions (5)
Columbia University: 224K papers, 12.8M citations (97% related)
University of Washington: 305.5K papers, 17.7M citations (97% related)
University of Pennsylvania: 257.6K papers, 14.1M citations (96% related)
Harvard University: 530.3K papers, 38.1M citations (96% related)
University of Michigan: 342.3K papers, 17.6M citations (96% related)

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 504
2022 | 2,786
2021 | 17,867
2020 | 18,236
2019 | 16,190
2018 | 14,684