Institution
Stanford University
Education • Stanford, California, United States
About: Stanford University is an education organization based in Stanford, California, United States. It is known for its research contributions in the topics of Population and Transplantation. The organization has 125,751 authors who have published 320,347 publications receiving 21,892,059 citations. The organization is also known as Leland Stanford Junior University and University of Stanford.
Topics: Population, Transplantation, Medicine, Cancer, Gene
Papers published on a yearly basis
Papers
University of Genoa, University of Manchester, KEK, CERN, Imperial College London, Stanford University, Tata Institute of Fundamental Research, Istituto Nazionale di Fisica Nucleare, University of Pittsburgh, Lyon College, TRIUMF, Northeastern University, Thomas Jefferson National Accelerator Facility, University of Córdoba (Spain), Goethe University Frankfurt, University of Southampton, University of Udine, University of Alberta, Tokyo Metropolitan University, Helsinki Institute of Physics, National Research Nuclear University MEPhI, University of Bath, Niigata University, Naruto University of Education, Kobe University, University of Calabria, University of Trieste, European Space Agency, University of Birmingham, Ritsumeikan University, Qinetiq, École Polytechnique Fédérale de Lausanne, Massachusetts Institute of Technology, Brookhaven National Laboratory
01 Jul 2003 - Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
TL;DR: Geant4, as discussed by the authors, is a toolkit for simulating the passage of particles through matter, providing a complete range of functionality including tracking, geometry, physics models and hits.
Abstract: Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.
18,904 citations
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Abstract: Suppose x is an unknown vector in R^m (a digital image or signal); we plan to measure n general linear functionals of x and then reconstruct. If x is known to be compressible by transform coding with a known transform, and we reconstruct via the nonlinear procedure defined here, the number of measurements n can be dramatically smaller than the size m. Thus, certain natural classes of images with m pixels need only n = O(m^{1/4} log^{5/2}(m)) nonadaptive nonpixel samples for faithful recovery, as opposed to the usual m pixel samples. More specifically, suppose x has a sparse representation in some orthonormal basis (e.g., wavelet, Fourier) or tight frame (e.g., curvelet, Gabor), so the coefficients belong to an l_p ball for 0 < p <= 1.
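The recovery idea above can be illustrated with a small sketch. The paper poses recovery as the Basis Pursuit linear program; as a stand-in, this sketch uses iterative soft-thresholding (ISTA) on the closely related lasso objective, which needs no LP solver. All dimensions, the sparse signal, and the random measurement matrix below are hypothetical toy choices, not from the paper.

```python
import math, random

random.seed(0)

m, n, lam, step, iters = 8, 5, 0.01, 0.02, 3000

# Hypothetical toy setup: a 1-sparse signal x_true in R^m,
# observed through n random linear functionals (rows of A).
x_true = [0.0] * m
x_true[2] = 1.0
A = [[random.gauss(0.0, 1.0) / math.sqrt(n) for _ in range(m)] for _ in range(n)]
b = [sum(A[i][j] * x_true[j] for j in range(m)) for i in range(n)]

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return math.copysign(max(abs(v) - t, 0.0), v)

# ISTA for the lasso  min_x 0.5*||A x - b||^2 + lam*||x||_1,
# a convex l1 surrogate in the spirit of Basis Pursuit:
# gradient step on the quadratic term, then soft-threshold.
x = [0.0] * m
for _ in range(iters):
    r = [sum(A[i][j] * x[j] for j in range(m)) - b[i] for i in range(n)]  # residual A x - b
    g = [sum(A[i][j] * r[i] for i in range(n)) for j in range(m)]         # gradient A^T r
    x = [soft(x[j] - step * g[j], step * lam) for j in range(m)]

obj = 0.5 * sum((sum(A[i][j] * x[j] for j in range(m)) - b[i]) ** 2 for i in range(n)) \
      + lam * sum(abs(v) for v in x)
print(obj)  # lasso objective after iterating; strictly below its value at x = 0
```

With far fewer measurements than coordinates (n = 5, m = 8), the l1 penalty drives mass onto a few coordinates, which is the mechanism the abstract describes.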
18,609 citations
TL;DR: A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
Abstract: Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient descent "boosting" paradigm is developed for additive expansions based on any fitting criterion. Specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification. Special enhancements are derived for the particular case where the individual additive components are regression trees, and tools for interpreting such "TreeBoost" models are presented. Gradient boosting of regression trees produces competitive, highly robust, interpretable procedures for both regression and classification, especially appropriate for mining less than clean data. Connections between this approach and the boosting methods of Freund and Schapire and of Friedman, Hastie and Tibshirani are discussed.
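The least-squares case of the paradigm above is simple enough to sketch directly: for squared loss the negative gradient is just the residual, so each round fits a weak learner to the residuals and adds it with shrinkage. A minimal sketch with depth-1 trees (stumps) follows; the 1-D dataset, shrinkage value, and round count are illustrative choices, not the paper's.

```python
import random

random.seed(1)

# Hypothetical 1-D regression data: y is a noisy step function of x.
X = [i / 50.0 for i in range(100)]
y = [(1.0 if xi > 1.0 else 0.0) + random.gauss(0.0, 0.1) for xi in X]

def fit_stump(X, resid):
    """Least-squares regression stump: pick the split threshold that
    minimizes squared error on the current residuals."""
    best = None
    for t in X:
        left = [r for xi, r in zip(X, resid) if xi <= t]
        right = [r for xi, r in zip(X, resid) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - ml) ** 2 for r in left) + sum((r - mr) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi: ml if xi <= t else mr

# Gradient boosting for squared loss: the negative gradient of
# 0.5*(y - F)^2 is the residual y - F, so each stump is fit to the
# residuals and added with a shrinkage factor nu.
nu, rounds = 0.3, 50
F = [sum(y) / len(y)] * len(X)          # initialize at the mean response
for _ in range(rounds):
    resid = [yi - fi for yi, fi in zip(y, F)]
    h = fit_stump(X, resid)
    F = [fi + nu * h(xi) for xi, fi in zip(X, F)]

mse0 = sum((yi - sum(y) / len(y)) ** 2 for yi in y) / len(y)
mse = sum((yi - fi) ** 2 for yi, fi in zip(y, F)) / len(y)
print(mse0, mse)  # training MSE drops from the baseline as rounds accumulate
```

Because each stump is a least-squares fit to the residuals, adding it with shrinkage nu in (0, 1] can only decrease the training loss, which is the steepest-descent-in-function-space view the abstract describes.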
17,764 citations
TL;DR: A checklist containing specifications for reporting meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion, should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers.
Abstract: Objective: Because of the pressure for timely, informed decisions in public health and clinical practice and the explosion of information in the scientific literature, research results must be synthesized. Meta-analyses are increasingly used to address this problem, and they often evaluate observational studies. A workshop was held in Atlanta, Ga, in April 1997, to examine the reporting of meta-analyses of observational studies and to make recommendations to aid authors, reviewers, editors, and readers.
Participants: Twenty-seven participants were selected by a steering committee, based on expertise in clinical practice, trials, statistics, epidemiology, social sciences, and biomedical editing. Deliberations of the workshop were open to other interested scientists. Funding for this activity was provided by the Centers for Disease Control and Prevention.
Evidence: We conducted a systematic review of the published literature on the conduct and reporting of meta-analyses in observational studies using MEDLINE, Educational Research Information Center (ERIC), PsycLIT, and the Current Index to Statistics. We also examined reference lists of the 32 studies retrieved and contacted experts in the field. Participants were assigned to small-group discussions on the subjects of bias, searching and abstracting, heterogeneity, study categorization, and statistical methods.
Consensus Process: From the material presented at the workshop, the authors developed a checklist summarizing recommendations for reporting meta-analyses of observational studies. The checklist and supporting evidence were circulated to all conference attendees and additional experts. All suggestions for revisions were addressed.
Conclusions: The proposed checklist contains specifications for reporting of meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion. Use of the checklist should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers. An evaluation plan is suggested and research areas are explored.
17,663 citations
23 May 2011
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Abstract: Many problems of recent interest in statistics and machine learning can be posed in the framework of convex optimization. Due to the explosion in size and complexity of modern datasets, it is increasingly important to be able to solve problems with a very large number of features or training examples. As a result, both the decentralized collection or storage of these datasets as well as accompanying distributed solution methods are either necessary or at least highly desirable. In this review, we argue that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas. The method was developed in the 1970s, with roots in the 1950s, and is equivalent or closely related to many other algorithms, such as dual decomposition, the method of multipliers, Douglas–Rachford splitting, Spingarn's method of partial inverses, Dykstra's alternating projections, Bregman iterative algorithms for l1 problems, proximal methods, and others. After briefly surveying the theory and history of the algorithm, we discuss applications to a wide variety of statistical and machine learning problems of recent interest, including the lasso, sparse logistic regression, basis pursuit, covariance selection, support vector machines, and many others. We also discuss general distributed optimization, extensions to the nonconvex setting, and efficient implementation, including some details on distributed MPI and Hadoop MapReduce implementations.
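The ADMM update pattern surveyed above (an x-minimization, a z-minimization, and a dual ascent step on the constraint residual) can be sketched on the smallest lasso instance, a single scalar, where every update has a closed form and the exact answer is known (soft-thresholding). The function name, parameter values, and penalty rho below are illustrative choices, not from the review.

```python
def soft(v, t):
    """Soft-thresholding, the proximal operator of t*|.|."""
    return max(abs(v) - t, 0.0) * (1 if v >= 0 else -1)

def admm_lasso_scalar(a, lam, rho=1.0, iters=100):
    """ADMM for the scalar lasso  min_x 0.5*(x - a)**2 + lam*|x|,
    split as f(x) + g(z) with the consensus constraint x = z.

    x-update: minimize 0.5*(x - a)**2 + (rho/2)*(x - z + u)**2  (closed form)
    z-update: proximal step on lam*|z|, i.e. soft-thresholding
    u-update: scaled dual ascent on the residual x - z
    """
    x = z = u = 0.0
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # x-update
        z = soft(x + u, lam / rho)              # z-update
        u += x - z                              # dual update
    return z

# The scalar lasso has the closed-form solution soft(a, lam),
# so the ADMM iterates should converge to it.
print(admm_lasso_scalar(3.0, 1.0))   # converges toward soft(3.0, 1.0) = 2.0
print(admm_lasso_scalar(-0.5, 1.0))  # converges toward 0.0
```

The same three-step skeleton scales to the vector lasso and the other problems listed in the abstract; only the x- and z-subproblem solvers change, which is what makes the method attractive for distributed settings.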
17,433 citations
Authors
Showing all 127468 results
Name | H-index | Papers | Citations |
---|---|---|---|
Eric S. Lander | 301 | 826 | 525976 |
George M. Whitesides | 240 | 1739 | 269833 |
Yi Cui | 220 | 1015 | 199725 |
Yi Chen | 217 | 4342 | 293080 |
David Miller | 203 | 2573 | 204840 |
David Baltimore | 203 | 876 | 162955 |
Edward Witten | 202 | 602 | 204199 |
Irving L. Weissman | 201 | 1141 | 172504 |
Hongjie Dai | 197 | 570 | 182579 |
Robert M. Califf | 196 | 1561 | 167961 |
Frank E. Speizer | 193 | 636 | 135891 |
Thomas C. Südhof | 191 | 653 | 118007 |
Gad Getz | 189 | 520 | 247560 |
Mark Hallett | 186 | 1170 | 123741 |
John P. A. Ioannidis | 185 | 1311 | 193612 |