Institution

RAND Corporation

Nonprofit · Santa Monica, California, United States
About: RAND Corporation is a nonprofit organization based in Santa Monica, California, United States. It is known for its research contributions in the topics of Population and Health care. The organization has 9,602 authors who have published 18,570 publications receiving 744,658 citations.


Papers
Journal Article
19 Apr 2000 - JAMA
TL;DR: The checklist contains specifications for reporting of meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion; its use should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers.
Abstract: Objective: Because of the pressure for timely, informed decisions in public health and clinical practice and the explosion of information in the scientific literature, research results must be synthesized. Meta-analyses are increasingly used to address this problem, and they often evaluate observational studies. A workshop was held in Atlanta, Ga, in April 1997, to examine the reporting of meta-analyses of observational studies and to make recommendations to aid authors, reviewers, editors, and readers. Participants: Twenty-seven participants were selected by a steering committee, based on expertise in clinical practice, trials, statistics, epidemiology, social sciences, and biomedical editing. Deliberations of the workshop were open to other interested scientists. Funding for this activity was provided by the Centers for Disease Control and Prevention. Evidence: We conducted a systematic review of the published literature on the conduct and reporting of meta-analyses in observational studies using MEDLINE, Educational Research Information Center (ERIC), PsycLIT, and the Current Index to Statistics. We also examined reference lists of the 32 studies retrieved and contacted experts in the field. Participants were assigned to small-group discussions on the subjects of bias, searching and abstracting, heterogeneity, study categorization, and statistical methods. Consensus Process: From the material presented at the workshop, the authors developed a checklist summarizing recommendations for reporting meta-analyses of observational studies. The checklist and supporting evidence were circulated to all conference attendees and additional experts. All suggestions for revisions were addressed. Conclusions: The proposed checklist contains specifications for reporting of meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion. Use of the checklist should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers. An evaluation plan is suggested and research areas are explored.

17,663 citations

Journal Article
TL;DR: A PRISMA extension for scoping reviews was needed to provide reporting guidance for this specific type of knowledge synthesis and was developed according to published guidance by the EQUATOR (Enhancing the QUAlity and Transparency of health Research) Network for the development of reporting guidelines.
Abstract: Scoping reviews, a type of knowledge synthesis, follow a systematic approach to map evidence on a topic and identify main concepts, theories, sources, and knowledge gaps. Although more scoping reviews are being done, their methodological and reporting quality need improvement. This document presents the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) checklist and explanation. The checklist was developed by a 24-member expert panel and 2 research leads following published guidance from the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network. The final checklist contains 20 essential reporting items and 2 optional items. The authors provide a rationale and an example of good reporting for each item. The intent of the PRISMA-ScR is to help readers (including researchers, publishers, commissioners, policymakers, health care providers, guideline developers, and patients or consumers) develop a greater understanding of relevant terminology, core concepts, and key items to report for scoping reviews.

11,709 citations

Journal Article
TL;DR: The paper describes a publicly available LARS algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
Abstract: The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
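
To make the path-tracing idea concrete, the following is a minimal sketch (not the authors' original implementation) using scikit-learn's lars_path with method="lasso", which applies the LARS modification described above to trace the full Lasso coefficient path; the synthetic data, its dimensions, and the noise level are illustrative assumptions only.

```python
# Illustrative sketch of the LARS/Lasso path on synthetic data (assumed setup).
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n, p = 100, 10                      # observations, candidate covariates (assumed sizes)
X = rng.standard_normal((n, p))
true_coef = np.zeros(p)
true_coef[:3] = [3.0, -2.0, 1.5]    # only a few covariates truly matter
y = X @ true_coef + 0.5 * rng.standard_normal(n)

# method="lasso" traces all Lasso solutions along the regularization path,
# at roughly the cost of a single ordinary least squares fit.
alphas, active, coefs = lars_path(X, y, method="lasso")

print("order in which covariates enter the model:", active)
print("coefficients at the least-regularized end:", np.round(coefs[:, -1], 2))
```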

7,828 citations

Journal Article
Daniel Ellsberg
TL;DR: Building on Knight's distinction between measurable risk and unmeasurable uncertainty, the paper asks what it would mean operationally to say that people behave "as though" they assigned numerical probabilities, or "degrees of belief," to the events impinging on their actions.
Abstract: Are there uncertainties that are not risks? There has always been a good deal of skepticism about the behavioral significance of Frank Knight's distinction between “measurable uncertainty” or “risk”, which may be represented by numerical probabilities, and “unmeasurable uncertainty” which cannot. Knight maintained that the latter “uncertainty” prevailed – and hence that numerical probabilities were inapplicable – in situations when the decision-maker was ignorant of the statistical frequencies of events relevant to his decision; or when a priori calculations were impossible; or when the relevant events were in some sense unique; or when an important, once-and-for-all decision was concerned. Yet the feeling has persisted that, even in these situations, people tend to behave “as though” they assigned numerical probabilities, or “degrees of belief,” to the events impinging on their actions. However, it is hard either to confirm or to deny such a proposition in the absence of precisely-defined procedures for measuring these alleged “degrees of belief.” What might it mean operationally, in terms of refutable predictions about observable phenomena, to say that someone behaves “as if” he assigned quantitative likelihoods to events: or to say that he does not? An intuitive answer may emerge if we consider an example proposed by Shackle, who takes an extreme form of the Knightian position that statistical information on frequencies within a large, repetitive class of events is strictly irrelevant to a decision whose outcome depends on a single trial.
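
As a numerical illustration of this point, the sketch below (an illustrative reconstruction, not code from the paper) works through the three-color-urn choice problem associated with this paper: 30 red balls and 60 black-or-yellow balls in unknown proportion, with bets paying on red, black, red-or-yellow, and black-or-yellow. It checks that no single subjective probability for drawing black makes the commonly observed preferences consistent with expected-value maximization.

```python
# Three-color-urn illustration (assumed setup): 30 red balls, 60 black-or-yellow
# balls in unknown proportion. Bet A pays on red, B on black, C on red-or-yellow,
# D on black-or-yellow. People commonly prefer A to B and D to C.
import numpy as np

p_red = 1 / 3
for p_black in np.linspace(0.0, 2 / 3, 201):   # candidate "degrees of belief" in black
    p_yellow = 2 / 3 - p_black
    ev = {
        "A": p_red,                 # wins on red
        "B": p_black,               # wins on black
        "C": p_red + p_yellow,      # wins on red or yellow
        "D": p_black + p_yellow,    # wins on black or yellow (always 2/3)
    }
    if ev["A"] > ev["B"] and ev["D"] > ev["C"]:
        print(f"p(black) = {p_black:.3f} rationalizes both choices")
        break
else:
    print("No value of p(black) rationalizes both A over B and D over C")
```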

7,005 citations

Book
01 Jan 1988
TL;DR: This book surveys procedural justice research, covering procedural justice in law, politics, and organizations, and closes with two models of procedural justice.
Abstract: 1. Introduction.- 2. Early Research in Procedural Justice.- 3. Research Methods in Procedural Justice Research.- 4. Procedural Justice in Law I: Legal Attitudes and Behavior.- 5. Procedural Justice in Law II: Sources and Implications of Procedural Justice Judgments.- 6. The Generality of Procedural Justice.- 7. Procedural Justice in the Political Arena.- 8. Procedural Justice in Organizations.- 9. Conclusions and Hypotheses.- 10. Two Models of Procedural Justice.- References.- Author Index.

5,785 citations


Authors

Name | H-index | Papers | Citations
Kathleen N. Lohr | 96 | 398 | 45458
Janet Currie | 96 | 420 | 36340
Carol M. Mangione | 96 | 398 | 35616
Richard R. Nelson | 95 | 313 | 101744
James C. Smith | 93 | 436 | 37251
Samuel Karlin | 89 | 396 | 41432
Joanne Lynn | 89 | 288 | 31189
David N. Kennedy | 88 | 396 | 48377
William H. Rogers | 87 | 249 | 37259
Margaret G. Kivelson | 87 | 437 | 25806
Mark S. Litwin | 86 | 464 | 29048
Moshe Ben-Akiva | 84 | 456 | 31805
Marc N. Elliott | 83 | 509 | 26203
David E. Bloom | 83 | 575 | 33536
Neil S. Wenger | 83 | 362 | 25048
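
For reference, the H-index column above can be read as the largest h such that the author has at least h papers cited at least h times each. The short sketch below computes it from a list of per-paper citation counts; the counts shown are made up purely for illustration.

```python
# Minimal h-index computation from per-paper citation counts (illustrative values).
def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:      # the i-th most-cited paper still has at least i citations
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(h_index([25, 8, 5, 3, 3]))   # -> 3
```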
Network Information
Related Institutions (5)
Columbia University: 224K papers, 12.8M citations (88% related)
Johns Hopkins University: 249.2K papers, 14M citations (88% related)
University of Michigan: 342.3K papers, 17.6M citations (88% related)
University of Washington: 305.5K papers, 17.7M citations (88% related)
Stanford University: 320.3K papers, 21.8M citations (86% related)

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 11
2022 | 77
2021 | 640
2020 | 574
2019 | 548
2018 | 491