Author

Ian Roberts

Bio: Ian Roberts is an academic researcher from the University of London. The author has contributed to research in topics including poison control and tranexamic acid. The author has an h-index of 112 and has co-authored 714 publications receiving 51,933 citations. Previous affiliations of Ian Roberts include John Radcliffe Hospital and Manchester Academic Health Science Centre.


Papers
Journal ArticleDOI
18 May 2002-BMJ
TL;DR: Health researchers using postal questionnaires can improve the quality of their research by using the strategies shown to be effective in this systematic review, which includes more randomised controlled trials than any previously published review or meta-analysis.
Abstract: Objective: To identify methods to increase response to postal questionnaires. Design: Systematic review of randomised controlled trials of any method to influence response to postal questionnaires. Studies reviewed: 292 randomised controlled trials including 258 315 participants. Interventions reviewed: 75 strategies for influencing response to postal questionnaires. Main outcome measure: The proportion of completed or partially completed questionnaires returned. Results: The odds of response were more than doubled when a monetary incentive was used (odds ratio 2.02; 95% confidence interval 1.79 to 2.27) and almost doubled when incentives were not conditional on response (1.71; 1.29 to 2.26). Response was more likely when short questionnaires were used (1.86; 1.55 to 2.24). Personalised questionnaires and letters increased response (1.16; 1.06 to 1.28), as did the use of coloured ink (1.39; 1.16 to 1.67). The odds of response were more than doubled when the questionnaires were sent by recorded delivery (2.21; 1.51 to 3.25) and increased when stamped return envelopes were used (1.26; 1.13 to 1.41) and questionnaires were sent by first class post (1.12; 1.02 to 1.23). Contacting participants before sending questionnaires increased response (1.54; 1.24 to 1.92), as did follow-up contact (1.44; 1.22 to 1.70) and providing non-respondents with a second copy of the questionnaire (1.41; 1.02 to 1.94). Questionnaires designed to be of more interest to participants were more likely to be returned (2.44; 1.99 to 3.01), but questionnaires containing questions of a sensitive nature were less likely to be returned (0.92; 0.87 to 0.98). Questionnaires originating from universities were more likely to be returned than were questionnaires from other sources, such as commercial organisations (1.31; 1.11 to 1.54). Conclusions: Health researchers using postal questionnaires can improve the quality of their research by using the strategies shown to be effective in this systematic review. What is already known on this topic: Postal questionnaires are widely used in the collection of data in epidemiological studies and health research. Non-response to postal questionnaires reduces the effective sample size and can introduce bias. What this study adds: This systematic review includes more randomised controlled trials than any previously published review or meta-analysis. The review has identified effective ways to increase response to postal questionnaires. The review will be updated regularly in the Cochrane Library.
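The odds ratios and confidence intervals quoted above follow the standard calculation for a two-arm trial: the log odds ratio and its standard error are computed from the 2x2 table of responders and non-responders, and the 95% CI is exp(ln(OR) ± 1.96 x SE). The short sketch below shows that arithmetic on a hypothetical table; the counts are invented for illustration and are not taken from the review, whose published estimates pool many such trial-level results.

# Illustrative sketch: odds ratio and 95% CI for a single trial comparing a
# response-boosting strategy (e.g. a monetary incentive) with control.
# The counts are hypothetical and are NOT data from the review.
import math

a, b = 120, 80    # intervention arm: questionnaires returned, not returned
c, d = 90, 110    # control arm: questionnaires returned, not returned

odds_ratio = (a * d) / (b * c)                     # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of ln(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f} to {hi:.2f}")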

1,955 citations

Journal ArticleDOI
TL;DR: This systematic review of randomised controlled trials of methods to increase response to postal or electronic questionnaires found substantial heterogeneity among trial results for half of the strategies; using the strategies shown to be effective could improve the quality of health research.
Abstract: BACKGROUND: Postal and electronic questionnaires are widely used for data collection in epidemiological studies but non-response reduces the effective sample size and can introduce bias. Finding ways to increase response to postal and electronic questionnaires would improve the quality of health research. OBJECTIVES: To identify effective strategies to increase response to postal and electronic questionnaires. SEARCH STRATEGY: We searched 14 electronic databases to February 2008 and manually searched the reference lists of relevant trials and reviews, and all issues of two journals. We contacted the authors of all trials or reviews to ask about unpublished trials. Where necessary, we also contacted authors to confirm methods of allocation used and to clarify results presented. We assessed the eligibility of each trial using pre-defined criteria. SELECTION CRITERIA: Randomised controlled trials of methods to increase response to postal or electronic questionnaires. DATA COLLECTION AND ANALYSIS: We extracted data on the trial participants, the intervention, the number randomised to intervention and comparison groups and allocation concealment. For each strategy, we estimated pooled odds ratios (OR) and 95% confidence intervals (CI) in a random-effects model. We assessed evidence for selection bias using Egger's weighted regression method, Begg's rank correlation test and funnel plots. We assessed heterogeneity among trial odds ratios using a Chi² test and quantified the degree of inconsistency between trial results using the I² statistic. MAIN RESULTS: Postal: We found 481 eligible trials. The trials evaluated 110 different ways of increasing response to postal questionnaires. We found substantial heterogeneity among trial results in half of the strategies. The odds of response were at least doubled using monetary incentives (odds ratio 1.87; 95% CI 1.73 to 2.04; heterogeneity P < 0.00001, I² = 84%), recorded delivery (1.76; 95% CI 1.43 to 2.18; P = 0.0001, I² = 71%), a teaser on the envelope, e.g. a comment suggesting to participants that they may benefit if they open it (3.08; 95% CI 1.27 to 7.44), and a more interesting questionnaire topic (2.00; 95% CI 1.32 to 3.04; P = 0.06, I² = 80%). The odds of response were substantially higher with pre-notification (1.45; 95% CI 1.29 to 1.63; P < 0.00001, I² = 89%), follow-up contact (1.35; 95% CI 1.18 to 1.55; P < 0.00001, I² = 76%), unconditional incentives (1.61; 95% CI 1.36 to 1.89; P < 0.00001, I² = 88%), shorter questionnaires (1.64; 95% CI 1.43 to 1.87; P < 0.00001, I² = 91%), providing a second copy of the questionnaire at follow-up (1.46; 95% CI 1.13 to 1.90; P < 0.00001, I² = 82%), mentioning an obligation to respond (1.61; 95% CI 1.16 to 2.22; P = 0.98, I² = 0%) and university sponsorship (1.32; 95% CI 1.13 to 1.54; P < 0.00001, I² = 83%). The odds of response were also increased with non-monetary incentives (1.15; 95% CI 1.08 to 1.22; P < 0.00001, I² = 79%), personalised questionnaires (1.14; 95% CI 1.07 to 1.22; P < 0.00001, I² = 63%), use of hand-written addresses (1.25; 95% CI 1.08 to 1.45; P = 0.32, I² = 14%), use of stamped return envelopes as opposed to franked return envelopes (1.24; 95% CI 1.14 to 1.35; P < 0.00001, I² = 69%), an assurance of confidentiality (1.33; 95% CI 1.24 to 1.42) and first class outward mailing (1.11; 95% CI 1.02 to 1.21; P = 0.78, I² = 0%).
The odds of response were reduced when the questionnaire included questions of a sensitive nature (0.94; 95% CI 0.88 to 1.00; P = 0.51, I² = 0%). Electronic: We found 32 eligible trials. The trials evaluated 27 different ways of increasing response to electronic questionnaires. We found substantial heterogeneity among trial results in half of the strategies. The odds of response were increased by more than a half using non-monetary incentives (1.72; 95% CI 1.09 to 2.72; heterogeneity P < 0.00001, I² = 95%), shorter e-questionnaires (1.73; 95% CI 1.40 to 2.13; P = 0.08, I² = 68%), including a statement that others had responded (1.52; 95% CI 1.36 to 1.70), and a more interesting topic (1.85; 95% CI 1.52 to 2.26). The odds of response increased by a third using a lottery with immediate notification of results (1.37; 95% CI 1.13 to 1.65), an offer of survey results (1.36; 95% CI 1.15 to 1.61), and using a white background (1.31; 95% CI 1.10 to 1.56). The odds of response were also increased with personalised e-questionnaires (1.24; 95% CI 1.17 to 1.32; P = 0.07, I² = 41%), using a simple header (1.23; 95% CI 1.03 to 1.48), using textual representation of response categories (1.19; 95% CI 1.05 to 1.36), and giving a deadline (1.18; 95% CI 1.03 to 1.34). The odds of response tripled when a picture was included in an e-mail (3.05; 95% CI 1.84 to 5.06; P = 0.27, I² = 19%). The odds of response were reduced when "Survey" was mentioned in the e-mail subject line (0.81; 95% CI 0.67 to 0.97; P = 0.33, I² = 0%), and when the e-mail included a male signature (0.55; 95% CI 0.38 to 0.80; P = 0.96, I² = 0%). AUTHORS' CONCLUSIONS: Health researchers using postal and electronic questionnaires can increase response using the strategies shown to be effective in this systematic review.
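As a rough illustration of the analysis described above (pooled odds ratios from a random-effects model, with heterogeneity summarised by Cochran's Q and the I² statistic), the sketch below applies DerSimonian-Laird pooling to a handful of hypothetical trial results. The trial ORs and CIs are invented, and DerSimonian-Laird is simply one standard random-effects estimator consistent with the methods named in the abstract; this is not a reproduction of the review's own analysis or software.

# Minimal sketch, with made-up inputs: DerSimonian-Laird random-effects
# pooling of per-trial log odds ratios, plus Cochran's Q and I^2.
import numpy as np

# hypothetical per-trial odds ratios and 95% CIs for one strategy
ors = np.array([1.6, 2.1, 1.3, 2.4])
ci_lo = np.array([1.1, 1.4, 0.9, 1.5])
ci_hi = np.array([2.3, 3.2, 1.9, 3.8])

y = np.log(ors)                                     # per-trial log ORs
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from each CI
w_fixed = 1 / se**2                                 # inverse-variance weights

# Cochran's Q and I^2 (degree of inconsistency between trial results)
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
Q = np.sum(w_fixed * (y - y_fixed) ** 2)
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# DerSimonian-Laird between-trial variance, then random-effects pooled OR
C = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)
w_re = 1 / (se**2 + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"pooled OR = {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96*se_re):.2f} to {np.exp(y_re + 1.96*se_re):.2f}), "
      f"I^2 = {I2:.0f}%")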

1,312 citations

Journal ArticleDOI
TL;DR: The large-scale evidence from randomised trials indicates that it is unlikely that large absolute excesses in other serious adverse events still await discovery, and any further findings that emerge about the effects of statin therapy would not be expected to alter materially the balance of benefits and harms.

1,245 citations

Journal ArticleDOI
TL;DR: There is no evidence from randomised controlled trials that resuscitation with colloids reduces the risk of death, compared to resuscitation with crystalloids, in patients with trauma, burns or following surgery.
Abstract: This review is published as a Cochrane Review in the Cochrane Database of Systematic Reviews 2004, Issue 4. Cochrane Reviews are regularly updated as new evidence emerges and in response to comments and criticisms, and the Cochrane Database of Systematic Reviews should be consulted for the most recent version of the review. Roberts I, Alderson P, Bunn F, Chinnock P, Ker K, Schierhout G. Colloids versus crystalloids for fluid resuscitation in critically ill patients. Cochrane Database of Systematic Reviews 2004, Issue 4. Art. No.: CD000567. DOI: 10.1002/14651858.CD000567.pub2.

1,046 citations

Journal ArticleDOI
TL;DR: The 8th Banff Conference on Allograft Pathology was held in Edmonton, Canada, 15–21 July 2005, and major outcomes included the elimination of the non‐specific term ‘chronic allograft nephropathy’ (CAN) and the recognition of the entity of chronic antibody‐mediated rejection.

1,036 citations


Cited by
Journal ArticleDOI
TL;DR: An explanation and elaboration of the PRISMA Statement is presented, providing updated guidance for the reporting of systematic reviews and meta-analyses.
Abstract: Systematic reviews and meta-analyses are essential to summarize evidence relating to efficacy and safety of health care interventions accurately and reliably. The clarity and transparency of these reports, however, is not optimal. Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users. Since the development of the QUOROM (QUality Of Reporting Of Meta-analysis) Statement—a reporting guideline published in 1999—there have been several conceptual, methodological, and practical advances regarding the conduct and reporting of systematic reviews and meta-analyses. Also, reviews of published systematic reviews have found that key information about these studies is often poorly reported. Realizing these issues, an international group that included experienced authors and methodologists developed PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) as an evolution of the original QUOROM guideline for systematic reviews and meta-analyses of evaluations of health care interventions. The PRISMA Statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review. In this Explanation and Elaboration document, we explain the meaning and rationale for each checklist item. For each item, we include an example of good reporting and, where possible, references to relevant empirical studies and methodological literature. The PRISMA Statement, this document, and the associated Web site (http://www.prisma-statement.org/) should be helpful resources to improve reporting of systematic reviews and meta-analyses.

25,711 citations

Journal ArticleDOI
21 Jul 2009-BMJ
TL;DR: The meaning and rationale for each checklist item are explained, with an example of good reporting and, where possible, references to relevant empirical studies and methodological literature.
Abstract: Systematic reviews and meta-analyses are essential to summarise evidence relating to efficacy and safety of healthcare interventions accurately and reliably. The clarity and transparency of these reports, however, are not optimal. Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users. Since the development of the QUOROM (quality of reporting of meta-analysis) statement—a reporting guideline published in 1999—there have been several conceptual, methodological, and practical advances regarding the conduct and reporting of systematic reviews and meta-analyses. Also, reviews of published systematic reviews have found that key information about these studies is often poorly reported. Realising these issues, an international group that included experienced authors and methodologists developed PRISMA (preferred reporting items for systematic reviews and meta-analyses) as an evolution of the original QUOROM guideline for systematic reviews and meta-analyses of evaluations of health care interventions. The PRISMA statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review. In this explanation and elaboration document, we explain the meaning and rationale for each checklist item. For each item, we include an example of good reporting and, where possible, references to relevant empirical studies and methodological literature. The PRISMA statement, this document, and the associated website (www.prisma-statement.org/) should be helpful resources to improve reporting of systematic reviews and meta-analyses.

13,813 citations

Journal ArticleDOI
TL;DR: This citation record refers to Basics of Qualitative Research: Grounded Theory Procedures and Techniques.

13,415 citations

Journal ArticleDOI
02 Jan 2015-BMJ
TL;DR: The PRISMA-P checklist provides 17 items considered to be essential and minimum components of a systematic review or meta-analysis protocol; for each item, the paper also provides a model example from an existing published protocol.
Abstract: Protocols of systematic reviews and meta-analyses allow for planning and documentation of review methods, act as a guard against arbitrary decision making during review conduct, enable readers to assess for the presence of selective reporting against completed reviews, and, when made publicly available, reduce duplication of efforts and potentially prompt collaboration. Evidence documenting the existence of selective reporting and excessive duplication of reviews on the same or similar topics is accumulating and many calls have been made in support of the documentation and public availability of review protocols. Several efforts have emerged in recent years to rectify these problems, including development of an international register for prospective reviews (PROSPERO) and launch of the first open access journal dedicated to the exclusive publication of systematic review products, including protocols (BioMed Central's Systematic Reviews). Furthering these efforts and building on the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) guidelines, an international group of experts has created a guideline to improve the transparency, accuracy, completeness, and frequency of documented systematic review and meta-analysis protocols: PRISMA-P (for protocols) 2015. The PRISMA-P checklist contains 17 items considered to be essential and minimum components of a systematic review or meta-analysis protocol. This PRISMA-P 2015 Explanation and Elaboration paper provides readers with a full understanding of and evidence about the necessity of each item as well as a model example from an existing published protocol. This paper should be read together with the PRISMA-P 2015 statement. Systematic review authors and assessors are strongly encouraged to make use of PRISMA-P when drafting and appraising review protocols.

9,361 citations