scispace - formally typeset
Author

Daniel Wight

Bio: Daniel Wight is an academic researcher from the University of Glasgow. He has contributed to research on topics including population and reproductive health, has an h-index of 48, and has co-authored 160 publications receiving 9,510 citations. Previous affiliations of Daniel Wight include the University of Edinburgh and the University of Sussex.


Papers
Journal ArticleDOI
19 Mar 2015-BMJ
TL;DR: New MRC guidance provides a framework for conducting and reporting process evaluation studies that will help improve the quality of decision-making in the design and testing of complex interventions.
Abstract: Process evaluation is an essential part of designing and testing complex interventions. New MRC guidance provides a framework for conducting and reporting process evaluation studies

3,662 citations

Journal ArticleDOI
TL;DR: Increasing modern contraceptive method use requires community-wide, multifaceted interventions and the combined provision of information, life skills, support and access to youth-friendly services.
Abstract: Improving the reproductive health of young women in developing countries requires access to safe and effective methods of fertility control, but most rely on traditional rather than modern contraceptives such as condoms or oral/injectable hormonal methods. We conducted a systematic review of qualitative research to examine the limits to modern contraceptive use identified by young women in developing countries. Focusing on qualitative research allows the assessment of complex processes often missed in quantitative analyses. Literature searches of 23 databases, including Medline, Embase and POPLINE®, were conducted. Literature from 1970–2006 concerning the 11–24 years age group was included. Studies were critically appraised and meta-ethnography was used to synthesise the data. Of the 12 studies which met the inclusion criteria, seven met the quality criteria and are included in the synthesis (six from sub-Saharan Africa; one from South-East Asia). Sample sizes ranged from 16 to 149 young women (age range 13–19 years). Four of the studies were urban based, one was rural, one semi-rural, and one mixed (predominantly rural). Use of hormonal methods was limited by lack of knowledge, obstacles to access and concern over side effects, especially fear of infertility. Although often more accessible, and sometimes more attractive than hormonal methods, condom use was limited by association with disease and promiscuity, together with greater male control. As a result young women often relied on traditional methods or abortion. Although the review was limited to five countries and conditions are not homogeneous for all young women in all developing countries, the overarching themes were common across different settings and contexts, supporting the potential transferability of interventions to improve reproductive health.
Increasing modern contraceptive method use requires community-wide, multifaceted interventions and the combined provision of information, life skills, support and access to youth-friendly services. Interventions should aim to counter negative perceptions of modern contraceptive methods and the dual role of condoms for contraception and STI prevention should be exploited, despite the challenges involved.

353 citations

Journal ArticleDOI
15 Jun 2002-BMJ
TL;DR: Compared with conventional sex education this specially designed intervention did not reduce sexual risk taking in adolescents and Lack of behavioural effect could not be linked to differential quality of delivery of intervention.
Abstract: OBJECTIVE: To determine whether a theoretically based sex education programme for adolescents (SHARE) delivered by teachers reduced unsafe sexual intercourse compared with current practice. DESIGN: Cluster randomised trial with follow up two years after baseline (six months after intervention). A process evaluation investigated the delivery of sex education and broader features of each school. SETTING: Twenty five secondary schools in east Scotland. PARTICIPANTS: 8430 pupils aged 13-15 years; 7616 completed the baseline questionnaire and 5854 completed the two year follow up questionnaire. INTERVENTION: SHARE programme (intervention group) versus existing sex education (control programme). MAIN OUTCOME MEASURES: Self reported exposure to sexually transmitted disease, use of condoms and contraceptives at first and most recent sexual intercourse, and unwanted pregnancies. RESULTS: When the intervention group was compared with the conventional sex education group in an intention to treat analysis there were no differences in sexual activity or sexual risk taking by the age of 16 years. However, those in the intervention group reported less regret of first sexual intercourse with most recent partner (young men 9.9% difference, 95% confidence interval -18.7 to -1.0; young women 7.7% difference, -16.6 to 1.2). Pupils evaluated the intervention programme more positively, and their knowledge of sexual health improved. Lack of behavioural effect could not be linked to differential quality of delivery of intervention. CONCLUSIONS: Compared with conventional sex education this specially designed intervention did not reduce sexual risk taking in adolescents.
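The percentage differences and 95% confidence intervals reported above (e.g. "9.9% difference, 95% confidence interval -18.7 to -1.0") are standard two-proportion comparisons. A minimal sketch of how such figures are computed, using a Wald interval and hypothetical counts rather than the actual SHARE trial data:

```python
import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    """Difference in proportions (p1 - p2) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts, not the trial's data: 30/400 intervention
# pupils vs 60/420 controls reporting regret of first intercourse.
diff, low, high = prop_diff_ci(30, 400, 60, 420)
print(f"{diff:+.1%} ({low:+.1%} to {high:+.1%})")  # -> -6.8% (-11.0% to -2.6%)
```

A negative difference with an interval excluding zero, as for the young men above, indicates less reported regret in the intervention group; an interval crossing zero, as for the young women, does not rule out no effect.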

278 citations

Journal ArticleDOI
TL;DR: This paper presents a pragmatic guide to six essential Steps for Quality Intervention Development (6SQuID); the focus is on public health interventions, but the model should have wider applicability.
Abstract: Improving the effectiveness of public health interventions relies as much on the attention paid to their design and feasibility as to their evaluation. Yet, compared to the vast literature on how to evaluate interventions, there is little to guide researchers or practitioners on how best to develop such interventions in practical, logical, evidence based ways to maximise likely effectiveness. Existing models for the development of public health interventions tend to have a strong social-psychological, individual behaviour change orientation and some take years to implement. This paper presents a pragmatic guide to six essential Steps for Quality Intervention Development (6SQuID). The focus is on public health interventions but the model should have wider applicability. Once a problem has been identified as needing intervention, the process of designing an intervention can be broken down into six crucial steps: (1) defining and understanding the problem and its causes; (2) identifying which causal or contextual factors are modifiable: which have the greatest scope for change and who would benefit most; (3) deciding on the mechanisms of change; (4) clarifying how these will be delivered; (5) testing and adapting the intervention; and (6) collecting sufficient evidence of effectiveness to proceed to a rigorous evaluation. If each of these steps is carefully addressed, better use will be made of scarce public resources by avoiding the costly evaluation, or implementation, of unpromising interventions.
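The six steps above read naturally as an ordered checklist that an intervention-development team works through in sequence. As an illustrative sketch only (the data structure and function names are this sketch's own, not part of the published 6SQuID framework):

```python
# Illustrative encoding of the six 6SQuID steps; the structure and
# names below are this sketch's own, not the published framework's.
SIX_SQUID_STEPS = [
    "Define and understand the problem and its causes",
    "Identify which causal or contextual factors are modifiable",
    "Decide on the mechanisms of change",
    "Clarify how these will be delivered",
    "Test and adapt the intervention",
    "Collect sufficient evidence of effectiveness for rigorous evaluation",
]

def next_step(completed: int) -> str:
    """Return the next 6SQuID step given how many are already done."""
    if completed >= len(SIX_SQUID_STEPS):
        return "Proceed to rigorous evaluation"
    return SIX_SQUID_STEPS[completed]

print(next_step(2))  # -> Decide on the mechanisms of change
```

The point of the ordering is the paper's own: each step gates the next, so unpromising interventions are filtered out before costly evaluation.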

264 citations

Journal ArticleDOI
TL;DR: Current Medical Research Council (MRC) Population Health Sciences Research Network (PHSRN) funded work to develop guidance for process evaluations of complex public health interventions is described.
Abstract: Public health interventions aim to improve the health of populations or at-risk subgroups. Problems targeted by such interventions, such as diet and smoking, involve complex multifactorial aetiology. Interventions will often aim to address more than one cause simultaneously, targeting factors at multiple levels (eg, individual, interpersonal, organisational), and comprising several components which interact to affect more than one outcome.1 They will often be delivered in systems which respond in unpredictable ways to the new intervention.2 Recognition is growing that evaluations need to understand this complexity if they are to inform future intervention development, or efforts to apply the same intervention in another setting or population.1 Achieving this will require evaluators to move beyond a ‘does it work?’ focus, towards combining outcomes and process evaluation. There is no such thing as a typical process evaluation, with the term applied to studies which range from a few simple quantitative items on satisfaction, to complex mixed-method studies exploring issues such as the process of implementation, or contextual influences on implementation and outcomes. As recognised within MRC guidance for evaluating complex interventions, process evaluation may be used to ‘assess fidelity and quality of implementation, clarify causal mechanisms and identify contextual factors associated with variation in outcomes’.1 This paper briefly discusses each of these core aims for process evaluation, before describing current Medical Research Council (MRC) Population Health Sciences Research Network (PHSRN) funded work to develop guidance for process evaluations of complex public health interventions.

247 citations


Cited by
Journal ArticleDOI
TL;DR: Focus group methodology is introduced, ways of conducting such groups are explored and what this technique of data collection can offer researchers in general and medical sociologists in particular are examined.
Abstract: What are focus groups? How are they distinct from ordinary group discussions and what use are they anyway? This article introduces focus group methodology, explores ways of conducting such groups and examines what this technique of data collection can offer researchers in general and medical sociologists in particular. It concentrates on the one feature which inevitably distinguishes focus groups from one-to-one interviews or questionnaires – namely the interaction between research participants – and argues for the overt exploration and exploitation of such interaction in the research process.

3,872 citations

Journal ArticleDOI
19 Mar 2015-BMJ
TL;DR: New MRC guidance provides a framework for conducting and reporting process evaluation studies that will help improve the quality of decision-making in the design and testing of complex interventions.
Abstract: Process evaluation is an essential part of designing and testing complex interventions. New MRC guidance provides a framework for conducting and reporting process evaluation studies

3,662 citations

01 Jan 2005
TL;DR: The authors call for applied research to better understand service delivery processes and contextual factors in order to improve the efficiency and effectiveness of program implementation at local, state, and national levels.
Abstract: In the past few years several major reports have highlighted the gap between our knowledge of effective treatments and the services currently being received by consumers. These reports agree that we know much about interventions that are effective but make little use of them to help achieve important behavioral health outcomes for children, families, and adults nationally. This theme is repeated in reports by the Surgeon General (United States Department of Health and Human Services, 1999; 2001), the National Institute of Mental Health [NIMH] National Advisory Mental Health Council Workgroup on Child and Adolescent Mental Health Intervention Development and Deployment (2001), Bernfeld, Farrington & Leschied (2001), the Institute of Medicine (2001), and the President's New Freedom Commission on Mental Health (2003). The authors call for applied research to better understand service delivery processes and contextual factors to improve the efficiency and effectiveness of program implementation at local, state, and national levels. Our understanding of how to develop and evaluate evidence-based intervention programs has been furthered by ongoing efforts to research and refine programs and practices, to define "evidence bases", and to designate and catalogue "evidence-based programs or practices". However, the factors involved in the successful implementation of these programs are not as well understood. Current views of implementation are based on the scholarly foundations prepared by Pressman and Wildavsky's (1973) study of policy implementation, Havelock and Havelock's (1973) classic curriculum for training change agents, and Rogers' (1983; 1995) series of analyses of factors influencing decisions to choose a given innovation. These foundations were tested and further informed by the experience base generated by pioneering attempts to implement Fairweather Lodges and National Follow-Through education models, among others.
Petersilia (1990) concluded that "the ideas embodied in innovative social programs are not self-executing." Instead, what is needed is an "implementation perspective on innovation--an approach that views postadoption events as crucial and focuses on the actions of those who convert it into practice as the key to success or failure". (excerpt)

3,603 citations


Book ChapterDOI
TL;DR: This review analyzes whether realization of goal intentions is facilitated by forming an implementation intention that spells out the when, where, and how of goal striving in advance (i.e., "If situation Y is encountered, then I will initiate goal-directed behavior X!").
Abstract: Holding a strong goal intention ("I intend to reach Z!") does not guarantee goal achievement, because people may fail to deal effectively with self-regulatory problems during goal striving. This review analyzes whether realization of goal intentions is facilitated by forming an implementation intention that spells out the when, where, and how of goal striving in advance ("If situation Y is encountered, then I will initiate goal-directed behavior X!"). Findings from 94 independent tests showed that implementation intentions had a positive effect of medium-to-large magnitude (d = .65) on goal attainment. Implementation intentions were effective in promoting the initiation of goal striving, the shielding of ongoing goal pursuit from unwanted influences, disengagement from failing courses of action, and conservation of capability for future goal striving. There was also strong support for postulated component processes: implementation intention formation both enhanced the accessibility of specified opportunities and automated respective goal-directed responses. Several directions for future research are outlined.
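The d = .65 reported above is Cohen's d, the standardized mean difference used to pool effects across the 94 tests: the difference between group means divided by the pooled standard deviation. A minimal sketch with illustrative numbers (not data from the review):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Illustrative numbers only: a hypothetical implementation-intention
# group vs control on some goal-attainment score.
d = cohens_d(m1=11.95, s1=3.0, n1=50, m2=10.0, s2=3.0, n2=50)
print(round(d, 2))  # -> 0.65
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), d = .65 sits between medium and large, matching the review's description.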

2,663 citations