
Showing papers by "University of London" published in 1998


Journal ArticleDOI
TL;DR: It is shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies, and that such a checklist can provide a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses.
Abstract: OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of the pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r = 0.88) and inter-rater (r = 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r = 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies.
It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity.
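The internal-consistency statistic used for the Quality Index, Kuder-Richardson 20 (KR-20), is straightforward to compute for dichotomously scored checklist items. A minimal sketch; the item matrix below is invented for illustration, not the paper's data:

```python
# KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total)), where k is the
# number of items, p_i the proportion scoring 1 on item i, q_i = 1 - p_i,
# and var(total) the population variance of respondents' total scores.

def kr20(items):
    """items: list of respondents, each a list of 0/1 item scores."""
    k = len(items[0])                       # number of items
    n = len(items)                          # number of respondents
    p = [sum(r[i] for r in items) / n for i in range(k)]
    sum_pq = sum(pi * (1 - pi) for pi in p)
    totals = [sum(r) for r in items]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n   # population variance
    return (k / (k - 1)) * (1 - sum_pq / var_total)

# Hypothetical scores: 5 raters' yes/no judgements on 5 checklist items.
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1],
]
print(round(kr20(scores), 3))   # → 0.781
```

Values near the paper's 0.89 indicate that items tend to rise and fall together across papers; the low external-validity subscale value (0.54) means its items do not.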

6,849 citations


Journal ArticleDOI
TL;DR: In this article, a hypothesis is formulated to explain how microorganisms may become affected by gradually increasing soil metal concentrations and this is discussed in relation to defining safe or critical soil metal loadings for soil protection.
Abstract: An increasing body of evidence suggests that microorganisms are far more sensitive to heavy metal stress than soil animals or plants growing on the same soils. Not surprisingly, most studies of heavy metal toxicity to soil microorganisms have concentrated on effects where loss of microbial function can be observed and yet such studies may mask underlying effects on biodiversity within microbial populations and communities. The types of evidence which are available for determining critical metal concentrations or loadings for microbial processes and populations in agricultural soil are assessed, particularly in relation to the agricultural use of sewage sludge. Much of the confusion in deriving critical toxic concentrations of heavy metals in soils arises from comparison of experimental results based on short-term laboratory ecotoxicological studies with results from monitoring of long-term exposures of microbial populations to heavy metals in field experiments. The laboratory studies in effect measure responses to immediate, acute toxicity (disturbance) whereas the monitoring of field experiments measures responses to long-term chronic toxicity (stress) which accumulates gradually. Laboratory ecotoxicological studies are the most easily conducted and by far the most numerous, but are difficult to extrapolate meaningfully to toxic effects likely to occur in the field. Using evidence primarily derived from long-term field experiments, a hypothesis is formulated to explain how microorganisms may become affected by gradually increasing soil metal concentrations and this is discussed in relation to defining “safe” or “critical” soil metal loadings for soil protection.

1,887 citations


Journal ArticleDOI
TL;DR: The Team Climate Inventory (TCI), as presented in this paper, is a multi-dimensional measure of facet-specific climate for innovation within groups at work, assessed at the level of the proximal work group.
Abstract: Summary This paper reports the development and psychometric validation of a multi-dimensional measure of facet-specific climate for innovation within groups at work: the Team Climate Inventory (TCI). Brief reviews of the organizational climate and work group innovation literatures are presented initially, and the need for measures of facet-specific climate at the level of the proximal work group is asserted. The four-factor theory of facet-specific climate for innovation, which was derived from these reviews, is described, as are the procedures used to operationalize this model into the original version of the measure. Data attesting to the underlying factor structure, internal homogeneity, predictive validity and factor replicability across groups of the summarized measure are presented. An initial sample of 155 individuals from 27 hospital management teams provided data for the exploratory factor analysis of this measure. Responses from 121 further groups in four occupations (35 primary health care teams, 42 social services teams, 20 psychiatric teams and 24 oil company teams; total N = 971) were used to apply confirmatory factor analysis techniques. This five-factor, 38-item summarized version demonstrates robust psychometric properties, with acceptable levels of reliability and validity. Potential applications of this measure are described and the implications of these findings for the measurement of proximal work group climate are discussed. © 1998 John Wiley & Sons, Ltd.

1,786 citations


Journal Article
TL;DR: This assessment aims to identify the factors that affect the decisions that emerge from consensus development methods and to assess the implications of the findings for the development of clinical guidelines.
Abstract: Record Status: This is a bibliographic record of a published health technology assessment from a member of INAHTA. No evaluation of the quality of this assessment has been made for the HTA database. Citation: Murphy M K, Black N A, Lamping D L, McKee C M, Sanderson C F B, Askham J, Marteau T. Consensus development methods, and their use in clinical guideline development: a review. Health Technology Assessment 1998; 2(3): 1-88. Authors' objectives: To identify the factors that affect the decisions that emerge from consensus development methods; to assess the implications of the findings for the development of clinical guidelines; and to recommend further methodological research for improving the use of consensus development methods as a basis for guideline production.

1,597 citations


Journal ArticleDOI
TL;DR: The correlations between self-report SDQ scores and teacher- or parent-rated SDQ scores compared favourably with the average cross-informant correlations in previous studies of a range of measures.
Abstract: The self-report version of the Strengths and Difficulties Questionnaire (SDQ) was administered to two samples of 11-16 year olds: 83 young people in the community and 116 young people attending a mental health clinic. The questionnaire discriminated satisfactorily between the two samples. For example, the clinic mean for the total difficulties score was 1.4 standard deviations above the community mean, with clinic cases being over six times more likely to have a score in the abnormal range. The correlations between self-report SDQ scores and teacher- or parent-rated SDQ scores compared favourably with the average cross-informant correlations in previous studies of a range of measures. The self-report SDQ appears promising and warrants further evaluation.

1,505 citations


Journal ArticleDOI
22 Jan 1998-Nature
TL;DR: The data reveal that NO2− may regulate inflammatory processes through oxidative mechanisms, perhaps by contributing to the tyrosine nitration and chlorination observed in vivo.
Abstract: Nitric oxide (.NO) plays a central role in the pathogenesis of diverse inflammatory and infectious disorders. The toxicity of .NO is thought to be engendered, in part, by its reaction with superoxide (O2.-), yielding the potent oxidant peroxynitrite (ONOO-). However, evidence for a role of ONOO- in vivo is based largely upon detection of 3-nitrotyrosine in injured tissues. We have recently demonstrated that nitrite (NO2-), a major end-product of .NO metabolism, readily promotes tyrosine nitration through formation of nitryl chloride (NO2Cl) and nitrogen dioxide (.NO2) by reaction with the inflammatory mediators hypochlorous acid (HOCl) or myeloperoxidase. We now show that activated human polymorphonuclear neutrophils convert NO2- into NO2Cl and .NO2 through myeloperoxidase-dependent pathways. Polymorphonuclear neutrophil-mediated nitration and chlorination of tyrosine residues or 4-hydroxyphenylacetic acid is enhanced by addition of NO2- or by fluxes of .NO. Addition of 15NO2- led to 15N enrichment of nitrated phenolic substrates, confirming its role in polymorphonuclear neutrophil-mediated nitration reactions. Polymorphonuclear neutrophil-mediated inactivation of endothelial cell angiotensin-converting enzyme was exacerbated by NO2-, illustrating the physiological significance of these reaction pathways to cellular dysfunction. Our data reveal that NO2- may regulate inflammatory processes through oxidative mechanisms, perhaps by contributing to the tyrosine nitration and chlorination observed in vivo.

1,450 citations


Journal ArticleDOI
TL;DR: In this paper, the author presents an overview of the cross-sectional survey as a research design in psychiatry.
Abstract: (1998). The cross-sectional survey. International Review of Psychiatry: Vol. 10, No. 4, pp. 272-277.

977 citations


Journal ArticleDOI
TL;DR: A universalist approach to the cross-cultural adaptation of HRQoL instruments requires that six types of equivalence be taken into account and this approach requires careful qualitative research in target cultures, particularly in the assessment of conceptual equivalence.
Abstract: The health-related quality of life (HRQoL) literature presents a confused picture of what ‘equivalence’ in the cross-cultural use of HRQoL questionnaires means and how it can be assessed. Much of this confusion can be attributed to the ‘absolutist’ approach to the cross-cultural adaptation of HRQoL questionnaires. The purpose of this paper is to provide a model of equivalence from a universalist perspective and to link this to the translation and adaptation of HRQoL questionnaires. The model evolved from reviews of the HRQoL and other literatures, interviews and discussions with researchers working in HRQoL and related areas, and practical experience in the adaptation and development of HRQoL instruments. The model incorporates six key types of equivalence. For each type of equivalence the paper provides a definition, proposes various strategies for examining whether and how types of equivalence can be achieved, illustrates the relationships between them, and suggests the order in which they should be tested. The principal conclusions are: (1) that a universalist approach to the cross-cultural adaptation of HRQoL instruments requires that six types of equivalence be taken into account; (2) that these are sufficient to describe and explain the nature of the cross-cultural adaptation process; (3) that this approach requires careful qualitative research in target cultures, particularly in the assessment of conceptual equivalence; and (4) that this qualitative work will provide information which will be fundamental in deciding whether to adapt an existing instrument and which instrument to adapt. It should also result in a more sensitive adaptation of existing instruments and provide valuable information for interpreting the results obtained using HRQoL instruments in the target culture.

914 citations


Book
01 Apr 1998
TL;DR: The FinCEN AI System: Finding Financial Crimes in a Large Database of Cash Transactions and Adding Value with Intelligent Agents in Financial Services are highlighted.
Abstract: 1: Introductory Papers- 1 Applications of Intelligent Agents- 2 A Brief Introduction to Software Agent Technology- 3 Agent Software for Near-Term Success in Distributed Applications- 2: Vision Papers- 4 Practical Design of Intelligent Agent Systems- 5 Vendors of Intelligent Agent Technologies: A Market Overview- 6 Brokering the Info-Underworld- 7 Personal Agents: A Walk on the Client Side- 3: Systems and Their Applications- 8 Rational Software Agents: From Theory to Practice- 9 Agent-Oriented Techniques for Traffic and Manufacturing Applications: Progress Report- 10 Co-operating Agents: Concepts and Applications- 11 Intelligent Agents in Telecommunications- 12 Managing Heterogeneous Transaction Workflows with Co-operating Agents- 13 Software Technologies for Building Agent Based Systems in Telecommunication Networks- 14 Intelligent Agents in Portfolio Management- 15 The FinCEN AI System: Finding Financial Crimes in a Large Database of Cash Transactions- 16 Adding Value with Intelligent Agents in Financial Services

877 citations


Journal ArticleDOI
25 Jul 1998-BMJ
TL;DR: This study provides by far the most persuasive evidence of a real association between size at birth and mortality from ischaemic heart disease in men, which cannot be explained by methodological artefact or socioeconomic confounding, and strongly suggests that it is variation in fetal growth rate rather than size at birth that is aetiologically important.
Abstract: Objective: To establish whether fetal growth rate (as distinct from size at birth) is associated with mortality from ischaemic heart disease. Design: Cohort study based on uniquely detailed obstetric records with 97% follow up over the entire life course and linkage to census data in adult life. Subjects: All 14 611 babies delivered at the Uppsala Academic Hospital, Sweden, during 1915-29 followed up to end of 1995. Main outcome measures: Mortality from ischaemic heart disease and other causes. Results: Cardiovascular disease showed an inverse association with birth weight for both men and women, although this was significant only for men. In men a 1000 g increase in birth weight was associated with a proportional reduction in the rate of ischaemic heart disease of 0.77 (95% confidence interval 0.67 to 0.90). Adjustment for socioeconomic circumstances at birth and in adult life led to slight attenuation of this effect. Relative to the lowest fourth of birth weight for gestational age, mortality from ischaemic heart disease in men in the second, third, and fourth fourths was 0.81 (0.66 to 0.98), 0.63 (0.50 to 0.78), and 0.67 (0.54 to 0.82), respectively. The inclusion of birth weight per se and birth weight for gestational age in the same model strengthened the association with birth weight for gestational age but removed the association with birth weight. Conclusions: This study provides by far the most persuasive evidence of a real association between size at birth and mortality from ischaemic heart disease in men, which cannot be explained by methodological artefact or socioeconomic confounding. It strongly suggests that it is variation in fetal growth rate rather than size at birth that is aetiologically important.
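The reported rate ratio of 0.77 per 1000 g increase in birth weight can be rescaled to other increments under the usual log-linear assumption, which is a quick sanity check when comparing studies that use different units. A sketch; the alternative increments are illustrative, not from the paper:

```python
# Rescaling a log-linear rate ratio: if the rate ratio is RR per `unit`
# of exposure, the ratio for an increment `delta` is RR ** (delta / unit),
# i.e. exp((delta / unit) * ln(RR)).
import math

def rescale_rr(rr_per_unit, delta, unit=1000.0):
    return math.exp((delta / unit) * math.log(rr_per_unit))

print(round(rescale_rr(0.77, 500), 3))    # per 500 g  → 0.877
print(round(rescale_rr(0.77, 2000), 3))   # per 2000 g → 0.593
```

The log-linear form is what makes "a 1000 g increase gives a proportional reduction of 0.77" meaningful at any point on the birth-weight scale.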

818 citations


Journal ArticleDOI
TL;DR: This review suggests that forms of physical activity play serve primarily immediate developmental functions, with consecutive age peaks: rhythmic stereotypies peaking in infancy, exercise play peaking during the preschool years, and rough-and-tumble play peaking in middle childhood.
Abstract: In this review, we consider the nature and possible developmental functions of physical activity play, defined as a playful context combined with a dimension of physical vigor. We distinguish 3 kinds of physical activity play, with consecutive age peaks: rhythmic stereotypies peaking in infancy, exercise play peaking during the preschool years, and rough-and-tumble play peaking in middle childhood. Gender differences (greater prevalence in males) characterize the latter 2 forms. Function is considered in terms of beneficial immediate and deferred consequences in physical, cognitive, and social domains. Whereas most theories assume that children's play has deferred benefits, we suggest that forms of physical activity play serve primarily immediate developmental functions. Rhythmic stereotypies in infancy are hypothesized to improve control of specific motor patterns. Exercise play is hypothesized to function primarily for strength and endurance training; less clear evidence exists for possible benefits for fat reduction and thermoregulation. In addition, there may be cognitive benefits of exercise play that we hypothesize to be largely incidental to its playful or physical nature. Rough-and-tumble play has a distinctive social component; we hypothesize that it serves primarily dominance functions; evidence for benefits to fighting skills or to emotional coding are more equivocal. Further research is indicated, given the potentially important implications for children's education, health, and development.

Journal ArticleDOI
TL;DR: The field is advancing with the articulation of the linkages between human activity, regional and global environmental change, reduction in ecological services and the consequences for human health, economic opportunity and human communities.
Abstract: Evaluating ecosystem health in relation to the ecological, economic and human health spheres requires integrating human values with biophysical processes, an integration that has been explicitly avoided by conventional science. The field is advancing with the articulation of the linkages between human activity, regional and global environmental change, reduction in ecological services and the consequences for human health, economic opportunity and human communities. Increasing our understanding of these interactions will involve more active collaboration between the ecological, social and health sciences. In this, ecologists will have substantive and catalytic roles.

Book ChapterDOI
01 Apr 1998
TL;DR: The aim in this article is to help the reader to understand why agent technology is seen as a fundamentally important new tool for building such a wide array of systems.
Abstract: Intelligent agents are a new paradigm for developing software applications. More than this, agent-based computing has been hailed as ‘the next significant breakthrough in software development’ (Sargent, 1992), and ‘the new revolution in software’ (Ovum, 1994). Currently, agents are the focus of intense interest on the part of many sub-fields of computer science and artificial intelligence. Agents are being used in an increasingly wide variety of applications, ranging from comparatively small systems such as email filters to large, open, complex, mission-critical systems such as air traffic control. At first sight, it may appear that such extremely different types of system can have little in common. And yet this is not the case: in both, the key abstraction used is that of an agent. Our aim in this article is to help the reader to understand why agent technology is seen as a fundamentally important new tool for building such a wide array of systems. More precisely, our aims are five-fold: to introduce the reader to the concept of an agent and agent-based systems, to help the reader to recognize the domain characteristics that indicate the appropriateness of an agent-based solution, to introduce the main application areas in which agent technology has been successfully deployed to date, to identify the main obstacles that lie in the way of the agent system developer, and finally to provide a guide to the remainder of this book.

Journal ArticleDOI
TL;DR: The extent of developmental deficit and catch-up following adoption after severe global early privation was examined at 4 years in a sample of 111 Romanian children who came to the U.K. before the age of 2 years.
Abstract: The extent of developmental deficit and catch-up following adoption after severe global early privation was examined at 4 years in a sample of 111 Romanian children who came to the U.K. before the age of 2 years, and compared with respect to their functioning at the same age to a sample of 52 U.K. adopted children placed before the age of 6 months. The measures at 4 years included height, head circumference, and general cognitive level (assessed on both the McCarthy and Denver Scales). The children from Romania were severely developmentally impaired at the time of U.K. entry, with about half below the third percentile on height, on weight, on head circumference, and on developmental quotient. Many were also in a poor physical state with recurrent intestinal and respiratory infections. The catch-up in both physical growth and cognitive level appeared nearly complete at 4 years for those children who came to the U.K. before the age of 6 months, despite the fact that their background prior to U.K. entry was similar to the children who came to the U.K. when older. The developmental catch-up was also impressive, but not complete, in those placed after 6 months of age. The mean McCarthy General Cognitive Index was 92 compared with 109 for the within-U.K. adoptees. The strongest predictor of level of cognitive functioning at 4 years was the children's age at entry to the U.K. It was concluded that the remaining cognitive deficit was likely to be a consequence of gross early privation, with psychological privation probably more important than nutritional privation. A further follow-up at age 6 years will determine whether there is continuing recovery after 4 years.

Journal ArticleDOI
TL;DR: The cognitive effects of acute and chronic moderate intake of ethanol is reviewed, and although a number of studies have noted a measurable diminution in neuropsychologic parameters in habitual consumers of moderate amounts of ethanol, others have not found such changes.
Abstract: The concept of moderate consumption of ethanol (beverage alcohol) has evolved over time from considering this level of intake to be nonintoxicating and noninjurious, to encompassing levels defined as "statistically" normal in particular populations, and the public health-driven concepts that define moderate drinking as the level corresponding to the lowest overall rate of morbidity or mortality in a population. The various approaches to defining moderate consumption of ethanol provide for a range of intakes that can result in blood ethanol concentrations ranging from 5 to 6 mg/dl, to levels of over 90 mg/dl (i.e., approximately 20 mM). This review summarizes available information regarding the effects of moderate consumption of ethanol on the adult and the developing nervous systems. The metabolism of ethanol in the human is reviewed to allow for proper appreciation of the important variables that interact to influence the level of exposure of the brain to ethanol once ethanol is orally consumed. At the neurochemical level, the moderate consumption of ethanol selectively affects the function of GABA, glutamatergic, serotonergic, dopaminergic, cholinergic, and opioid neuronal systems. Ethanol can affect these systems directly, and/or the interactions between and among these systems become important in the expression of ethanol's actions. The behavioral consequences of ethanol's actions on brain neurochemistry, and the neurochemical effects themselves, are very much dose- and time-related, and the collage of ethanol's actions can change significantly even on the rising and falling phases of the blood ethanol curve. The behavioral effects of moderate ethanol intake can encompass events that the human or other animal can perceive as reinforcing through either positive (e.g., pleasurable, activating) or negative (e.g., anxiolysis, stress reduction) reinforcement mechanisms. 
Genetic factors and gender play an important role in the metabolism and behavioral actions of ethanol, and doses of ethanol producing pleasurable feelings, activation, and reduction of anxiety in some humans/animals can have aversive, sedative, or no effect in others. Research on the cognitive effects of acute and chronic moderate intake of ethanol is reviewed, and although a number of studies have noted a measurable diminution in neuropsychologic parameters in habitual consumers of moderate amounts of ethanol, others have not found such changes. Recent studies have also noted some positive effects of moderate ethanol consumption on cognitive performance in the aging human. The moderate consumption of ethanol by pregnant women can have significant consequences on the developing nervous system of the fetus. Consumption of ethanol during pregnancy at levels considered to be in the moderate range can generate fetal alcohol effects (behavioral, cognitive anomalies) in the offspring. A number of factors--including gestational period, the periodicity of the mother's drinking, genetic factors, etc.--play important roles in determining the effect of ethanol on the developing central nervous system. A series of recommendations for future research endeavors, at all levels, is included with this review as part of the assessment of the effects of moderate ethanol consumption on the central nervous system.

Book ChapterDOI
04 Jul 1998
TL;DR: Within the ATAL community, the belief-desire-intention (BDI) model has come to be possibly the best known and best studied model of practical reasoning agents.
Abstract: Within the ATAL community, the belief-desire-intention (BDI) model has come to be possibly the best known and best studied model of practical reasoning agents. There are several reasons for its success, but perhaps the most compelling are that the BDI model combines a respectable philosophical model of human practical reasoning, (originally developed by Michael Bratman [1]), a number of implementations (in the IRMA architecture [2] and the various PRS-like systems currently available [7]), several successful applications (including the now-famous fault diagnosis system for the space shuttle, as well as factory process control systems and business process management [8]), and finally, an elegant abstract logical semantics, which have been taken up and elaborated upon widely within the agent research community [14, 16].

Journal ArticleDOI
TL;DR: A result is presented that allows one to trade off errors on the training sample against improved generalization performance, and a more general result in terms of "luckiness" functions, which provides a quite general way for exploiting serendipitous simplicity in observed data to obtain better prediction accuracy from small training sets.
Abstract: The paper introduces some generalizations of Vapnik's (1982) method of structural risk minimization (SRM). As well as making explicit some of the details on SRM, it provides a result that allows one to trade off errors on the training sample against improved generalization performance. It then considers the more general case when the hierarchy of classes is chosen in response to the data. A result is presented on the generalization performance of classifiers with a "large margin". This theoretically explains the impressive generalization performance of the maximal margin hyperplane algorithm of Vapnik and co-workers (which is the basis for their support vector machines). The paper concludes with a more general result in terms of "luckiness" functions, which provides a quite general way for exploiting serendipitous simplicity in observed data to obtain better prediction accuracy from small training sets. Four examples are given of such functions, including the Vapnik-Chervonenkis (1971) dimension measured on the sample.
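The "large margin" quantity analysed here is the geometric margin, the smallest signed distance min_i y_i(w·x_i + b)/||w|| from a training point to the hyperplane; the maximal margin hyperplane of Vapnik and co-workers is the (w, b) that maximises it. A minimal sketch on an invented separable sample; the data and the hyperplane are illustrative, not from the paper:

```python
# Geometric margin of a separating hyperplane (w, b) on labelled 2-D data.
# A positive margin means every point is on the correct side; the larger
# the margin, the stronger the generalization bound in the paper.
import math

def geometric_margin(w, b, X, y):
    norm = math.hypot(w[0], w[1])
    return min(yi * (w[0] * xi[0] + w[1] * xi[1] + b) / norm
               for xi, yi in zip(X, y))

# Toy linearly separable sample (labels +1 / -1).
X = [(2.0, 2.0), (3.0, 3.0), (-2.0, -2.0), (-3.0, -1.0)]
y = [+1, +1, -1, -1]

# For this sample, the hyperplane w = (1, 1), b = 0 separates the classes.
print(geometric_margin((1.0, 1.0), 0.0, X, y))
```

The paper's point is that when this minimum is large relative to the scale of the data, good test performance follows even though the class of hyperplanes itself is rich.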

Journal ArticleDOI
TL;DR: Secular and cohort trends in mortality from cancer in Scotland during 1953-93, and incidence during 1960-90, were analysed using individual records from the national mortality and registration files.
Abstract: Secular and cohort trends in mortality from cancer in Scotland during 1953-93, and incidence during 1960-90, were analysed using individual records from the national mortality and registration files. For certain cancer sites, the secular analyses of mortality were extended back to 1911 by use of published data. Mortality from cancer at older ages in Scotland has increased over the last 40 years. In each sex, this trend has been dominated by the effects of smoking: all-cancer rates and rates of lung cancer, now the most common fatal cancer in men and in women in Scotland, reached a peak in the cohort of men born at the turn of the century and the cohort of women born in the 1920s. For much of the period, the Scottish all-age rates of lung cancer were the highest reported in the world; they are now decreasing on a secular basis in men, but are still increasing in women. There have also been large increases at older ages in recent years in the incidence and mortality rates for cancer of the prostate, bladder cancer, nervous system cancer, non-Hodgkin's lymphoma, myeloma and leukaemia; for each there is likely to be a considerable artefactual element to the increase, with differing degrees of possibility that there may in addition be an element of real increase. Substantial decreases in mortality at all ages have occurred for stomach and colorectal cancers and substantial increases at all ages for pleural cancer and melanoma. Rates of mortality from breast cancer, the most common cancer in women in Scotland, have generally increased over the past 80 years; a temporary cessation in this upward trend occurred in the years during and after the Second World War, and recently rates have turned downward, probably at least in part because of better treatment. Mortality from ovarian cancer, the second most common reproductive-related female tumour in Scotland, has also increased at older ages.
At younger ages, mortality from cancer in Scotland has decreased, especially in men, whereas incidence has not. This divergence, which has been a consequence of better treatment, has occurred especially for cancers of the testis and ovary, Hodgkin's disease and leukaemia. There have been increases at young adult ages, however, in both mortality from and incidence of oral and pharyngeal, oesophageal and laryngeal cancers in men, and melanoma and non-Hodgkin's lymphoma in each sex. Cervical cancer rates at young ages also increased, but this trend has reversed for incidence in the most recent birth cohorts. Incidence rates have also increased for testicular cancer in young adults and leukaemia in children. With the possible exceptions of non-Hodgkin's lymphoma and childhood leukaemia, the increasing rates are likely largely to reflect real rises in incidence, and they highlight the need for investigation of the causes of these cancers, and, when causes are known, for preventive action.

Journal ArticleDOI
TL;DR: Methods are available to correct for bias (but not generally power loss) due to measurement error, if information on the magnitude and type of error is available, however, these methods can be complicated to use, and should be used cautiously as "correction" can magnify confounding if it is present.
Abstract: Random error (misclassification) in exposure measurements usually biases a relative risk, regression coefficient, or other effect measure towards the null value (no association). The most important exception is Berkson type error, which causes little or no bias. Berkson type error arises, in particular, due to use of group average exposure in place of individual values. Random error in exposure measurements, Berkson or otherwise, reduces the power of a study, making it more likely that real associations are not detected. Random error in confounding variables compromises the control of their effect, leaving residual confounding. Random error in a variable that modifies the effect of exposure on health--for example, an indicator of susceptibility--tends to diminish the observed modification of effect, but error in the exposure can create a spurious appearance of modification. Methods are available to correct for bias (but not generally power loss) due to measurement error, if information on the magnitude and type of error is available. These methods can be complicated to use, however, and should be used cautiously as "correction" can magnify confounding if it is present.
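The contrast the abstract draws between classical and Berkson error can be seen in a short simulation. The sketch below is illustrative only (it is not from the paper): with classical error the observed exposure is the true value plus noise and the regression slope is attenuated toward the null by the factor var(x)/(var(x)+var(u)), whereas with Berkson error the true exposure is the assigned (e.g. group-average) value plus noise and the slope is essentially unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta = 2.0  # true effect of exposure on outcome

# True exposure and outcome: y = beta * x + noise
x = rng.normal(0.0, 1.0, n)
y = beta * x + rng.normal(0.0, 1.0, n)

def slope(x_obs, y_obs):
    # OLS slope of y_obs on x_obs
    return np.cov(x_obs, y_obs)[0, 1] / np.var(x_obs)

# Classical error: observed = true + noise -> attenuation toward the null
sigma_u = 1.0
x_classical = x + rng.normal(0.0, sigma_u, n)

# Berkson error: true = assigned + noise (e.g. group-average exposure)
x_assigned = rng.normal(0.0, 1.0, n)
x_true_b = x_assigned + rng.normal(0.0, sigma_u, n)
y_b = beta * x_true_b + rng.normal(0.0, 1.0, n)

print(slope(x, y))            # ~ 2.0 (error-free exposure)
print(slope(x_classical, y))  # ~ 2.0 * 1/(1+1) = 1.0 (attenuated)
print(slope(x_assigned, y_b)) # ~ 2.0 (Berkson: little or no bias)
```

Note that Berkson error still inflates the residual variance, which is the power loss the abstract mentions even though the slope itself is unbiased.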

Journal ArticleDOI
TL;DR: Based on the observed increased risk of malignant development, OLP patients should be offered regular follow-up examinations two to four times annually and asked to report any changes in their lesions and/or symptoms.
Abstract: Lichen planus (LP) is a relatively common disorder of the stratified squamous epithelia, which is, in many ways, an enigma. This paper is the consensus outcome of a workshop held in Switzerland in 1995, involving a selection of clinicians and scientists with an interest in the condition and its management. The oral (OLP) eruptions usually have a distinct clinical morphology and characteristic distribution, but OLP may also present a confusing array of patterns and forms, and other disorders may clinically simulate OLP. Lesions may affect other mucosae and/or skin. Lichen planus is probably of multifactorial origin, sometimes induced by drugs or dental materials, often idiopathic, and with an immunopathogenesis involving T-cells in particular. The etiopathogenesis appears to be complex, with interactions between and among genetic, environmental, and lifestyle factors, but much has now been clarified about the mechanisms involved, and interesting new associations, such as with liver disease, have emerged. T...


Journal ArticleDOI
TL;DR: It is concluded that women with polycystic ovary syndrome do not have markedly higher than average mortality from circulatory disease, even though the condition is strongly associated with diabetes, lipid abnormalities, and other cardiovascular risk factors.

Journal ArticleDOI
18 Apr 1998-BMJ
TL;DR: The example of asthma treatment is used to illustrate how qualitative methods can broaden the scope of evidence based medicine and help bridge the gap between scientific evidence and clinical practice.
Abstract: Qualitative research may seem unscientific and anecdotal to many medical scientists. However, as the critics of evidence based medicine are quick to point out, medicine itself is more than the application of scientific rules.1 Clinical experience, based on personal observation, reflection, and judgment, is also needed to translate scientific results into treatment of individual patients.2 Personal experience is often characterised as being anecdotal, ungeneralisable, and a poor basis for making scientific decisions. However, it is often a more powerful persuader than scientific publication in changing clinical practice,3-5 as illustrated by the occasional series “A patient who changed my practice” in the BMJ. 6 In an attempt to widen the scope of evidence based medicine, recent workshops have included units on other subjects, including economic analysis and qualitative research.7 However, to do so is to move beyond the discipline of clinical epidemiology that underpins evidence based medicine. Qualitative research, in particular, addresses research questions that are different from those considered by clinical epidemiology. Qualitative research can investigate practitioners' and patients' attitudes, beliefs, and preferences, and the whole question of how evidence is turned into practice. The value of qualitative methods lies in their ability to pursue systematically the kinds of research questions that are not easily answerable by experimental methods. We use the example of asthma treatment to illustrate how qualitative methods can broaden the scope of evidence based medicine. Although there is consensus over evidence based practice in the treatment of asthma,8 questions remain about general practitioners' use of clinical guidelines and patients' use of prescribed medication.9 #### Summary points Qualitative methods can help bridge the gap between scientific evidence and clinical practice …

Journal ArticleDOI
TL;DR: The results suggest the “EPR effect” in solid tumor primarily arises from the difference in clearance rate between the solid tumor and the normal tissues after initial penetration of the polymers into these tissues.
Abstract: The objective of this study was to investigate the molecular weight (MW) and time-dependence of the phenomenon termed "the enhanced permeability and retention" (EPR) effect in solid tumor, in particular to determine and define the early phase accumulation of macromolecules in tumor and normal tissues and the relationship between blood concentration and tissue clearance. As a model, radioiodinated N-(2-hydroxypropyl)methacrylamide (HPMA) copolymers of MW ranging from 4.5 K to 800 K were administered i.v. to mice bearing sarcoma 180 tumor. Within 10 min all HPMA copolymers accumulated effectively in the tumor regardless of MW (1.0-1.5% of injected dose per g of tumor). However, higher MW copolymers (> 50 K) showed significantly increased tumor accumulation after 6 h, while the lower MW copolymers (< 40 K) were cleared rapidly from tumor tissue due to rapid diffusion back into the bloodstream. Blood clearance was also MW-dependent; the lower MW copolymers displayed rapid clearance, with kidney radioactivity of the copolymers of MW < 20 K representing 24% of injected dose per g kidney at 1 min after i.v. administration. Within 10 min these copolymers passed through the kidney and were excreted in the urine. Higher MW copolymers consistently showed kidney levels of 3-5% dose per g kidney in the early phase with no time-dependent accumulation in kidney. There was also no progressive accumulation in muscle or liver, regardless of polymer MW. These results suggest the "EPR effect" in solid tumor primarily arises from the difference in clearance rate between the solid tumor and the normal tissues after initial penetration of the polymers into these tissues.

Journal ArticleDOI
TL;DR: The authors used Latent Variable Analysis (LVA) to assess whether practices identified with high commitment management do form a unity, and identified four progressive styles of HCM.
Abstract: Are the practices widely associated with the high commitment or involvement model, such as job flexibility and minimal status differences, actually used in conjunction with each other? Or rather are they being used, as some commentators speculate, in a fragmented or ad hoc manner? The authors use Latent Variable Analysis to assess whether practices identified with high commitment management do form a unity. They are simultaneously attempting to see if such practices can be used as indicators for measuring an underlying high commitment orientation on the part of management. The analysis uses data from the 1990 UK Workplace Industrial Relations Survey and its sister survey, the Employers' Manpower and Skills Practices Survey, on the use of a range of high commitment practices across the whole economy. The evidence suggests that there is an identifiable pattern to the use of high commitment practices. Four progressive styles of high commitment management (HCM) were discovered. Though the use of it in its ent...

Journal ArticleDOI
TL;DR: A general algebraic formulation for a wide range of combinatorial problems including Satisfiability, Graph Colorability and Graph Isomorphism is described, and it is demonstrated that the complexity of solving this decision problem is determined in many cases by simple algebraic properties of the relational structures involved.
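The algebraic formulation referred to here casts such decision problems as homomorphism problems between relational structures. As a hedged illustration (the notation and framing are mine, not the paper's), graph 3-colourability is exactly the question of whether a homomorphism exists from the input graph to the complete graph K3, which a brute-force search over vertex assignments can check for small instances:

```python
from itertools import product

def has_homomorphism(edges, n_vertices, target_edges, target_size):
    """Brute-force test for a graph homomorphism: a map from the input
    vertices to the target vertices that sends every edge to an edge."""
    # Treat the target as undirected by closing its edge set under reversal
    target = set(target_edges) | {(b, a) for a, b in target_edges}
    for assign in product(range(target_size), repeat=n_vertices):
        if all((assign[u], assign[v]) in target for u, v in edges):
            return True
    return False

# K3, the complete graph on 3 vertices: homomorphism to K3 == 3-colouring
k3 = [(0, 1), (0, 2), (1, 2)]

square = [(0, 1), (1, 2), (2, 3), (3, 0)]          # bipartite, so 3-colourable
k4 = [(a, b) for a in range(4) for b in range(a)]  # complete, needs 4 colours

print(has_homomorphism(square, 4, k3, 3))  # True
print(has_homomorphism(k4, 4, k3, 3))      # False
```

Satisfiability fits the same mould with clauses as constraint relations; the paper's point is that tractability of such problems is governed by algebraic properties of the target structure.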

Journal ArticleDOI
TL;DR: Results indicate an abnormality in social orientation in autism even at the early age of 20 months, with infants with autism showing fewer shifts of attention between an object and a person, and between person and person, than did the two control groups.
Abstract: Spontaneous shifts of attention were observed in autistic, typically developing, and nonautistic developmentally delayed infants. Three types of attention-shifting behaviour were observed: (1) between an object and another object, (2) between an object and a person, and (3) between a person and another person. The two control groups shifted attention more frequently between an object and a person than between an object and another object or between a person and another person. The infants with autism showed a different pattern, shifting attention between an object and another object more than any other type of shift. Furthermore, infants with autism showed fewer shifts of attention between an object and a person, and between person and person, than did the two control groups. They also spent less time overall looking at people and looked more briefly at people and for longer durations at objects, compared to the two control groups. These results indicate an abnormality in social orientation in autism even at the early age of 20 months.

Journal ArticleDOI
TL;DR: The proposed contact detection algorithm involves no binary search at any stage, and the performance of the algorithm in terms of total detection time is not influenced by packing density, while memory requirements are insignificant.
Abstract: Large-scale discrete element simulations, as well as a whole range of related problems, involve contact of a large number of separate bodies. In this context an efficient and robust contact detection algorithm is necessary. A number of contact detection algorithms with total detection time (CPU time needed to detect all couples close to each other) proportional to Nln(N) (where N is the total number of separate bodies) have been reported in recent years. In this work a contact detection algorithm with total detection time proportional to N is reported. The algorithm is termed NBS, which stands for no binary search. In other words, the proposed algorithm involves no binary search at any stage. In addition the performance of the algorithm in terms of total detection time is not influenced by packing density, while memory requirements are insignificant. The only limitation of the algorithm is its applicability to the systems comprising bodies of similar size. © 1998 John Wiley & Sons, Ltd.
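The linear-time behaviour described here can be conveyed by a simple grid-binning sketch in the spirit of NBS (this is a simplified illustration, not the published algorithm): space is divided into cells the size of the largest body, each body is hashed to its cell in O(1), and contacts can then only occur within a cell or between neighbouring cells, so no sorting or binary search is needed.

```python
from collections import defaultdict

def detect_contacts(centers, diameter):
    """Grid-binning contact detection, O(N) for bodies of similar size.

    Cells are sized to the body diameter, so a body can only touch
    bodies mapped to its own cell or one of the 8 neighbouring cells (2D).
    """
    cells = defaultdict(list)
    for i, (x, y) in enumerate(centers):
        cells[(int(x // diameter), int(y // diameter))].append(i)

    pairs = set()
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for i in members:
                    for j in cells.get((cx + dx, cy + dy), ()):
                        if i < j:  # report each couple once
                            xi, yi = centers[i]
                            xj, yj = centers[j]
                            if (xi - xj) ** 2 + (yi - yj) ** 2 <= diameter ** 2:
                                pairs.add((i, j))
    return pairs

bodies = [(0.0, 0.0), (0.8, 0.0), (5.0, 5.0), (5.5, 5.4)]
print(detect_contacts(bodies, 1.0))  # contacts: (0, 1) and (2, 3)
```

The similar-size limitation noted in the abstract is visible here: cells must be at least as large as the biggest body, so a wide size range forces small bodies into oversized cells and degrades the per-cell candidate check.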

Journal ArticleDOI
Ann Oakley1
TL;DR: This paper examines the character of the debate about `quantitative' and `qualitative' methods in feminist social science and argues in favour of rehabilitating quantitative methods and integrating a range of methods in the task of creating an emancipatory social science.
Abstract: This paper examines the character of the debate about `quantitative' and `qualitative' methods in feminist social science. The `paradigm argument' has been central to feminist social science methodology; the feminist case against `malestream' methods and in favour of qualitative methods has paralleled other methodological arguments within social science against the unthinking adoption by social science of a natural science model of inquiry. The paper argues in favour of rehabilitating quantitative methods and integrating a range of methods in the task of creating an emancipatory social science. It draws on the history of social and natural science, suggesting that a social and historical understanding of ways of knowing gives us the problem not of gender and methodology, but of the gendering of methodology as itself a social construction.

Journal ArticleDOI
01 Dec 1998-Thorax
TL;DR: These clinical guidelines for smoking cessation interventions in England are written for the English health care system but may prove relevant and adaptable to other countries and health care systems.
Abstract: These guidelines have been written in parallel with guidance on the cost effectiveness of smoking cessation interventions, produced by the Centre for Health Economics at the University of York. The cost effectiveness guidance underpins these clinical guidelines and provides the economic justification for them. It is published as the second part of this Thorax supplement. These smoking cessation clinical guidelines are also published in a shorter version as a journal article ( BMJ 1999; 318 : in press). The clinical guidelines have been submitted to many professions for their official endorsement and support. This was not a passive process and their suggestions are reflected in this final version. The guidelines were commissioned by the Health Education Authority (HEA), which is responsible for health education in England. They are written for the English health care system but may prove relevant and adaptable to other countries and health care systems. Comments and questions about these guidelines can be addressed to Dr Ann McNeill at Health Education Authority, Trevelyan House, 30 Great Peter Street, London SW1P 2HW. #### Professional endorsement At the time of going to press the following organisations have endorsed these guidelines: Royal College of Physicians (London), Royal College of General Practitioners, British Medical Association, Royal College of Nursing, Royal College of Midwives, Community Practitioners’ and Health Visitors’ Association, British Thoracic Society, British Lung Foundation, National Asthma Campaign, National Primary Care Facilitators Programme, National Heart Forum, British Dental Association, British Dental Hygienists Association, National Pharmaceutical Association, Royal Pharmaceutical Society of Great Britain, Action on Smoking and Health, ASH Scotland, Quit, Association for Public Health, Imperial Cancer Research Fund, Cancer Research Campaign. 
#### Acknowledgements This project has depended especially on the goodwill and hard work of the peer reviewers and we would like to thank them for their contribution in reviewing the draft guidelines. We also thank Jacqueline …