scispace - formally typeset

Showing papers on "Context (language use) published in 2008"


Book
01 Jan 2008
TL;DR: In this book, the authors present strategies for qualitative data analysis, including context, process, and theoretical integration, and provide criteria for evaluating the analysis along with answers to common student questions.
Abstract: Introduction -- Practical considerations -- Prelude to analysis -- Strategies for qualitative data analysis -- Introduction to context, process and theoretical integration -- Memos and diagrams -- Theoretical sampling -- Analyzing data for concepts -- Elaborating the analysis -- Analyzing data for context -- Bringing process into the analysis -- Integrating around a concept -- Writing theses, monographs, and giving talks -- Criterion for evaluation -- Student questions and answers to these.

31,251 citations


01 Jan 2008
TL;DR: In this paper, a multivalued mapping from a space X to a space S carries a probability measure defined over subsets of X into a system of upper and lower probabilities over subsets of S. Some basic properties of such systems are explored in Sects. 1 and 2.
Abstract: A multivalued mapping from a space X to a space S carries a probability measure defined over subsets of X into a system of upper and lower probabilities over subsets of S. Some basic properties of such systems are explored in Sects. 1 and 2. Other approaches to upper and lower probabilities are possible and some of these are related to the present approach in Sect. 3. A distinctive feature of the present approach is a rule for conditioning, or more generally, a rule for combining sources of information, as discussed in Sects. 4 and 5. Finally, the context in statistical inference from which the present theory arose is sketched briefly in Sect. 6.
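On a finite toy example, the construction described in the abstract takes only a few lines: push a probability measure on X through a multivalued mapping into lower and upper probabilities on S. A minimal sketch; all names and numbers below are invented for illustration:

```python
# Hypothetical finite illustration of the multivalued-mapping construction:
# a probability measure on X induces upper/lower probabilities on S.
p_x = {"x1": 0.5, "x2": 0.3, "x3": 0.2}   # probability measure on X
gamma = {                                  # multivalued mapping X -> 2^S
    "x1": {"a"},
    "x2": {"a", "b"},
    "x3": {"b", "c"},
}

def lower_prob(t):
    """P_*(T): total mass of points whose image lies entirely inside T."""
    return sum(p for x, p in p_x.items() if gamma[x] <= t)

def upper_prob(t):
    """P^*(T): total mass of points whose image intersects T."""
    return sum(p for x, p in p_x.items() if gamma[x] & t)

t = {"a", "b"}
print(lower_prob(t))   # 0.8  (only x3's image {b, c} escapes {a, b})
print(upper_prob(t))   # 1.0  (every image meets {a, b})
```

The gap between the two values is exactly the mass whose image straddles the boundary of T, which is what distinguishes this system from a single additive probability.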

4,637 citations


01 Jul 2008
TL;DR: The Synthesis Report, the fourth element of the IPCC Fourth Assessment Report "Climate Change 2007", is based on the assessment carried out by the three Working Groups of the Intergovernmental Panel on Climate Change (IPCC).
Abstract: This Synthesis Report is the fourth element of the IPCC Fourth Assessment Report "Climate Change 2007". The Synthesis Report is based on the assessment carried out by the three Working Groups of the IPCC. It provides an integrated view of climate change and addresses the following topics: 1) Observed changes in climate and their effects; 2) Causes of change; 3) Climate change and its impacts in the near and long term under different scenarios; 4) Adaptation and mitigation options and responses, and the interrelationship with sustainable development, at global and regional levels; 5) The long-term perspective: scientific and socio-economic aspects relevant to adaptation and mitigation, consistent with the objectives and provisions of the Convention, and in the context of sustainable development; 6) Robust findings, key uncertainties.

2,132 citations


Book
01 Sep 2008
TL;DR: The authors describe a research paradigm shared by Indigenous scholars in Canada and Australia, demonstrate how this paradigm can be put into practice, and show how to make careful choices in the selection of topics, methods of data collection, forms of analysis and, finally, the way information is presented.
Abstract: Indigenous researchers are knowledge seekers who work to progress Indigenous ways of being, knowing and doing in a modern and constantly evolving context. This book describes a research paradigm shared by Indigenous scholars in Canada and Australia, and demonstrates how this paradigm can be put into practice. Relationships don't just shape Indigenous reality, they are our reality. Indigenous researchers develop relationships with ideas in order to achieve enlightenment in the ceremony that is Indigenous research. Indigenous research is the ceremony of maintaining accountability to these relationships. For researchers to be accountable to all our relations, we must make careful choices in our selection of topics, methods of data collection, forms of analysis and finally in the way we present information. I'm an Opaskwayak Cree from northern Manitoba currently living in the Northern Rivers area of New South Wales, Australia. I'm also a father of three boys, a researcher, son, uncle, teacher, world traveller, knowledge keeper and knowledge seeker. As an educated Indian, I've spent much of my life straddling the Indigenous and academic worlds. Most of my time these days is spent teaching other Indigenous knowledge seekers (and my kids) how to accomplish this balancing act while still keeping both feet on the ground.

2,035 citations


Journal ArticleDOI
TL;DR: In this paper, the authors review team research that has been conducted over the past 10 years and discuss the nature of work teams in context and note the substantive differences underlying different types of teams.

1,985 citations


Journal ArticleDOI
TL;DR: Genevestigator V3 is a novel meta-analysis system resulting from new algorithmic and software development using a client/server architecture, large-scale manual curation and quality control of microarray data for several organisms, and curation of pathway data for mouse and Arabidopsis.
Abstract: The Web-based software tool Genevestigator provides powerful tools for biologists to explore gene expression across a wide variety of biological contexts. Its first releases, however, were limited by the scaling ability of the system architecture, multiorganism data storage and analysis capability, and availability of computationally intensive analysis methods. Genevestigator V3 is a novel meta-analysis system resulting from new algorithmic and software development using a client/server architecture, large-scale manual curation and quality control of microarray data for several organisms, and curation of pathway data for mouse and Arabidopsis. In addition to improved querying features, Genevestigator V3 provides new tools to analyze the expression of genes in many different contexts, to identify biomarker genes, to cluster genes into expression modules, and to model expression responses in the context of metabolic and regulatory networks. Being a reference expression database with user-friendly tools, Genevestigator V3 facilitates discovery research and hypothesis validation.

1,859 citations


Journal ArticleDOI
TL;DR: This paper outlines the inconsistencies of existing metrics in the context of multi-object miss-distances for performance evaluation, and proposes a new mathematically and intuitively consistent metric that addresses the drawbacks of current multi-object performance evaluation metrics.
Abstract: The concept of a miss-distance, or error, between a reference quantity and its estimated/controlled value, plays a fundamental role in any filtering/control problem. Yet there is no satisfactory notion of a miss-distance in the well-established field of multi-object filtering. In this paper, we outline the inconsistencies of existing metrics in the context of multi-object miss-distances for performance evaluation. We then propose a new mathematically and intuitively consistent metric that addresses the drawbacks of current multi-object performance evaluation metrics.
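The miss-distance proposed in this line of work is commonly known as OSPA (optimal sub-pattern assignment). As a language-neutral sketch of the idea, not the authors' implementation: per-point distances are cut off at c, points of the smaller set are optimally assigned into the larger one, unassigned points incur the maximal penalty c, and the result is a p-th order per-element average. The brute-force assignment below is only adequate for small sets:

```python
from itertools import permutations
from math import dist  # Euclidean distance (Python 3.8+)

def ospa(xs, ys, c=10.0, p=2):
    """OSPA-style miss-distance between finite point sets xs and ys,
    with cut-off c and order p. Brute-force optimal assignment."""
    if len(xs) > len(ys):          # the metric is symmetric in its arguments
        xs, ys = ys, xs
    m, n = len(xs), len(ys)
    if n == 0:
        return 0.0                 # both sets empty
    best = 0.0
    if m:
        # optimal assignment of the smaller set into the larger one
        best = min(
            sum(min(dist(x, y), c) ** p for x, y in zip(xs, perm))
            for perm in permutations(ys, m)
        )
    # each of the (n - m) unassigned points costs the maximal penalty c
    return ((best + c ** p * (n - m)) / n) ** (1 / p)

print(ospa([(0, 0), (1, 0)], [(0, 0), (1, 0)]))          # 0.0
print(ospa([(0, 0)], [(0, 0), (1, 0)], c=10.0, p=2))     # sqrt(50) ≈ 7.071
```

Unlike ad hoc miss-distances, this construction penalises cardinality errors and localisation errors on the same scale, which is what makes it a genuine metric on finite sets.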

1,765 citations


Journal ArticleDOI
TL;DR: In this paper, the use of diamond impurity centres as magnetic field sensors is explored, promising a new approach to single-spin detection and magnetic-field imaging at the nanoscale.
Abstract: Impurity centres in diamond have recently attracted attention in the context of quantum information processing. Now their use as magnetic-field sensors is explored, promising a fresh approach to single-spin detection and magnetic-field imaging at the nanoscale.

1,691 citations


Journal ArticleDOI
03 Sep 2008-JAMA
TL;DR: In this study of adults with subjective memory impairment, a 6-month program of physical activity provided a modest improvement in cognition over an 18-month follow-up period.
Abstract: Context Many observational studies have shown that physical activity reduces the risk of cognitive decline; however, evidence from randomized trials is lacking. Objective To determine whether physical activity reduces the rate of cognitive decline among older adults at risk. Design and Setting Randomized controlled trial of a 24-week physical activity intervention conducted between 2004 and 2007 in metropolitan Perth, Western Australia. Assessors of cognitive function were blinded to group membership. Participants We recruited volunteers who reported memory problems but did not meet criteria for dementia. Three hundred eleven individuals aged 50 years or older were screened for eligibility, 89 were not eligible, and 52 refused to participate. A total of 170 participants were randomized and 138 participants completed the 18-month assessment. Intervention Participants were randomly allocated to an education and usual care group or to a 24-week home-based program of physical activity. Main Outcome Measure Change in Alzheimer Disease Assessment Scale–Cognitive Subscale (ADAS-Cog) scores (possible range, 0-70) over 18 months. Results In an intent-to-treat analysis, participants in the intervention group improved 0.26 points (95% confidence interval, −0.89 to 0.54) and those in the usual care group deteriorated 1.04 points (95% confidence interval, 0.32 to 1.82) on the ADAS-Cog at the end of the intervention. The absolute difference of the outcome measure between the intervention and control groups was −1.3 points (95% confidence interval,−2.38 to −0.22) at the end of the intervention. At 18 months, participants in the intervention group improved 0.73 points (95% confidence interval, −1.27 to 0.03) on the ADAS-Cog, and those in the usual care group improved 0.04 points (95% confidence interval, −0.46 to 0.88). 
Word list delayed recall and Clinical Dementia Rating sum of boxes improved modestly as well, whereas word list total immediate recall, digit symbol coding, verbal fluency, Beck depression score, and Medical Outcomes 36-Item Short-Form physical and mental component summaries did not change significantly. Conclusions In this study of adults with subjective memory impairment, a 6-month program of physical activity provided a modest improvement in cognition over an 18-month follow-up period. Trial Registration anzctr.org.au Identifier: ACTRN12605000136606

1,485 citations


Proceedings ArticleDOI
11 Feb 2008
TL;DR: It is shown that opinion spam is quite different from Web spam and email spam and thus requires different detection techniques; novel techniques to detect it are presented.
Abstract: Evaluative texts on the Web have become a valuable source of opinions on products, services, events, individuals, etc. Recently, many researchers have studied such opinion sources as product reviews, forum posts, and blogs. However, existing research has been focused on classification and summarization of opinions using natural language processing and data mining techniques. An important issue that has been neglected so far is opinion spam or trustworthiness of online opinions. In this paper, we study this issue in the context of product reviews, which are opinion rich and are widely used by consumers and product manufacturers. In the past two years, several startup companies also appeared which aggregate opinions from product reviews. It is thus high time to study spam in reviews. To the best of our knowledge, there is still no published study on this topic, although Web spam and email spam have been investigated extensively. We will see that opinion spam is quite different from Web spam and email spam, and thus requires different detection techniques. Based on the analysis of 5.8 million reviews and 2.14 million reviewers from amazon.com, we show that opinion spam in reviews is widespread. This paper analyzes such spam activities and presents some novel techniques to detect them.

1,385 citations


Journal ArticleDOI
13 Feb 2008-JAMA
TL;DR: Age- and sex-adjusted self-reported measures of mental health, physical functioning, work or school limitations, and social limitations among adults with spine problems were worse in 2005 than in 1997, and spine-related expenditures increased substantially from 1997 to 2005, without evidence of corresponding improvement in self-assessed health status.
Abstract: Context Back and neck problems are among the symptoms most commonly encountered in clinical practice. However, few studies have examined national trends in expenditures for back and neck problems or related these trends to health status measures. Objectives To estimate inpatient, outpatient, emergency department, and pharmacy expenditures related to back and neck problems in the United States from 1997 through 2005 and to examine associated trends in health status. Design and Setting Age- and sex-adjusted analysis of the nationally representative Medical Expenditure Panel Survey (MEPS) from 1997 to 2005 using complex survey regression methods. The MEPS is a household survey of medical expenditures weighted to represent national estimates. Respondents were US adults (> 17 years) who self-reported back and neck problems (referred to as “spine problems” based on MEPS descriptions and International Classification of Diseases, Ninth Revision, Clinical Modification definitions). Main Outcome Measures Spine-related expenditures for health services (inflation-adjusted); annual surveys of self-reported health status. Results National estimates were based on annual samples of survey respondents with and without self-reported spine problems from 1997 through 2005. A total of 23 045 respondents were sampled in 1997, including 3139 who reported spine problems. In 2005, the sample included 22 258 respondents, including 3187 who reported spine problems. In 1997, the mean age- and sex-adjusted medical costs for respondents with spine problems were $4695 (95% confidence interval [CI], $4181-$5209), compared with $2731 (95% CI, $2557-$2904) among those without spine problems (inflation-adjusted to 2005 dollars). In 2005, the mean age- and sex-adjusted medical expenditure among respondents with spine problems was $6096 (95% CI, $5670-$6522), compared with $3516 (95% CI, $3266-$3765) among those without spine problems. 
Total estimated expenditures among respondents with spine problems increased 65% (adjusted for inflation) from 1997 to 2005, more rapidly than overall health expenditures. The estimated proportion of persons with back or neck problems who self-reported physical functioning limitations increased from 20.7% (95% CI, 19.9%-21.4%) to 24.7% (95% CI, 23.7%-25.6%) from 1997 to 2005. Age- and sex-adjusted self-reported measures of mental health, physical functioning, work or school limitations, and social limitations among adults with spine problems were worse in 2005 than in 1997. Conclusions In this survey population, self-reported back and neck problems accounted for a large proportion of health care expenditures. These spine-related expenditures have increased substantially from 1997 to 2005, without evidence of corresponding improvement in self-assessed health status.

Journal ArticleDOI
06 Aug 2008-JAMA
TL;DR: This study provides the first direct estimates of HIV incidence in the United States using laboratory technologies previously implemented only in clinic-based settings, and indicates that HIV incidence increased in the mid-1990s, then declined slightly after 1999 and has been stable thereafter.
Abstract: Context Incidence of human immunodeficiency virus (HIV) in the United States has not been directly measured. New assays that differentiate recent vs long-standing HIV infections allow improved estimation of HIV incidence. Objective To estimate HIV incidence in the United States. Design, Setting, and Patients Remnant diagnostic serum specimens from patients 13 years or older and newly diagnosed with HIV during 2006 in 22 states were tested with the BED HIV-1 capture enzyme immunoassay to classify infections as recent or long-standing. Information on HIV cases was reported to the Centers for Disease Control and Prevention through June 2007. Incidence of HIV in the 22 states during 2006 was estimated using a statistical approach with adjustment for testing frequency and extrapolated to the United States. Results were corroborated with back-calculation of HIV incidence for 1977-2006 based on HIV diagnoses from 40 states and AIDS incidence from 50 states and the District of Columbia. Main Outcome Measure Estimated HIV incidence. Results An estimated 39 400 persons were diagnosed with HIV in 2006 in the 22 states. Of 6864 diagnostic specimens tested using the BED assay, 2133 (31%) were classified as recent infections. Based on extrapolations from these data, the estimated number of new infections for the United States in 2006 was 56 300 (95% confidence interval [CI], 48 200-64 500); the estimated incidence rate was 22.8 per 100 000 population (95% CI, 19.5-26.1). Forty-five percent of infections were among black individuals and 53% among men who have sex with men. The back-calculation (n = 1.230 million HIV/AIDS cases reported by the end of 2006) yielded an estimate of 55 400 (95% CI, 50 000-60 800) new infections per year for 2003-2006 and indicated that HIV incidence increased in the mid-1990s, then slightly declined after 1999 and has been stable thereafter. 
Conclusions This study provides the first direct estimates of HIV incidence in the United States using laboratory technologies previously implemented only in clinic-based settings. New HIV infections in the United States remain concentrated among men who have sex with men and among black individuals.

Journal ArticleDOI
TL;DR: The mathematical theory behind the simple random walk is introduced, its relation to Brownian motion and diffusive processes in general is explained, and it is shown how a reinforced random walk can be used to model movement where the individual changes its environment.
Abstract: Mathematical modelling of the movement of animals, micro-organisms and cells is of great relevance in the fields of biology, ecology and medicine. Movement models can take many different forms, but the most widely used are based on the extensions of simple random walk processes. In this review paper, our aim is twofold: to introduce the mathematics behind random walks in a straightforward manner and to explain how such models can be used to aid our understanding of biological processes. We introduce the mathematical theory behind the simple random walk and explain how this relates to Brownian motion and diffusive processes in general. We demonstrate how these simple models can be extended to include drift and waiting times or be used to calculate first passage times. We discuss biased random walks and show how hyperbolic models can be used to generate correlated random walks. We cover two main applications of the random walk model. Firstly, we review models and results relating to the movement, dispersal and population redistribution of animals and micro-organisms. This includes direct calculation of mean squared displacement, mean dispersal distance, tortuosity measures, as well as possible limitations of these model approaches. Secondly, oriented movement and chemotaxis models are reviewed. General hyperbolic models based on the linear transport equation are introduced and we show how a reinforced random walk can be used to model movement where the individual changes its environment. We discuss the applications of these models in the context of cell migration leading to blood vessel growth (angiogenesis). Finally, we discuss how the various random walk models and approaches are related and the connections that underpin many of the key processes involved.
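The diffusive scaling at the heart of the review, mean squared displacement growing linearly in time for an unbiased walk, is easy to check numerically. A minimal sketch; the function name and parameters are arbitrary choices for illustration:

```python
import random

random.seed(0)  # reproducibility

def msd(n_walkers=10000, n_steps=100):
    """Mean squared displacement of unbiased +/-1 random walkers after n_steps."""
    total = 0
    for _ in range(n_walkers):
        pos = 0
        for _ in range(n_steps):
            pos += random.choice((-1, 1))  # unbiased step
        total += pos * pos
    return total / n_walkers

# For the simple random walk, E[X_t^2] = t: displacement grows like sqrt(t),
# the diffusive baseline that the review's extensions (drift, waiting times,
# correlated and reinforced walks) build on.
print(msd(n_steps=100))  # ≈ 100
```

Adding a drift term or correlating successive steps breaks this scaling, which is exactly how the biased and correlated walks discussed in the review are diagnosed from data.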

Journal ArticleDOI
TL;DR: The ecotoxicological studies of both drugs imply that they do not easily cause acute toxic effects at their environmental concentrations; however, their chronic effects warrant cautious attention.

Journal ArticleDOI
TL;DR: In this paper, the authors formulate a version of the growth model in which production is carried out by heterogeneous establishments and calibrate it to US data, and argue that differences in the allocation of resources across establishments that differ in productivity may be an important factor in accounting for cross-country differences in output per capita.

Journal ArticleDOI
TL;DR: The evidence indicates that neither the degree to which fear declines nor the final fear level predicts therapeutic outcome; strategies for enhancing inhibitory learning, and its retrieval over time and context, are reviewed.

Journal ArticleDOI
TL;DR: This article points out that people have become increasingly detached from overarching institutions such as public schools, political parties, and civic groups, which at one time provided a shared context for receiving and interpreting messages.
Abstract: The great thinkers who influenced the contemporary field of political communication were preoccupied with understanding the political, social, psychological, and economic transformations in modern industrial society. But societies have changed so dramatically since the time of these landmark contributions that one must question the continuing relevance of paradigms drawn from them. To cite but a few examples, people have become increasingly detached from overarching institutions such as public schools, political parties, and civic groups, which at one time provided a shared context for receiving and interpreting messages. What are the implications of this detachment on how people respond to media messages? Information channels have proliferated and simultaneously become more individualized. Is it still relevant to conceive of "mass media" or has that concept been made obsolete by audience fragmentation and isolation from the public sphere? Does this new environment foreshadow a return to a time of minimal effects? If we are looking at a new minimal effects era, how can we distinguish it from the last such period? Retracing some of the intellectual origins of the field may help us identify the fundamental changes in society and communication technologies that are affecting the composition of audiences, the delivery of information, and the experience of politics itself. In particular, we are concerned with the growing disjuncture between the prevailing research strategies and the sociotechnological context of political communication, which may give rise to unproductive battles over findings (Donsbach, 2006). To the extent that research paradigms fail to reflect prevailing social and technological patterns, the validity of results will be in serious question. Consider just one case in point: the famous earlier era of "minimal effects" that emerged from studies done in the 1940s and early 1950s (Klapper, 1960). 
The underlying context for this scholarship consisted of a premass communication media system and relatively dense memberships in a group-based society networked through political parties, churches, unions, and service organizations (Putnam, 2000). At this time, scholars concluded that media messages were filtered through social reference processes as described in the two-step flow model proposed by Katz and Lazarsfeld (1955; Bennett & Manheim, 2006). Although the classic study by Lang

Journal ArticleDOI
TL;DR: The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems, and it provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations.
Abstract: The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.
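A concrete instance of the exponential decay described above is Cramér's theorem for the sample mean of fair coin flips, a textbook illustration rather than an example taken from this paper:

```latex
% X_1,\dots,X_n i.i.d. fair coin flips in \{0,1\}, S_n = X_1 + \dots + X_n.
% Large deviation principle for the sample mean:
P\!\left(\tfrac{S_n}{n} \approx x\right) \asymp e^{-n I(x)},
\qquad
I(x) = x\ln(2x) + (1-x)\ln\bigl(2(1-x)\bigr), \quad x \in [0,1].
% Example: I(3/4) = \tfrac{3}{4}\ln\tfrac{3}{2} + \tfrac{1}{4}\ln\tfrac{1}{2}
% \approx 0.1308, so a sample mean near 3/4 has probability decaying like
% e^{-0.13\,n}; the most probable value x = 1/2 has I(1/2) = 0.
```

The rate function I plays the role of an entropy: its minimizer is the most probable state, and deviations from it are exponentially suppressed in n, which is the bridge to statistical mechanics that the review develops.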

Journal ArticleDOI
TL;DR: In this paper, the authors use meta-analytic techniques to examine how knowledge, organization and network level antecedents differentially impact organizational knowledge transfer, and demonstrate how the intra-and inter-organizational context, the directionality of knowledge transfers, and measurement characteristics moderate the relationships studied.
Abstract: Research on organizational knowledge transfer is burgeoning, and yet our understanding of its antecedents and consequences remains rather unclear. Although conceptual and qualitative reviews of the organizational knowledge transfer literature have emerged, no study has attempted to summarize previous quantitative empirical findings. As a first step towards that goal, we use meta-analytic techniques to examine how knowledge, organization and network level antecedents differentially impact organizational knowledge transfer. Additionally, we consolidate research on the relationship between knowledge transfer and its consequences. We also demonstrate how the intra- and inter-organizational context, the directionality of knowledge transfers, and measurement characteristics moderate the relationships studied. By aggregating and consolidating existing research, our study not only reveals new insights into the levers and outcomes of organizational knowledge transfer, but also provides meaningful directions for future research.

Journal ArticleDOI
TL;DR: The role of several IRF family members in the regulation of the cell cycle and apoptosis has important implications for understanding susceptibility to and progression of several cancers.
Abstract: The interferon regulatory factor (IRF) family, consisting of nine members in mammals, was identified in the late 1980s in the context of research into the type I interferon system. Subsequent studies over the past two decades have revealed the versatile and critical functions performed by this transcription factor family. Indeed, many IRF members play central roles in the cellular differentiation of hematopoietic cells and in the regulation of gene expression in response to pathogen-derived danger signals. In particular, the advances made in understanding the immunobiology of Toll-like and other pattern-recognition receptors have recently generated new momentum for the study of IRFs. Moreover, the role of several IRF family members in the regulation of the cell cycle and apoptosis has important implications for understanding susceptibility to and progression of several cancers.

Journal ArticleDOI
TL;DR: ATL, a model transformation language and its execution environment based on the Eclipse framework, is presented; ATL tools provide support for the major tasks involved in using a language: editing, compiling, executing, and debugging.

Journal ArticleDOI
TL;DR: The demonstration of the involvement of placebo mechanisms in clinical trials and routine clinical practice has highlighted interesting considerations for clinical trial design and opened up opportunities for ethical enhancement of these mechanisms in clinical practice.
Abstract: Our understanding and conceptualization of the placebo effect has shifted in emphasis from a focus on the inert content of a physical placebo agent to the overall simulation of a therapeutic intervention. Research has identified many types of placebo responses driven by different mechanisms depending on the particular context wherein the placebo is given. Some placebo responses, such as analgesia, are initiated and maintained by expectations of symptom change and changes in motivation/emotions. Placebo factors have neurobiological underpinnings and actual effects on the brain and body. They are not just response biases. Other placebo responses result from less conscious processes, such as classical conditioning in the case of immune, hormonal, and respiratory functions. The demonstration of the involvement of placebo mechanisms in clinical trials and routine clinical practice has highlighted interesting considerations for clinical trial design and opened up opportunities for ethical enhancement of these mechanisms in clinical practice.

Journal ArticleDOI
TL;DR: Scaffolds with new levels of biofunctionality that attempt to recreate nanoscale topographical and biofactor cues from the extracellular environment are emerging as interesting candidate biomimetic materials.

Proceedings ArticleDOI
22 Apr 2008
TL;DR: TinyECC, a ready-to-use, publicly available software package for ECC-based PKC operations that can be flexibly configured and integrated into sensor network applications, is presented; the evaluation shows the impacts of individual optimizations on execution time and resource consumption.
Abstract: Public key cryptography (PKC) has been the enabling technology underlying many security services and protocols in traditional networks such as the Internet. In the context of wireless sensor networks, elliptic curve cryptography (ECC), one of the most efficient types of PKC, is being investigated to provide PKC support in sensor network applications so that the existing PKC-based solutions can be exploited. This paper presents the design, implementation, and evaluation of TinyECC, a configurable library for ECC operations in wireless sensor networks. The primary objective of TinyECC is to provide a ready-to-use, publicly available software package for ECC-based PKC operations that can be flexibly configured and integrated into sensor network applications. TinyECC provides a number of optimization switches, which can turn specific optimizations on or off based on developers' needs. Different combinations of the optimizations have different execution time and resource consumptions, giving developers great flexibility in integrating TinyECC into sensor network applications. This paper also reports the experimental evaluation of TinyECC on several common sensor platforms, including MICAz, Tmote Sky, and Imote2. The evaluation results show the impacts of individual optimizations on the execution time and resource consumptions, and give the most computationally efficient and the most storage efficient configuration of TinyECC.
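TinyECC itself is a C library for motes; as a language-neutral illustration of the group operations any ECC implementation must provide (point addition, doubling, and double-and-add scalar multiplication), here is a toy sketch over a deliberately tiny prime field. The curve parameters are hypothetical and far too small for real security:

```python
# Toy short-Weierstrass curve y^2 = x^3 + a*x + b over GF(p); parameters invented.
p, a, b = 97, 2, 3
O = None  # point at infinity (group identity)

def on_curve(pt):
    if pt is O:
        return True
    x, y = pt
    return (y * y - (x ** 3 + a * x + b)) % p == 0

def add(P, Q):
    """Group law: chord-and-tangent addition of two curve points."""
    if P is O:
        return Q
    if Q is O:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                   # P + (-P) = O (vertical line)
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) # tangent slope (doubling)
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p)        # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def mul(k, P):
    """Double-and-add scalar multiplication, the core ECC operation."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

# pick any affine point on the curve by brute force (fine at this size)
P = next((x, y) for x in range(p) for y in range(p) if on_curve((x, y)))
print(on_curve(mul(5, P)))  # True: scalar multiples stay on the curve
```

Real libraries such as TinyECC replace the modular inversions here with projective coordinates and add windowing and hybrid multiplication, which are exactly the optimization switches the paper evaluates.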

Journal ArticleDOI
TL;DR: The concept of inconsistency and the methods proposed to evaluate it are discussed, along with the methodological gaps that remain and the implications of inconsistency, network geometry, and asymmetry for planning future trials.
Abstract: Randomized trials may be designed and interpreted as single experiments or they may be seen in the context of other similar or relevant evidence. The amount and complexity of available randomized evidence vary for different topics. Systematic reviews may be useful in identifying gaps in the existing randomized evidence, pointing to discrepancies between trials, and planning future trials. A new, promising, but also very much debated extension of systematic reviews, mixed treatment comparison (MTC) meta-analysis, has become increasingly popular recently. MTC meta-analysis may have value in interpreting the available randomized evidence from networks of trials and can rank many different treatments, going beyond focusing on simple pairwise-comparisons. Nevertheless, the evaluation of networks also presents special challenges and caveats. In this article, we review the statistical methodology for MTC meta-analysis. We discuss the concept of inconsistency and methods that have been proposed to evaluate it as well as the methodological gaps that remain. We introduce the concepts of network geometry and asymmetry, and propose metrics for the evaluation of the asymmetry. Finally, we discuss the implications of inconsistency, network geometry and asymmetry in informing the planning of future trials.
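One building block of MTC meta-analysis, the indirect comparison through a common comparator, fits in a few lines. A Bucher-style sketch; every number below is invented for illustration:

```python
# Hypothetical effect estimates (e.g. log odds ratios) and standard errors
d_ab, se_ab = 0.30, 0.10   # A vs B, from direct trials (invented)
d_ac, se_ac = 0.50, 0.12   # A vs C, from direct trials (invented)

# Indirect estimate of B vs C via the common comparator A
d_bc_ind = d_ac - d_ab
se_bc_ind = (se_ab ** 2 + se_ac ** 2) ** 0.5   # variances add

# Inconsistency check against a direct B-vs-C estimate, if one exists
d_bc_dir, se_bc_dir = 0.15, 0.11               # invented direct estimate
diff = d_bc_dir - d_bc_ind
z = diff / (se_bc_ind ** 2 + se_bc_dir ** 2) ** 0.5
print(round(d_bc_ind, 3), round(se_bc_ind, 4), round(z, 3))
# a large |z| would flag inconsistency between direct and indirect evidence
```

This is the simplest case of the loop-inconsistency idea discussed in the article: in a full network, every closed loop of comparisons provides one such direct-versus-indirect contrast.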

Journal ArticleDOI
TL;DR: The ramifications of stress are explored in terms of the effects of acute versus long-term stressors on cardiac functioning, and numerous stress-management approaches are available that can decrease patients' suffering and enhance their quality of life.

Journal ArticleDOI
TL;DR: This review introduces concepts and background from the ceramics engineering literature regarding metastable zirconia ceramics to establish a context for understanding current and emerging zirconia-based dental ceramic technology.

Journal ArticleDOI
TL;DR: These results, based on the largest randomized trial ever conducted in the area of human lactation, provide strong evidence that prolonged and exclusive breastfeeding improves children's cognitive development.
Abstract: Context: The evidence that breastfeeding improves cognitive development is based almost entirely on observational studies and is thus prone to confounding by subtle differences in the breastfeeding mother’s behavior or her interaction with the infant. Objective: To assess whether prolonged and exclusive breastfeeding improves children’s cognitive ability at age 6.5 years.

Journal ArticleDOI
TL;DR: In this article, the authors propose a more coherent and specific definition of QoG: the impartiality of institutions that exercise government authority, which they relate to a series of criticisms stemming from the fields of public administration, public choice, multiculturalism, and feminism.
Abstract: The recent growth in research on "good governance" and the quality of government institutions has been propelled by empirical findings that show that such institutions may hold the key to understanding economic growth and social welfare in developing and transition countries. We argue, however, that a key issue has not been addressed, namely, what quality of government (QoG) actually means at the conceptual level. Based on analyses of political theory, we propose a more coherent and specific definition of QoG: the impartiality of institutions that exercise government authority. We relate the idea of impartiality to a series of criticisms stemming from the fields of public administration, public choice, multiculturalism, and feminism. To place the theory of impartiality in a larger context, we then contrast its scope and meaning with that of a threefold set of competing concepts of quality of government: democracy, the rule of law, and efficiency/effectiveness.

Journal ArticleDOI
TL;DR: A review of prospectively collected patient-reported outcomes data suggests that the minimum detectable change (MDC) is a statistically and clinically appropriate MCID value.