
Showing papers by "University of Kansas" published in 2011


Journal ArticleDOI
TL;DR: The first new effect size index described is a residual-based index that quantifies the amount of variance explained in both the mediator and the outcome, and the second new effect size index quantifies the indirect effect as the proportion of the maximum possible indirect effect that could have been obtained, given the scales of the variables involved.
Abstract: The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those effect sizes. Our goals in this article are to (a) outline some general desiderata for effect size measures, (b) describe current methods of expressing effect size and practical importance for mediation, (c) use the desiderata to evaluate these methods, and (d) develop new methods to communicate effect size in the context of mediation analysis. The first new effect size index we describe is a residual-based index that quantifies the amount of variance explained in both the mediator and the outcome. The second new effect size index quantifies the indirect effect as the proportion of the maximum possible indirect effect that could have been obtained, given the scales of the variables involved. We supplement our discussion by offering easy-to-use R tools for the numerical and visual communication of effect size for mediation effects.
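
As a rough illustration of the quantities involved (the authors provide dedicated R tools; this Python sketch uses simulated data and illustrative names, not the paper's exact indices), one can estimate the indirect effect a·b and the variance explained in the mediator and outcome models:

```python
# Minimal mediation sketch: fit M ~ X and Y ~ X + M by OLS, then report the
# indirect effect a*b and a crude variance-explained summary for each model.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                       # independent variable
m = 0.5 * x + rng.normal(size=n)             # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome

def ols(X, y):
    """Coefficients and R^2 for y ~ X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return beta, 1 - resid.var() / y.var()

bm, r2_m = ols(x, m)                          # path a = bm[1]
by, r2_y = ols(np.column_stack([x, m]), y)    # path b = by[2]
print(f"indirect effect a*b = {bm[1] * by[2]:.3f}")
print(f"R^2 mediator = {r2_m:.3f}, R^2 outcome = {r2_y:.3f}")
```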

2,359 citations


Journal ArticleDOI
TL;DR: The authors argue that the focus in mediation analysis should be shifted towards assessing the magnitude and significance of indirect effects, because the collective evidence raises considerable concern that focusing on the significance of the relationship between the independent and dependent variables is unjustified and can impair theory development and testing.
Abstract: A key aim of social psychology is to understand the psychological processes through which independent variables affect dependent variables in the social domain. This objective has given rise to statistical methods for mediation analysis. In mediation analysis, the significance of the relationship between the independent and dependent variables has been integral in theory testing, being used as a basis to determine (1) whether to proceed with analyses of mediation and (2) whether one or several proposed mediator(s) fully or partially accounts for an effect. Synthesizing past research and offering new arguments, we suggest that the collective evidence raises considerable concern that the focus on the significance between the independent and dependent variables, both before and after mediation tests, is unjustified and can impair theory development and testing. To expand theory involving social psychological processes, we argue that attention in mediation analysis should be shifted towards assessing the magnitude and significance of indirect effects.
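
A minimal sketch of the shift the authors advocate, on simulated data (a percentile bootstrap is one common way to assess an indirect effect directly; it is not necessarily the authors' preferred procedure):

```python
# Percentile-bootstrap confidence interval for the indirect effect a*b,
# assessed directly rather than via the total X -> Y relationship.
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
m = 0.4 * x + rng.normal(size=n)
y = 0.5 * m + rng.normal(size=n)

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                        # slope of M ~ X
    X1 = np.column_stack([np.ones(len(y)), x, m])
    b = np.linalg.lstsq(X1, y, rcond=None)[0][2]      # slope of M in Y ~ X + M
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)                  # resample cases
    boot.append(indirect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {indirect(x, m, y):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```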

1,983 citations


Journal ArticleDOI
TL;DR: This work introduces a novel architecture model that supports scalable, distributed suggestions from multiple independent nodes, and proposes a novel algorithm that generates a better recommender input, which accounts for a considerable accuracy improvement.
Abstract: The use of recommender systems is an emerging trend today, when user behavior information is abundant. There are many large datasets available for analysis because many businesses are interested in future user opinions. Sophisticated algorithms that predict such opinions can simplify decision-making, improve customer satisfaction, and increase sales. However, modern datasets contain millions of records, which represent only a small fraction of all possible data. Furthermore, much of the information in such sparse datasets may be considered irrelevant for making individual recommendations. As a result, there is a demand for a way to make personalized suggestions from large amounts of noisy data. Current recommender systems are usually all-in-one applications that provide one type of recommendation. Their inflexible architectures prevent detailed examination of recommendation accuracy and its causes. We introduce a novel architecture model that supports scalable, distributed suggestions from multiple independent nodes. Our model consists of two components, the input matrix generation algorithm and multiple platform-independent combination algorithms. A dedicated input generation component provides the necessary data for combination algorithms, reduces their size, and eliminates redundant data processing. Likewise, simple combination algorithms can produce recommendations from the same input, so we can more easily distinguish between the benefits of a particular combination algorithm and the quality of the data it receives. Such a flexible architecture is more conducive to a comprehensive examination of our system. We believe that a user's future opinion may be inferred from a small amount of data, provided that the data is the most relevant. We propose a novel algorithm that generates a better recommender input. Unlike existing approaches, our method sorts the relevant data twice. Doing this is slower, but the quality of the resulting input is considerably better. Furthermore, the modular nature of our approach may improve its performance, especially in the cloud computing context. We implement and validate our proposed model via mathematical modeling, by appealing to statistical theories, and through extensive experiments, data analysis, and empirical studies. Our empirical study examines the effectiveness of accuracy improvement techniques for collaborative filtering recommender systems. We evaluate our proposed architecture model on the Netflix dataset, a popular (over 130,000 solutions), large (over 100,000,000 records), and extremely sparse (1.1%) collection of movie ratings. The results show that combination algorithm tuning has little effect on recommendation accuracy. However, all algorithms produce better results when supplied with a more relevant input. Our input generation algorithm accounts for a considerable accuracy improvement.
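
The paper's algorithm itself is not reproduced here, but a hedged sketch can show the separation it argues for: an input-generation step that selects only the most relevant data, feeding any number of interchangeable combination algorithms. All names and the similarity measure below are illustrative assumptions.

```python
# Two-stage recommender sketch: (1) generate a compact, relevant input for a
# target user; (2) apply a pluggable combination algorithm to that input.
import numpy as np

def generate_input(R, user, k=20):
    """Pick the k users most similar to `user`; R holds ratings, 0 = missing."""
    sims = np.zeros(R.shape[0])
    for u in range(R.shape[0]):
        co = (R[u] > 0) & (R[user] > 0)            # co-rated items
        denom = np.linalg.norm(R[u][co]) * np.linalg.norm(R[user][co])
        sims[u] = R[u][co] @ R[user][co] / denom if denom else 0.0
    sims[user] = -1.0                              # exclude the target user
    neighbours = np.argsort(sims)[::-1][:k]        # sort by relevance
    return neighbours, sims[neighbours]

def combine_weighted_mean(R, item, neighbours, sims):
    """One simple combination algorithm: similarity-weighted mean rating."""
    rated = R[neighbours, item] > 0
    if not rated.any():
        return np.nan
    return sims[rated] @ R[neighbours[rated], item] / sims[rated].sum()
```

Because every combination algorithm consumes the same generated input, accuracy differences can be attributed to the algorithm rather than to the data it received, which is the point of the architecture described above.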

1,957 citations


Posted Content
TL;DR: In this paper, the authors use a well-developed dynamic panel GMM estimator to alleviate endogeneity concerns in two aspects of corporate governance research: the effect of board structure on firm performance and the determinants of board structure.
Abstract: We use a well-developed dynamic panel GMM estimator to alleviate endogeneity concerns in two aspects of corporate governance research: the effect of board structure on firm performance and the determinants of board structure. The estimator incorporates the dynamic nature of internal governance choices to provide valid and powerful instruments that address unobserved heterogeneity and simultaneity. We re-examine the relation between board structure and performance using the GMM estimator in a panel of 6,000 firms over the period 1991–2003, and find no causal relation between board structure and current firm performance. We illustrate why other commonly used estimators that ignore the dynamic relationship between current governance and past firm performance may be biased. We discuss where it may be appropriate to consider the dynamic panel GMM estimator in corporate governance research, as well as caveats to its use.
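
A hedged, stripped-down illustration of why dynamics matter here (an Anderson–Hsiao-style IV in first differences on simulated data, not the paper's full GMM estimator): differencing removes the firm fixed effect, and a deeper lag instruments the now-endogenous lagged outcome.

```python
# Simulate y_it = rho*y_i,t-1 + alpha_i + eps_it, then compare naive OLS in
# first differences with a simple IV that instruments delta y_{t-1} with the
# level y_{t-2}.
import numpy as np

rng = np.random.default_rng(2)
N, T, rho = 2000, 8, 0.5
alpha = rng.normal(size=N)                    # unobserved firm effects
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + alpha + rng.normal(size=N)

dy  = (y[:, 2:] - y[:, 1:-1]).ravel()         # delta y_t, for t = 2..T-1
dyl = (y[:, 1:-1] - y[:, :-2]).ravel()        # delta y_{t-1} (endogenous)
z   = y[:, :-2].ravel()                       # instrument: y_{t-2}

rho_iv  = (z @ dy) / (z @ dyl)                # IV: consistent
rho_ols = (dyl @ dy) / (dyl @ dyl)            # naive: biased downward
print(f"IV {rho_iv:.3f} vs naive OLS {rho_ols:.3f} (true rho = {rho})")
```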

1,723 citations


Book ChapterDOI
TL;DR: This chapter describes the requirements for the ROSETTA molecular modeling program's new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform.
Abstract: We have recently completed a full re-architecturing of the ROSETTA molecular modeling program, generalizing and expanding its existing functionality. The new architecture enables the rapid prototyping of novel protocols by providing easy-to-use interfaces to powerful tools for molecular modeling. The source code of this re-architecturing has been released as ROSETTA3 and is freely available for academic use. At the time of its release, it contained 470,000 lines of code. Counting currently unpublished protocols at the time of this writing, the source includes 1,285,000 lines. Its rapid growth is a testament to its ease of use. This chapter describes the requirements for our new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform.

1,676 citations


Journal ArticleDOI
TL;DR: This paper proposes a clear and user-friendly guideline for the translation, adaptation and validation of instruments or scales for cross-cultural health care research, a process that requires careful planning and the adoption of rigorous methodological approaches.
Abstract: Rationale, aims and objectives The diversity of the population worldwide suggests a great need for cross-culturally validated research instruments or scales. Researchers and clinicians must have access to reliable and valid measures of concepts of interest in their own cultures and languages to conduct cross-cultural research and/or provide quality patient care. Although there are well-established methodological approaches for translating, adapting and validating instruments or scales for use in cross-cultural health care research, a great variation in the use of these approaches continues to prevail in the health care literature. Therefore, the objectives of this scholarly paper were to review published recommendations of cross-cultural validation of instruments and scales, and to propose and present a clear and user-friendly guideline for the translation, adaptation and validation of instruments or scales for cross-cultural health care research. Methods A review of highly recommended methodological approaches to translation, adaptation and cross-cultural validation of research instruments or scales was performed. Recommendations were summarized and incorporated into a seven-step guideline. Each one of the steps was described and key points were highlighted. An example of a project using the proposed steps of the guideline was fully described. Conclusions Translation, adaptation and validation of instruments or scales for cross-cultural research is very time-consuming and requires careful planning and the adoption of rigorous methodological approaches to derive a reliable and valid measure of the concept of interest in the target population.

1,634 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed guidelines for reporting studies of interrater and intrarater reliability and agreement, proposing 15 issues that should be addressed when reporting such studies.

1,605 citations


Journal ArticleDOI
24 Mar 2011-Nature
TL;DR: 111,195 new elements are identified, including thousands of genes, coding and non-coding transcripts, exons, splicing and editing events, and inferred protein isoforms that previously eluded discovery using established experimental, prediction and conservation-based approaches.
Abstract: Drosophila melanogaster is one of the most well studied genetic model organisms; nonetheless, its genome still contains unannotated coding and non-coding genes, transcripts, exons and RNA editing sites. Full discovery and annotation are pre-requisites for understanding how the regulation of transcription, splicing and RNA editing directs the development of this complex organism. Here we used RNA-Seq, tiling microarrays and cDNA sequencing to explore the transcriptome in 30 distinct developmental stages. We identified 111,195 new elements, including thousands of genes, coding and non-coding transcripts, exons, splicing and editing events, and inferred protein isoforms that previously eluded discovery using established experimental, prediction and conservation-based approaches. These data substantially expand the number of known transcribed elements in the Drosophila genome and provide a high-resolution view of transcriptome dynamics throughout development.

1,427 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explored the conceptual and empirical reasons behind the choice of the extent of the study area in such analyses, offered practical but conceptually justified reasoning for such decisions, and asserted that the area that has been accessible to the species of interest over relevant time periods represents the ideal area for model development, testing, and comparison.

1,324 citations


Journal ArticleDOI
TL;DR: It is proposed that a patient's RA can be defined as being in remission based on one of two definitions: (1) when the tender joint count, swollen joint count, CRP level, and patient global assessment are all ≤1, or (2) when the score on the Simplified Disease Activity Index is ≤3.3.
Abstract: Objective Remission in rheumatoid arthritis (RA) is an increasingly attainable goal, but there is no widely used definition of remission that is stringent but achievable and could be applied uniformly as an outcome measure in clinical trials. This work was undertaken to develop such a definition. Methods A committee consisting of members of the American College of Rheumatology, the European League Against Rheumatism, and the Outcome Measures in Rheumatology Initiative met to guide the process and review prespecified analyses from RA clinical trials. The committee requested a stringent definition (little, if any, active disease) and decided to use core set measures including, as a minimum, joint counts and levels of an acute-phase reactant to define remission. Members were surveyed to select the level of each core set measure that would be consistent with remission. Candidate definitions of remission were tested, including those that constituted a number of individual measures of remission (Boolean approach) as well as definitions using disease activity indexes. To select a definition of remission, trial data were analysed to examine the added contribution of patient-reported outcomes and the ability of candidate measures to predict later good radiographic and functional outcomes. Results Survey results for the definition of remission suggested indexes at published thresholds and a count of core set measures, with each measure scored as 1 or less (eg, tender and swollen joint counts, C reactive protein (CRP) level, and global assessments on a 0–10 scale). Analyses suggested the need to include a patient-reported measure. Examination of 2-year follow-up data suggested that many candidate definitions performed comparably in terms of predicting later good radiographic and functional outcomes, although 28-joint Disease Activity Score–based measures of remission did not
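
The Boolean definition quoted above is mechanical enough to state as code; a sketch for illustration only, not clinical software (scales per the abstract: joint counts, CRP in mg/dl, patient global assessment on a 0–10 scale):

```python
def in_remission_boolean(tjc, sjc, crp_mg_dl, pt_global):
    """Boolean remission: all four core set measures at most 1."""
    return tjc <= 1 and sjc <= 1 and crp_mg_dl <= 1 and pt_global <= 1

print(in_remission_boolean(1, 0, 0.6, 1))   # True
print(in_remission_boolean(3, 0, 0.6, 1))   # False: tender joint count > 1
```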

1,273 citations


Journal ArticleDOI
TL;DR: Green tea catechin has great potential in cancer prevention because of its safety, low cost and bioavailability, and its mechanism of action at numerous points regulating cancer cell growth, survival, angiogenesis and metastasis.

Journal ArticleDOI
30 Jun 2011-Nature
TL;DR: These findings show that macrophages are defended from HIV-1 infection by a mechanism that prevents an unwanted interferon response triggered by self nucleic acids, and uncover an intricate relationship between innate immune mechanisms that control response to self and to retroviral pathogens.
Abstract: Macrophages and dendritic cells have key roles in viral infections, providing virus reservoirs that frequently resist antiviral therapies and linking innate virus detection to antiviral adaptive immune responses. Human immunodeficiency virus 1 (HIV-1) fails to transduce dendritic cells and has a reduced ability to transduce macrophages, due to an as yet uncharacterized mechanism that inhibits infection by interfering with efficient synthesis of viral complementary DNA. In contrast, HIV-2 and related simian immunodeficiency viruses (SIVsm/mac) transduce myeloid cells efficiently owing to their virion-associated Vpx accessory proteins, which counteract the restrictive mechanism. Here we show that the inhibition of HIV-1 infection in macrophages involves the cellular SAM domain HD domain-containing protein 1 (SAMHD1). Vpx relieves the inhibition of lentivirus infection in macrophages by loading SAMHD1 onto the CRL4(DCAF1) E3 ubiquitin ligase, leading to highly efficient proteasome-dependent degradation of the protein. Mutations in SAMHD1 cause Aicardi-Goutieres syndrome, a disease that produces a phenotype that mimics the effects of a congenital viral infection. Failure to dispose of endogenous nucleic acid debris in Aicardi-Goutieres syndrome results in inappropriate triggering of innate immune responses via cytosolic nucleic acid sensors. Thus, our findings show that macrophages are defended from HIV-1 infection by a mechanism that prevents an unwanted interferon response triggered by self nucleic acids, and uncover an intricate relationship between innate immune mechanisms that control response to self and to retroviral pathogens.

Journal ArticleDOI
TL;DR: In this paper, a harmonized set of land-use scenarios is presented that smoothly connects historical reconstructions of land use with future projections, in the format required by ESMs, in preparation for the fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC).
Abstract: In preparation for the fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), the international community is developing new advanced Earth System Models (ESMs) to assess the combined effects of human activities (e.g. land use and fossil fuel emissions) on the carbon-climate system. In addition, four Representative Concentration Pathway (RCP) scenarios of the future (2005–2100) are being provided by four Integrated Assessment Model (IAM) teams to be used as input to the ESMs for future carbon-climate projections (Moss et al. 2010). The diversity of approaches and requirements among IAMs and ESMs for tracking land-use change, along with the dependence of model projections on land-use history, presents a challenge for effectively passing data between these communities and for smoothly transitioning from the historical estimates to future projections. Here, a harmonized set of land-use scenarios is presented that smoothly connects historical reconstructions of land use with future projections, in the format required by ESMs. The land-use harmonization strategy estimates fractional land-use patterns and underlying land-use transitions annually for the time period 1500–2100 at 0.5° × 0.5° resolution. Inputs include new gridded historical maps of crop and pasture data from HYDE 3.1 for 1500–2005, updated estimates of historical national wood harvest and of shifting cultivation, and future information on crop, pasture, and wood harvest from the IAM implementations of the RCPs for the period 2005–2100. The computational method integrates these multiple data sources, while minimizing differences at the transition between the historical reconstruction ending conditions and IAM initial conditions, and working to preserve the future changes depicted by the IAMs at the grid cell level. This study for the first time harmonizes land-use history data together with future scenario information from multiple IAMs into a single consistent, spatially gridded, set of land-use change scenarios for studies of human impacts on the past, present, and future Earth system.
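
A toy sketch of the harmonization idea (not the paper's method): keep the year-to-year changes from an IAM scenario but re-anchor its starting state to the end of the historical reconstruction, so the join at the transition is smooth while the IAM's future changes are preserved at the grid-cell level. Array shapes and names are illustrative assumptions.

```python
import numpy as np

def harmonize(hist_end, future):
    """hist_end: (lat, lon) land-use fractions at the reconstruction's end.
    future: (years, lat, lon) IAM fractions, first year overlapping hist_end."""
    out = np.empty_like(future)
    out[0] = hist_end                          # smooth join at the transition
    for t, delta in enumerate(np.diff(future, axis=0)):
        out[t + 1] = np.clip(out[t] + delta, 0.0, 1.0)  # preserve IAM changes
    return out
```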

Journal ArticleDOI
TL;DR: The case for the use of HTS as part of a proven scientific tool kit, the wider use of which is essential for the discovery of new chemotypes is presented.
Abstract: High-throughput screening (HTS) has been postulated in several quarters to be a contributory factor to the decline in productivity in the pharmaceutical industry. Moreover, it has been blamed for stifling the creativity that drug discovery demands. In this article, we aim to dispel these myths and present the case for the use of HTS as part of a proven scientific tool kit, the wider use of which is essential for the discovery of new chemotypes.

Journal ArticleDOI
TL;DR: The Update Committee noted the importance of continued symptom monitoring throughout therapy; clinicians underestimate the incidence of nausea, which is not as well controlled as emesis.
Abstract: Purpose To update the American Society of Clinical Oncology (ASCO) guideline for antiemetics in oncology. Methods A systematic review of the medical literature was completed to inform this update. MEDLINE, the Cochrane Collaboration Library, and meeting materials from ASCO and the Multinational Association for Supportive Care in Cancer were all searched. Primary outcomes of interest were complete response and rates of any vomiting or nausea. Results Thirty-seven trials met prespecified inclusion and exclusion criteria for this systematic review. Two systematic reviews from the Cochrane Collaboration were identified; one surveyed the pediatric literature. The other compared the relative efficacy of the 5-hydroxytryptamine-3 (5-HT3) receptor antagonists. Recommendations

Journal ArticleDOI
TL;DR: Addressing pain as a global public health issue will mean that health care providers and public health professionals will have a more comprehensive understanding of pain and the appropriate public health and social policy responses to this problem.
Abstract: Background Pain is an enormous problem globally. Estimates suggest that 20% of adults suffer from pain globally and 10% are newly diagnosed with chronic pain each year. Nevertheless, the problem of pain has primarily been regarded as a medical problem, and has been little addressed by the field of public health.

Journal ArticleDOI
TL;DR: Transition metal catalyzed decarboxylative allylations, benzylations, and interceptive allylations are reviewed.
Abstract: Transition metal catalyzed decarboxylative allylations, benzylations, and interceptive allylations are reviewed.


Journal ArticleDOI
TL;DR: Recommendations are provided to patients, physicians, and other health care providers on several issues involving deep brain stimulation (DBS) for Parkinson disease (PD) to be considered in a select group of appropriate patients.
Abstract: Objective: To provide recommendations to patients, physicians, and other health care providers on several issues involving deep brain stimulation (DBS) for Parkinson disease (PD). Data Sources and Study Selection: An international consortium of experts organized, reviewed the literature, and attended the workshop. Topics were introduced at the workshop, followed by group discussion. Data Extraction and Synthesis: A draft of a consensus statement was presented and further edited after plenary debate. The final statements were agreed on by all members. Conclusions: (1) Patients with PD without significant active cognitive or psychiatric problems who have medically intractable motor fluctuations, intractable tremor, or intolerance of medication adverse effects are good candidates for DBS. (2) Deep brain stimulation surgery is best performed by an experienced neurosurgeon with expertise in stereotactic neurosurgery who is working as part of an interprofessional team. (3) Surgical complication rates are extremely variable, with infection being the most commonly reported complication of DBS. (4) Deep brain stimulation programming is best accomplished by a highly trained clinician and can take 3 to 6 months to obtain optimal results. (5) Deep brain stimulation improves levodopa-responsive symptoms, dyskinesia, and tremor; benefits seem to be long-lasting in many motor domains. (6) Subthalamic nuclei DBS may be complicated by increased depression, apathy, impulsivity, worsened verbal fluency, and executive dysfunction in a subset of patients. (7) Both globus pallidus pars interna and subthalamic nuclei DBS have been shown to be effective in addressing the motor symptoms of PD. (8) Ablative therapy is still an effective alternative and should be considered in a select group of appropriate patients. Arch Neurol. 2011; 68(2):165-171. Published online October 11, 2010. doi:10.1001/archneurol.2010.260

Journal ArticleDOI
TL;DR: Exemestane significantly reduced invasive breast cancers in postmenopausal women who were at moderately increased risk for breast cancer and was associated with no serious toxic effects and only minimal changes in health-related quality of life.
Abstract: Background Tamoxifen and raloxifene have limited patient acceptance for primary prevention of breast cancer. Aromatase inhibitors prevent more contralateral breast cancers and cause fewer side effects than tamoxifen in patients with early-stage breast cancer. Methods In a randomized, placebo-controlled, double-blind trial of exemestane designed to detect a 65% relative reduction in invasive breast cancer, eligible postmenopausal women 35 years of age or older had at least one of the following risk factors: 60 years of age or older; Gail 5-year risk score greater than 1.66% (chances in 100 of invasive breast cancer developing within 5 years); prior atypical ductal or lobular hyperplasia or lobular carcinoma in situ; or ductal carcinoma in situ with mastectomy. Toxic effects and health-related and menopause-specific qualities of life were measured. Results A total of 4560 women for whom the median age was 62.5 years and the median Gail risk score was 2.3% were randomly assigned to either exemestane or placebo. At a median follow-up of 35 months, 11 invasive breast cancers were detected in those given exemestane and 32 in those given placebo, with a 65% relative reduction in the annual incidence of invasive breast cancer (0.19% vs. 0.55%; hazard ratio, 0.35; 95% confidence interval [CI], 0.18 to 0.70; P = 0.002). The annual incidence of invasive plus noninvasive (ductal carcinoma in situ) breast cancers was 0.35% on exemestane and 0.77% on placebo (hazard ratio, 0.47; 95% CI, 0.27 to 0.79; P = 0.004). Adverse events occurred in 88% of the exemestane group and 85% of the placebo group (P = 0.003), with no significant differences between the two groups in terms of skeletal fractures, cardiovascular events, other cancers, or treatment-related deaths. Minimal quality-of-life differences were observed. Conclusions Exemestane significantly reduced invasive breast cancers in postmenopausal women who were at moderately increased risk for breast cancer. During a median follow-up period of 3 years, exemestane was associated with no serious toxic effects and only minimal changes in health-related quality of life. (Funded by Pfizer and others; NCIC CTG MAP.3 ClinicalTrials.gov number, NCT00083174.)
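
The headline numbers are internally consistent, as a quick check shows (figures taken directly from the abstract):

```python
# The hazard ratio roughly equals the ratio of annual incidence rates, and
# the quoted 65% relative reduction is 1 - HR.
exemestane, placebo = 0.19, 0.55        # annual incidence, percent
print(f"incidence ratio = {exemestane / placebo:.2f}  (reported HR 0.35)")
print(f"relative reduction = {1 - 0.35:.0%}")  # the reported 65%
```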

Journal ArticleDOI
19 Aug 2011-Science
TL;DR: It is found that Asian applicants are 4 percentage points and black or African-American applicants are 13 percentage points less likely than whites to receive NIH investigator-initiated research funding; after controlling for the applicant’s educational background, country of origin, training, previous research awards, publication record, and employer characteristics, black applicants remain 10 percentage points less likely than whites to be awarded funding.
Abstract: We investigated the association between a U.S. National Institutes of Health (NIH) R01 applicant’s self-identified race or ethnicity and the probability of receiving an award by using data from the NIH IMPAC II grant database, the Thomson Reuters Web of Science, and other sources. Although proposals with strong priority scores were equally likely to be funded regardless of race, we find that Asians are 4 percentage points and black or African-American applicants are 13 percentage points less likely to receive NIH investigator-initiated research funding compared with whites. After controlling for the applicant’s educational background, country of origin, training, previous research awards, publication record, and employer characteristics, we find that black applicants remain 10 percentage points less likely than whites to be awarded NIH research funding. Our results suggest some leverage points for policy intervention.
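
A hedged sketch of this kind of award-probability analysis (the authors' exact model specification is not reproduced; data, names, and coefficients below are simulated stand-ins):

```python
# Regress funding outcome on a race indicator plus controls and read off the
# adjusted gap from the indicator's coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
black = rng.binomial(1, 0.1, n)              # illustrative indicator
controls = rng.normal(size=(n, 3))           # stand-ins for training, record...
p = 1 / (1 + np.exp(1.0 + 0.5 * black - controls @ np.array([0.3, 0.2, 0.1])))
awarded = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([black, controls]))
fit = sm.Logit(awarded, X).fit(disp=False)
print(fit.params[1])                         # race gap after controls
```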

Journal ArticleDOI
S. Chatrchyan, Vardan Khachatryan, Albert M. Sirunyan, A. Tumasyan, +2268 more (158 institutions)
TL;DR: In this article, the transverse momentum balance in dijet and γ/Z+jets events is used to measure the jet energy response in the CMS detector, as well as the transverse momentum resolution.
Abstract: Measurements of the jet energy calibration and transverse momentum resolution in CMS are presented, performed with a data sample collected in proton-proton collisions at a centre-of-mass energy of 7 TeV, corresponding to an integrated luminosity of 36 pb−1. The transverse momentum balance in dijet and γ/Z+jets events is used to measure the jet energy response in the CMS detector, as well as the transverse momentum resolution. The results are presented for three different methods to reconstruct jets: a calorimeter-based approach, the "Jet-Plus-Track" approach, which improves the measurement of calorimeter jets by exploiting the associated tracks, and the "Particle Flow" approach, which attempts to reconstruct individually each particle in the event, prior to the jet clustering, based on information from all relevant subdetectors.
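
A hedged sketch of the dijet pT-balance idea referenced above (smearing and numbers are illustrative, not CMS values): for two jets with the same true pT, the width of the asymmetry A = (pT1 − pT2)/(pT1 + pT2) determines the single-jet resolution via σ(pT)/pT ≈ √2·σ_A.

```python
import numpy as np

rng = np.random.default_rng(4)
true_pt, res = 100.0, 0.10                    # GeV; 10% single-jet resolution
pt1 = true_pt * (1 + res * rng.normal(size=100_000))
pt2 = true_pt * (1 + res * rng.normal(size=100_000))
A = (pt1 - pt2) / (pt1 + pt2)                 # dijet asymmetry per event
print(f"recovered resolution = {np.sqrt(2) * A.std():.3f} (input {res})")
```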

Journal ArticleDOI
TL;DR: This work highlights that, in the case of invasive species, distributional predictions should aim to derive the best hypothesis of the potential distribution of the species by using all distributional information available, including information from both the native range and other invaded regions.
Abstract: Risk maps summarizing landscape suitability of novel areas for invading species can be valuable tools for preventing species’ invasions or controlling their spread, but methods employed for development of such maps remain variable and unstandardized. We discuss several considerations in development of such models, including types of distributional information that should be used, the nature of explanatory variables that should be incorporated, and caveats regarding model testing and evaluation. We highlight that, in the case of invasive species, such distributional predictions should aim to derive the best hypothesis of the potential distribution of the species by using (1) all distributional information available, including information from both the native range and other invaded regions; (2) predictors linked as directly as is feasible to the physiological requirements of the species; and (3) modelling procedures that carefully avoid overfitting to the training data. Finally, model testing and evaluation should focus on well-predicted presences, and less on efficient prediction of absences; a k-fold regional cross-validation test is discussed.
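
A hedged sketch of the k-fold regional cross-validation mentioned above (classifier, names, and data are illustrative; real models would use occurrence records and environmental layers): splitting by region rather than at random tests transfer to unseen areas, and scoring on held-out presences emphasizes well-predicted presences.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(5)
n = 1200
X = rng.normal(size=(n, 4))                        # environmental predictors
y = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))    # presence/absence
region = rng.integers(0, 5, size=n)                # geographic region label

for train, test in GroupKFold(n_splits=5).split(X, y, groups=region):
    model = LogisticRegression().fit(X[train], y[train])
    pres = test[y[test] == 1]                      # held-out presences only
    print(f"sensitivity on held-out region: {model.score(X[pres], y[pres]):.2f}")
```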

Journal ArticleDOI
TL;DR: In this paper, the author reviews the evolutionary conservatism of coarse-resolution Grinnellian (or scenopoetic) ecological niches, finding that niche characteristics are highly conserved over short-to-moderate time spans and that apparent niche differentiation often reflects over-interpreted niche models rather than real change.
Abstract: Aim To evaluate the evolutionary conservatism of coarse-resolution Grinnellian (or scenopoetic) ecological niches. Location Global. Methods I review a broad swathe of literature relevant to the topic of niche conservatism or differentiation, and illustrate some of the resulting insights with exemplar analyses. Results Ecological niche characteristics are highly conserved over short-to-moderate time spans (i.e. from individual life spans up to tens or hundreds of thousands of years); little or no ecological niche differentiation is discernible as part of the processes of invasion or speciation. Main conclusions Although niche conservatism is widespread, many methodological complications obscure this point. In particular, niche models are frequently over-interpreted: too often, they are based on limited occurrence data in high-dimensional environmental spaces, and cannot be interpreted robustly to indicate niche differentiation.

Journal ArticleDOI
TL;DR: In this article, the authors find that the MSEM method outperforms two MLM-based techniques in 2-level models in terms of bias and confidence interval coverage while displaying adequate efficiency, convergence rates, and power under a variety of conditions.
Abstract: Multilevel modeling (MLM) is a popular way of assessing mediation effects with clustered data. Two important limitations of this approach have been identified in prior research and a theoretical rationale has been provided for why multilevel structural equation modeling (MSEM) should be preferred. However, to date, no empirical evidence of MSEM's advantages relative to MLM approaches for multilevel mediation analysis has been provided. Nor has it been demonstrated that MSEM performs adequately for mediation analysis in an absolute sense. This study addresses these gaps and finds that the MSEM method outperforms 2 MLM-based techniques in 2-level models in terms of bias and confidence interval coverage while displaying adequate efficiency, convergence rates, and power under a variety of conditions. Simulation results support prior theoretical work regarding the advantages of MSEM over MLM for mediation in clustered data.

Journal ArticleDOI
TL;DR: In this article, the authors studied the effect of collision centrality on the transverse momentum of PbPb collisions at the LHC with a data sample of 6.7 inverse microbarns.
Abstract: Jet production in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV was studied with the CMS detector at the LHC, using a data sample corresponding to an integrated luminosity of 6.7 inverse microbarns. Jets are reconstructed using the energy deposited in the CMS calorimeters and studied as a function of collision centrality. With increasing collision centrality, a striking imbalance in dijet transverse momentum is observed, consistent with jet quenching. The observed effect extends from the lower cut-off used in this study (jet transverse momentum = 120 GeV/c) up to the statistical limit of the available data sample (jet transverse momentum approximately 210 GeV/c). Correlations of charged particle tracks with jets indicate that the momentum imbalance is accompanied by a softening of the fragmentation pattern of the second most energetic, away-side jet. The dijet momentum balance is recovered when integrating low transverse momentum particles distributed over a wide angular range relative to the direction of the away-side jet.
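
The observable at the centre of this analysis is simple to state; a hedged sketch (inputs are illustrative placeholders, not CMS data):

```python
import numpy as np

def mean_dijet_asymmetry(pt_lead, pt_sub, centrality, edges=(0, 10, 30, 50, 100)):
    """Mean A_J = (pT1 - pT2)/(pT1 + pT2) per centrality bin (0% = most central)."""
    aj = (pt_lead - pt_sub) / (pt_lead + pt_sub)
    bin_idx = np.digitize(centrality, edges) - 1     # bin index per event
    return [aj[bin_idx == b].mean() if np.any(bin_idx == b) else np.nan
            for b in range(len(edges) - 1)]
```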

Book
06 Sep 2011
TL;DR: In this article, the authors address the causes, consequences, and implications of the cross-border consolidation of financial institutions by reviewing several hundred studies, providing comparative international data, and estimating cross-border banking efficiency in France, Germany, Spain, the U.K., and the U.S. during the 1990s.
Abstract: We address the causes, consequences, and implications of the cross-border consolidation of financial institutions by reviewing several hundred studies, providing comparative international data, and estimating cross-border banking efficiency in France, Germany, Spain, the U.K., and the U.S. during the 1990s. We find that, on average, domestic banks have higher profit efficiency than foreign banks. However, banks from at least one country (the U.S.) appear to operate with relatively high efficiency both at home and abroad. If these results continue to hold, they do not preclude successful international expansion by some financial firms, but they do suggest limits to global consolidation.

Journal ArticleDOI
TL;DR: The results indicate that the KiVa program is effective in reducing school bullying and victimization in Grades 4-6 and suggest that well-conceived school-based programs can reduce victimization.
Abstract: This study demonstrates the effectiveness of the KiVa antibullying program using a large sample of 8,237 youth from Grades 4-6 (10-12 years). Altogether, 78 schools were randomly assigned to intervention (39 schools, 4,207 students) and control conditions (39 schools, 4,030 students). Multilevel regression analyses revealed that after 9 months of implementation, the intervention had consistent beneficial effects on 7 of the 11 dependent variables, including self- and peer-reported victimization and self-reported bullying. The results indicate that the KiVa program is effective in reducing school bullying and victimization in Grades 4-6. Despite some evidence against school-based interventions, the results suggest that well-conceived school-based programs can reduce victimization.
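
A hedged sketch of the kind of multilevel regression reported above (simulated data; column names, effect sizes, and the outcome are illustrative, not the study's variables): students nest within schools, the intervention is assigned at the school level, and a random intercept absorbs school-level variation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
school = np.repeat(np.arange(78), 100)               # 78 schools, 100 students
treated = (school < 39).astype(float)                # half get the intervention
school_fx = rng.normal(0, 0.3, 78)[school]           # school random intercepts
victimization = 1.0 - 0.2 * treated + school_fx + rng.normal(0, 1, school.size)
df = pd.DataFrame({"victimization": victimization,
                   "treated": treated, "school": school})

fit = smf.mixedlm("victimization ~ treated", df, groups="school").fit()
print(fit.params["treated"])                         # estimated treatment effect
```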

Journal ArticleDOI
17 Nov 2011-Nature
TL;DR: It is shown that climate has been a major driver of population change over the past 50,000 years; however, each species responds differently to the effects of climatic shifts, habitat redistribution and human encroachment.
Abstract: Despite decades of research, the roles of climate and humans in driving the dramatic extinctions of large-bodied mammals during the Late Quaternary period remain contentious. Here we use ancient DNA, species distribution models and the human fossil record to elucidate how climate and humans shaped the demographic history of woolly rhinoceros, woolly mammoth, wild horse, reindeer, bison and musk ox. We show that climate has been a major driver of population change over the past 50,000 years. However, each species responds differently to the effects of climatic shifts, habitat redistribution and human encroachment. Although climate change alone can explain the extinction of some species, such as Eurasian musk ox and woolly rhinoceros, a combination of climatic and anthropogenic effects appears to be responsible for the extinction of others, including Eurasian steppe bison and wild horse. We find no genetic signature or any distinctive range dynamics distinguishing extinct from surviving species, emphasizing the challenges associated with predicting future responses of extant mammals to climate and human-mediated habitat change.