
Showing papers by "University of Virginia" published in 2013


Journal ArticleDOI
TL;DR: A range of new simulation algorithms and features developed during the past 4 years are presented, leading up to the GROMACS 4.5 software package, which provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.
Abstract: Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is open source and free software, available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se. Supplementary information: Supplementary data are available at Bioinformatics online.

6,029 citations
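The high-throughput use case described above is, in practice, a loop over prepared systems. A minimal sketch of such a pipeline, where the mutants/ directory, md.mdp, and topol.top are hypothetical placeholders; the command names and flags follow GROMACS 4.5-era conventions (standalone grompp/mdrun binaries, -nt for thread count) and should be checked against the installed version:

```python
import subprocess
from pathlib import Path

# Prepare and run one short simulation per mutant structure (hypothetical
# filenames; GROMACS 4.5-era command names and flags assumed).
for conf in Path("mutants").glob("*.gro"):
    tpr = conf.with_suffix(".tpr")
    subprocess.run(["grompp", "-f", "md.mdp", "-c", str(conf),
                    "-p", "topol.top", "-o", str(tpr)], check=True)
    # -nt: thread count, using the multithreading support added in 4.5
    subprocess.run(["mdrun", "-deffnm", str(tpr.with_suffix("")), "-nt", "4"],
                   check=True)
```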


Journal ArticleDOI
TL;DR: It is shown that the average statistical power of studies in the neurosciences is very low, and the consequences include overestimates of effect size and low reproducibility of results.
Abstract: A study with low statistical power has a reduced chance of detecting a true effect, but it is less well appreciated that low power also reduces the likelihood that a statistically significant result reflects a true effect. Here, we show that the average statistical power of studies in the neurosciences is very low. The consequences of this include overestimates of effect size and low reproducibility of results. There are also ethical dimensions to this problem, as unreliable research is inefficient and wasteful. Improving reproducibility in neuroscience is a key priority and requires attention to well-established but often ignored methodological principles.

5,683 citations
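The paper's less-appreciated point, that low power also lowers the probability a significant result is true, follows from a short Bayesian identity: PPV = (power × R) / (power × R + α), where R is the pre-study odds that a probed effect is real. A minimal sketch, with the prior-odds value assumed here for illustration:

```python
# Positive predictive value (PPV) of a statistically significant finding:
# PPV = power * R / (power * R + alpha), with R the pre-study odds that the
# probed effect is real (the 0.25 used here is illustrative, not from the paper).

def ppv(power: float, alpha: float = 0.05, prior_odds: float = 0.25) -> float:
    """Probability that a significant result reflects a true effect."""
    return (power * prior_odds) / (power * prior_odds + alpha)

for power in (0.8, 0.5, 0.2):   # 0.2 is in the low range the paper reports
    print(f"power={power:.1f}  PPV={ppv(power):.2f}")
```

At 80% power the PPV under these assumptions is 0.80, but at 20% power it falls to 0.50: half of all "significant" findings would be false positives.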


Journal ArticleDOI
26 Apr 2013-Science
TL;DR: Pulsar J0348+0432 is only the second neutron star with a precisely determined mass of 2 M☉
Abstract: Many physically motivated extensions to general relativity (GR) predict significant deviations at energies present in massive neutron stars. We report the measurement of a 2.01 ± 0.04 solar mass (M☉) pulsar in a 2.46-h orbit around a 0.172 ± 0.003 M☉ white dwarf. The high pulsar mass and the compact orbit make this system a sensitive laboratory of a previously untested strong-field gravity regime. Thus far, the observed orbital decay agrees with GR, supporting its validity even for the extreme conditions present in the system. The resulting constraints on deviations support the use of GR-based templates for ground-based gravitational wave detection experiments. Additionally, the system strengthens recent constraints on the properties of dense matter and provides novel insight to binary stellar astrophysics and pulsar recycling.

3,224 citations
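The "observed orbital decay agrees with GR" statement can be sanity-checked against the leading-order quadrupole formula (Peters 1964) using the masses and period quoted in the abstract. A back-of-the-envelope sketch assuming a circular orbit, not the paper's full pulsar-timing analysis:

```python
import math

# GR (quadrupole) orbital decay for a circular binary, Peters (1964):
# dPb/dt = -(192*pi/5) * (2*pi*G/Pb)^(5/3) * m1*m2 / (m1+m2)^(1/3) / c^5
G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30
m_p, m_c = 2.01 * M_sun, 0.172 * M_sun   # masses quoted in the abstract
P_b = 2.46 * 3600.0                      # 2.46-h orbital period, in seconds

Pdot = (-192.0 * math.pi / 5.0
        * (2.0 * math.pi * G / P_b) ** (5.0 / 3.0)
        * m_p * m_c / (m_p + m_c) ** (1.0 / 3.0) / c ** 5)
print(f"GR-predicted orbital decay: {Pdot:.2e} s/s")  # about -2.6e-13 s/s
```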


Journal ArticleDOI
TL;DR: These guidelines provide a roadmap for developing integrated, evidence-based, and patient-centered protocols for preventing and treating pain, agitation, and delirium in critically ill patients.
Abstract: Objective: To revise the “Clinical Practice Guidelines for the Sustained Use of Sedatives and Analgesics in the Critically Ill Adult” published in Critical Care Medicine in 2002. Methods: The American College of Critical Care Medicine assembled a 20-person, multidisciplinary, multi-institutional task force…

3,005 citations


Journal ArticleDOI
TL;DR: Modules for Experiments in Stellar Astrophysics (MESA) as discussed by the authors is an open source software package for modeling the evolution of stellar structure and composition; this update extends its applicability from giant planets as light as one-tenth the mass of Jupiter to massive stars evolved all the way to core collapse.
Abstract: We substantially update the capabilities of the open source software package Modules for Experiments in Stellar Astrophysics (MESA), and its one-dimensional stellar evolution module, MESA star. Improvements in MESA star's ability to model the evolution of giant planets now extend its applicability down to masses as low as one-tenth that of Jupiter. The dramatic improvement in asteroseismology enabled by the space-based Kepler and CoRoT missions motivates our full coupling of the ADIPLS adiabatic pulsation code with MESA star. This also motivates a numerical recasting of the Ledoux criterion that is more easily implemented when many nuclei are present at non-negligible abundances. This impacts the way in which MESA star calculates semi-convective and thermohaline mixing. We exhibit the evolution of 3-8 M☉ stars through the end of core He burning, the onset of He thermal pulses, and arrival on the white dwarf cooling sequence. We implement diffusion of angular momentum and chemical abundances that enable calculations of rotating-star models, which we compare thoroughly with earlier work. We introduce a new treatment of radiation-dominated envelopes that allows the uninterrupted evolution of massive stars to core collapse. This enables the generation of new sets of supernova, long gamma-ray burst, and pair-instability progenitor models. We substantially modify the way in which MESA star solves the fully coupled stellar structure and composition equations, and we show how this has improved the scaling of MESA's calculational speed on multi-core processors. Updates to the modules for equation of state, opacity, nuclear reaction rates, and atmospheric boundary conditions are also provided. We describe the MESA Software Development Kit that packages all the required components needed to form a unified, maintained, and well-validated build environment for MESA. We also highlight a few tools developed by the community for rapid visualization of MESA star results.

2,761 citations
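For context on the "numerical recasting of the Ledoux criterion" mentioned above: the textbook form compares the radiative gradient with the adiabatic gradient plus a composition term. A minimal sketch of that standard test, with symbol names assumed; this is not MESA's actual implementation:

```python
def ledoux_regime(grad_rad, grad_ad, grad_mu, phi=1.0, delta=1.0):
    """Classify local stability with the Ledoux criterion.

    grad_rad : radiative temperature gradient (d ln T / d ln P)
    grad_ad  : adiabatic gradient
    grad_mu  : composition gradient (d ln mu / d ln P), stabilizing when > 0
    phi, delta : equation-of-state derivatives; both equal 1 for an ideal gas
    """
    grad_ledoux = grad_ad + (phi / delta) * grad_mu
    if grad_rad > grad_ledoux:
        return "convective (Ledoux-unstable)"
    if grad_rad > grad_ad:
        # Schwarzschild-unstable but Ledoux-stable: the semiconvective regime
        return "semiconvective"
    return "stable"

print(ledoux_regime(grad_rad=0.35, grad_ad=0.30, grad_mu=0.10))  # semiconvective
```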


Journal ArticleDOI
Cristen J. Willer1, Ellen M. Schmidt1, Sebanti Sengupta1, Gina M. Peloso2, +316 more (87 institutions)
TL;DR: It is found that loci associated with blood lipid levels are often associated with cardiovascular and metabolic traits, including coronary artery disease, type 2 diabetes, blood pressure, waist-hip ratio and body mass index.
Abstract: Levels of low-density lipoprotein (LDL) cholesterol, high-density lipoprotein (HDL) cholesterol, triglycerides and total cholesterol are heritable, modifiable risk factors for coronary artery disease. To identify new loci and refine known loci influencing these lipids, we examined 188,577 individuals using genome-wide and custom genotyping arrays. We identify and annotate 157 loci associated with lipid levels at P < 5 × 10⁻⁸, including 62 loci not previously associated with lipid levels in humans. Using dense genotyping in individuals of European, East Asian, South Asian and African ancestry, we narrow association signals in 12 loci. We find that loci associated with blood lipid levels are often associated with cardiovascular and metabolic traits, including coronary artery disease, type 2 diabetes, blood pressure, waist-hip ratio and body mass index. Our results demonstrate the value of using genetic data from individuals of diverse ancestry and provide insights into the biological mechanisms regulating blood lipids to guide future genetic, biological and therapeutic research.

2,585 citations


Journal ArticleDOI
TL;DR: This monograph discusses 10 learning techniques and rates their relative utility; the two rated highest, practice testing and distributed practice, benefit learners of different ages and abilities and have been shown to boost students’ performance across many criterion tasks and even in educational contexts.
Abstract: Many students are being left behind by an educational system that some people believe is in crisis. Improving educational outcomes will require efforts on many fronts, but a central premise of this monograph is that one part of a solution involves helping students to better regulate their learning through the use of effective learning techniques. Fortunately, cognitive and educational psychologists have been developing and evaluating easy-to-use learning techniques that could help students achieve their learning goals. In this monograph, we discuss 10 learning techniques in detail and offer recommendations about their relative utility. We selected techniques that were expected to be relatively easy to use and hence could be adopted by many students. Also, some techniques (e.g., highlighting and rereading) were selected because students report relying heavily on them, which makes it especially important to examine how well they work. The techniques include elaborative interrogation, self-explanation, summarization, highlighting (or underlining), the keyword mnemonic, imagery use for text learning, rereading, practice testing, distributed practice, and interleaved practice. To offer recommendations about the relative utility of these techniques, we evaluated whether their benefits generalize across four categories of variables: learning conditions, student characteristics, materials, and criterion tasks. Learning conditions include aspects of the learning environment in which the technique is implemented, such as whether a student studies alone or with a group. Student characteristics include variables such as age, ability, and level of prior knowledge. Materials vary from simple concepts to mathematical problems to complicated science texts. Criterion tasks include different outcome measures that are relevant to student achievement, such as those tapping memory, problem solving, and comprehension. We attempted to provide thorough reviews for each technique, so this monograph is rather lengthy. However, we also wrote the monograph in a modular fashion, so it is easy to use. In particular, each review is divided into the following sections: (1) General description of the technique and why it should work; (2) How general are the effects of this technique? (2a) Learning conditions, (2b) Student characteristics, (2c) Materials, (2d) Criterion tasks; (3) Effects in representative educational contexts; (4) Issues for implementation; (5) Overall assessment. The review for each technique can be read independently of the others, and particular variables of interest can be easily compared across techniques. To foreshadow our final recommendations, the techniques vary widely with respect to their generalizability and promise for improving student learning. Practice testing and distributed practice received high utility assessments because they benefit learners of different ages and abilities and have been shown to boost students' performance across many criterion tasks and even in educational contexts. Elaborative interrogation, self-explanation, and interleaved practice received moderate utility assessments. The benefits of these techniques do generalize across some variables, yet despite their promise, they fell short of a high utility assessment because the evidence for their efficacy is limited.
For instance, elaborative interrogation and self-explanation have not been adequately evaluated in educational contexts, and the benefits of interleaving have just begun to be systematically explored, so the ultimate effectiveness of these techniques is currently unknown. Nevertheless, the techniques that received moderate-utility ratings show enough promise for us to recommend their use in appropriate situations, which we describe in detail within the review of each technique. Five techniques received a low utility assessment: summarization, highlighting, the keyword mnemonic, imagery use for text learning, and rereading. These techniques were rated as low utility for numerous reasons. Summarization and imagery use for text learning have been shown to help some students on some criterion tasks, yet the conditions under which these techniques produce benefits are limited, and much research is still needed to fully explore their overall effectiveness. The keyword mnemonic is difficult to implement in some contexts, and it appears to benefit students for a limited number of materials and for short retention intervals. Most students report rereading and highlighting, yet these techniques do not consistently boost students' performance, so other techniques should be used in their place (e.g., practice testing instead of rereading). Our hope is that this monograph will foster improvements in student learning, not only by showcasing which learning techniques are likely to have the most generalizable effects but also by encouraging researchers to continue investigating the most promising techniques. Accordingly, in our closing remarks, we discuss some issues for how these techniques could be implemented by teachers and students, and we highlight directions for future research.

1,989 citations


Journal ArticleDOI
TL;DR: The authors compared Mechanical Turk participants with community and student samples on a set of personality dimensions and classic decision-making biases and found that MTurk participants are less extraverted and have lower self-esteem than other participants, presenting challenges for some research domains.
Abstract: Mechanical Turk (MTurk), an online labor system run by Amazon.com, provides quick, easy, and inexpensive access to online research participants. As use of MTurk has grown, so have questions from behavioral researchers about its participants, reliability, and low compensation. In this article, we review recent research about MTurk and compare MTurk participants with community and student samples on a set of personality dimensions and classic decision-making biases. Across two studies, we find many similarities between MTurk participants and traditional samples, but we also find important differences. For instance, MTurk participants are less likely to pay attention to experimental materials, reducing statistical power. They are more likely to use the Internet to find answers, even with no incentive for correct responses. MTurk participants have attitudes about money that are different from a community sample’s attitudes but similar to students’ attitudes. Finally, MTurk participants are less extraverted and have lower self-esteem than other participants, presenting challenges for some research domains. Despite these differences, MTurk participants produce reliable results consistent with standard decision-making biases: they are present biased, risk-averse for gains, risk-seeking for losses, show delay/expedite asymmetries, and show the certainty effect—with almost no significant differences in effect sizes from other samples. We conclude that MTurk offers a highly valuable opportunity for data collection and recommend that researchers using MTurk (1) include screening questions that gauge attention and language comprehension; (2) avoid questions with factual answers; and (3) consider how individual differences in financial and social domains may influence results. Copyright © 2012 John Wiley & Sons, Ltd.

1,755 citations


Journal ArticleDOI
TL;DR: These guidelines were developed jointly by the American Society of Health-System Pharmacists (ASHP), the Infectious Diseases Society of America, the Surgical Infection Society (SIS), and the Society for Healthcare Epidemiology of America (SHEA).
Abstract: These guidelines were developed jointly by the American Society of Health-System Pharmacists (ASHP), the Infectious Diseases Society of America (IDSA), the Surgical Infection Society (SIS), and the Society for Healthcare Epidemiology of America (SHEA). This work represents an update to the

1,691 citations


Journal ArticleDOI
TL;DR: Ibrutinib shows durable single-agent efficacy in relapsed or refractory mantle-cell lymphoma; patients were enrolled into two groups, those who had previously received at least 2 cycles of bortezomib therapy and those who had received less than 2 complete cycles or no prior bortezomib.
Abstract: Patients were enrolled into two groups: those who had previously received at least 2 cycles of bortezomib therapy and those who had received less than 2 complete cycles of bortezomib or had received no prior bortezomib therapy. The primary end point was the overall response rate. Secondary end points were duration of response, progression-free survival, overall survival, and safety. RESULTS: The median age was 68 years, and 86% of patients had intermediate-risk or high-risk mantle-cell lymphoma according to clinical prognostic factors. Patients had received a median of three prior therapies. The most common treatment-related adverse events were mild or moderate diarrhea, fatigue, and nausea. Grade 3 or higher hematologic events were infrequent and included neutropenia (in 16% of patients), thrombocytopenia (in 11%), and anemia (in 10%). A response rate of 68% (75 patients) was observed, with a complete response rate of 21% and a partial response rate of 47%; prior treatment with bortezomib had no effect on the response rate. With an estimated median follow-up of 15.3 months, the estimated median response duration was 17.5 months (95% confidence interval [CI], 15.8 to not reached), the estimated median progression-free survival was 13.9 months (95% CI, 7.0 to not reached), and the median overall survival was not reached. The estimated rate of overall survival was 58% at 18 months. CONCLUSIONS: Ibrutinib shows durable single-agent efficacy in relapsed or refractory mantle-cell lymphoma. (Funded by Pharmacyclics and others; ClinicalTrials.gov number, NCT01236391.)

1,389 citations


Journal ArticleDOI
TL;DR: In the ocean, the lifetime of Nr is less well known but seems to be longer than in terrestrial ecosystems; marine Nr may represent an important long-term source of N2O that will respond very slowly to control measures on the sources of Nr from which it is produced.
Abstract: Global nitrogen fixation contributes 413 Tg of reactive nitrogen (Nr) to terrestrial and marine ecosystems annually of which anthropogenic activities are responsible for half, 210 Tg N. The majority of the transformations of anthropogenic Nr are on land (240 Tg N yr−1) within soils and vegetation where reduced Nr contributes most of the input through the use of fertilizer nitrogen in agriculture. Leakages from the use of fertilizer Nr contribute to nitrate (NO3−) in drainage waters from agricultural land and emissions of trace Nr compounds to the atmosphere. Emissions, mainly of ammonia (NH3) from land together with combustion related emissions of nitrogen oxides (NOx), contribute 100 Tg N yr−1 to the atmosphere, which are transported between countries and processed within the atmosphere, generating secondary pollutants, including ozone and other photochemical oxidants and aerosols, especially ammonium nitrate (NH4NO3) and ammonium sulfate ((NH4)2SO4). Leaching and riverine transport of NO3− contribute 40–70 Tg N yr−1 to coastal waters and the open ocean, which together with the 30 Tg input to oceans from atmospheric deposition combine with marine biological nitrogen fixation (140 Tg N yr−1) to double the ocean processing of Nr. Some of the marine Nr is buried in sediments, the remainder being denitrified back to the atmosphere as N2 or N2O. The marine processing is of a similar magnitude to that in terrestrial soils and vegetation, but has a larger fraction of natural origin. The lifetime of Nr in the atmosphere, with the exception of N2O, is only a few weeks, while in terrestrial ecosystems, with the exception of peatlands (where it can be 10²–10³ years), the lifetime is a few decades. In the ocean, the lifetime of Nr is less well known but seems to be longer than in terrestrial ecosystems and may represent an important long-term source of N2O that will respond very slowly to control measures on the sources of Nr from which it is produced.

Journal ArticleDOI
TL;DR: A favorable penumbral pattern on neuroimaging did not identify patients who would differentially benefit from endovascular therapy for acute ischemic stroke, nor was embolectomy shown to be superior to standard care.
Abstract: Background: Whether brain imaging can identify patients who are most likely to benefit from therapies for acute ischemic stroke and whether endovascular thrombectomy improves clinical outcomes in such patients remains unclear. Methods: In this study, we randomly assigned patients within 8 hours after the onset of large-vessel, anterior-circulation strokes to undergo mechanical embolectomy (Merci Retriever or Penumbra System) or receive standard care. All patients underwent pretreatment computed tomography or magnetic resonance imaging of the brain. Randomization was stratified according to whether the patient had a favorable penumbral pattern (substantial salvageable tissue and small infarct core) or a nonpenumbral pattern (large core or small or absent penumbra). We assessed outcomes using the 90-day modified Rankin scale, ranging from 0 (no symptoms) to 6 (dead). Results: Among 118 eligible patients, the mean age was 65.5 years, the mean time to enrollment was 5.5 hours, and 58% had a favorable penumbral patt...

Journal ArticleDOI
TL;DR: Graphene-on-MoS2 binary heterostructures display remarkable dual optoelectronic functionality, including highly sensitive photodetection and gate-tunable persistent photoconductivity, and may lead to new graphene-based optoelectronic devices that are naturally scalable for large-area applications at room temperature.
Abstract: Combining the electronic properties of graphene(1,2) and molybdenum disulphide (MoS2)(3-6) in hybrid heterostructures offers the possibility to create devices with various functionalities. Electronic logic and memory devices have already been constructed from graphene-MoS2 hybrids(7,8), but they do not make use of the photosensitivity of MoS2, which arises from its optical-range bandgap(9). Here, we demonstrate that graphene-on-MoS2 binary heterostructures display remarkable dual optoelectronic functionality, including highly sensitive photodetection and gate-tunable persistent photoconductivity. The responsivity of the hybrids was found to be nearly 1 × 10¹⁰ A W⁻¹ at 130 K and 5 × 10⁸ A W⁻¹ at room temperature, making them the most sensitive graphene-based photodetectors. When subjected to time-dependent photoillumination, the hybrids could also function as a rewritable optoelectronic switch or memory, where the persistent state shows almost no relaxation or decay within experimental timescales, indicating near-perfect charge retention. These effects can be quantitatively explained by gate-tunable charge exchange between the graphene and MoS2 layers, and may lead to new graphene-based optoelectronic devices that are naturally scalable for large-area applications at room temperature.
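Responsivity is simply photocurrent per unit incident optical power, R = I_photo / P. Plugging in the room-temperature value quoted above shows what such numbers mean in practice (a quick sketch, not from the paper):

```python
# Responsivity R = I_photo / P_optical. With R ~ 5e8 A/W (the room-temperature
# value quoted in the abstract), even picowatt illumination yields a large,
# easily measured current.
R = 5e8                          # A/W
for P in (1e-15, 1e-12, 1e-9):   # incident optical power, watts
    print(f"P = {P:.0e} W  ->  I_photo = {R * P:.1e} A")
# 1 pW of light gives 0.5 mA of photocurrent at this responsivity.
```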

Journal ArticleDOI
12 Sep 2013-Nature
TL;DR: In this paper, a screen for de novo mutations was performed in patients with two classical epileptic encephalopathies: infantile spasms (n = 149) and Lennox-Gastaut syndrome (n = 115).
Abstract: Epileptic encephalopathies are a devastating group of severe childhood epilepsy disorders for which the cause is often unknown. Here we report a screen for de novo mutations in patients with two classical epileptic encephalopathies: infantile spasms (n = 149) and Lennox-Gastaut syndrome (n = 115). We sequenced the exomes of 264 probands, and their parents, and confirmed 329 de novo mutations. A likelihood analysis showed a significant excess of de novo mutations in the ∼4,000 genes that are the most intolerant to functional genetic variation in the human population (P = 2.9 × 10⁻³). Among these are GABRB3, with de novo mutations in four patients, and ALG13, with the same de novo mutation in two patients; both genes show clear statistical evidence of association with epileptic encephalopathy. Given the relevant site-specific mutation rates, the probabilities of these outcomes occurring by chance are P = 4.1 × 10⁻¹⁰ and P = 7.8 × 10⁻¹², respectively. Other genes with de novo mutations in this cohort include CACNA1A, CHD2, FLNA, GABRA1, GRIN1, GRIN2B, HNRNPU, IQSEC2, MTOR and NEDD4L. Finally, we show that the de novo mutations observed are enriched in specific gene sets including genes regulated by the fragile X protein (P < 10⁻⁸), as has been reported previously for autism spectrum disorders.
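The recurrence P values above come from comparing observed de novo hits against the expectation under site- or gene-specific mutation rates. A toy binomial version of that calculation, with assumed rates rather than the paper's calibrated ones:

```python
from math import comb

def p_at_least(k: int, n: int, mu: float) -> float:
    """P(X >= k) for X ~ Binomial(n, mu): the chance that k or more of n
    probands carry a de novo mutation at a locus with per-proband rate mu."""
    return sum(comb(n, i) * mu**i * (1 - mu)**(n - i) for i in range(k, n + 1))

# Illustrative rates only (the paper used calibrated site-specific rates):
print(f"P(>=4 probands hit in one gene) = {p_at_least(4, 264, 1e-4):.1e}")
print(f"P(>=2 probands share one site)  = {p_at_least(2, 264, 1e-6):.1e}")
```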

Journal ArticleDOI
01 Sep 2013-Stroke
TL;DR: A multidisciplinary panel of neurointerventionalists, neuroradiologists, and stroke neurologists with extensive experience in neuroimaging and IAT convened at the “Consensus Meeting on Revascularization Grading Following Endovascular Therapy” with the goal of addressing heterogeneity in cerebral angiographic revascularization grading.
Abstract: See related article, p 2509 Intra-arterial therapy (IAT) for acute ischemic stroke (AIS) has dramatically evolved during the past decade to include aspiration and stent-retriever devices. Recent randomized controlled trials have demonstrated the superior revascularization efficacy of stent-retrievers compared with the first-generation Merci device.1,2 Additionally, the Diffusion and Perfusion Imaging Evaluation for Understanding Stroke Evolution (DEFUSE) 2, the Mechanical Retrieval and Recanalization of Stroke Clots Using Embolectomy (MR RESCUE), and the Interventional Management of Stroke (IMS) III trials have confirmed the importance of early revascularization for achieving better clinical outcome.3–5 Despite these data, the current heterogeneity in cerebral angiographic revascularization grading (CARG) poses a major obstacle to further advances in stroke therapy. To date, several CARG scales have been used to measure the success of IAT.6–14 Even when the same scale is used in different studies, it is applied using varying operational criteria, which further confounds the interpretation of this key metric.10 The lack of a uniform grading approach limits comparison of revascularization rates across clinical trials and hinders the translation of promising, early phase angiographic results into proven, clinically effective treatments.6–14 For these reasons, it is critical that CARG scales be standardized and end points for successful revascularization be refined.6 This will lead to a greater understanding of the aspects of revascularization that are strongly predictive of clinical response. The optimal grading scale must demonstrate (1) a strong correlation with clinical outcome, (2) simplicity and feasibility of scale interpretation while ensuring characterization of relevant angiographic findings, and (3) high inter-rater reproducibility. To address these issues, a multidisciplinary panel of neurointerventionalists, neuroradiologists, and stroke neurologists with extensive experience in neuroimaging and IAT, convened at the “Consensus Meeting on Revascularization Grading Following Endovascular Therapy” with the goal …

Journal ArticleDOI
TL;DR: The authors used a unique identification strategy that employs school-by-grade level turnover and two classes of fixed-effects models to estimate the effects of teacher turnover on over 850,000 New York City fourth and fifth-grade student observations over 8 years.
Abstract: Researchers and policymakers often assume that teacher turnover harms student achievement, though recent studies suggest this may not be the case. Using a unique identification strategy that employs school-by-grade level turnover and two classes of fixed-effects models, this study estimates the effects of teacher turnover on over 850,000 New York City fourth- and fifth-grade student observations over 8 years. The results indicate that students in grade levels with higher turnover score lower in both English language arts (ELA) and math and that these effects are particularly strong in schools with more low-performing and Black students. Moreover, the results suggest that there is a disruptive effect of turnover beyond changing the distribution in teacher quality.
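A minimal sketch of a fixed-effects specification in the spirit of the design described above, run on synthetic data (the variable names and the built-in effect size are invented for illustration; this is not the paper's NYC dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Regress student scores on grade-level turnover, absorbing school and year
# fixed effects as dummies.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "school": rng.integers(0, 50, n),
    "year": rng.integers(2004, 2012, n),          # an 8-year panel
    "turnover": rng.uniform(0, 0.5, n),
})
df["score"] = -0.3 * df["turnover"] + rng.normal(0, 1, n)  # true effect = -0.3

fit = smf.ols("score ~ turnover + C(school) + C(year)", data=df).fit()
print(fit.params["turnover"])   # recovers roughly -0.3
```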

Journal ArticleDOI
TL;DR: Recon 2, a community-driven, consensus 'metabolic reconstruction', is described, which is the most comprehensive representation of human metabolism that is applicable to computational modeling and has improved topological and functional features.
Abstract: Multiple models of human metabolism have been reconstructed, but each represents only a subset of our knowledge. Here we describe Recon 2, a community-driven, consensus 'metabolic reconstruction', which is the most comprehensive representation of human metabolism that is applicable to computational modeling. Compared with its predecessors, the reconstruction has improved topological and functional features, including ~2× more reactions and ~1.7× more unique metabolites. Using Recon 2 we predicted changes in metabolite biomarkers for 49 inborn errors of metabolism with 77% accuracy when compared to experimental data. Mapping metabolomic data and drug information onto Recon 2 demonstrates its potential for integrating and analyzing diverse data types. Using protein expression data, we automatically generated a compendium of 65 cell type–specific models, providing a basis for manual curation or investigation of cell-specific metabolic properties. Recon 2 will facilitate many future biomedical studies and is freely available at http://humanmetabolism.org/.
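"Applicable to computational modeling" here usually means constraint-based methods such as flux balance analysis: maximize an objective flux subject to the steady-state condition S·v = 0 and flux bounds. A toy three-reaction sketch with scipy, not Recon 2 itself (which has thousands of reactions):

```python
import numpy as np
from scipy.optimize import linprog

# One metabolite A; R1 imports it (capped at 10), R2 converts it to "biomass",
# R3 secretes it. Maximize the biomass flux v2 subject to S @ v = 0.
S = np.array([[1.0, -1.0, -1.0]])
bounds = [(0, 10), (0, None), (0, None)]
c = np.array([0.0, -1.0, 0.0])          # linprog minimizes, so negate v2

res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds)
print("optimal biomass flux:", res.x[1])  # 10.0, limited by the uptake bound
```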

Journal ArticleDOI
01 Feb 2013-Science
TL;DR: The authors investigated the role of the gut microbiome in kwashiorkor, an enigmatic form of severe acute malnutrition that is the consequence of inadequate nutrient intake plus additional environmental insults, and found that ready-to-use therapeutic food (RUTF) produced a transient maturation of metabolic functions that regressed when its administration was stopped.
Abstract: Kwashiorkor, an enigmatic form of severe acute malnutrition, is the consequence of inadequate nutrient intake plus additional environmental insults. To investigate the role of the gut microbiome, we studied 317 Malawian twin pairs during the first 3 years of life. During this time, half of the twin pairs remained well nourished, whereas 43% became discordant, and 7% manifested concordance for acute malnutrition. Both children in twin pairs discordant for kwashiorkor were treated with a peanut-based, ready-to-use therapeutic food (RUTF). Time-series metagenomic studies revealed that RUTF produced a transient maturation of metabolic functions in kwashiorkor gut microbiomes that regressed when administration of RUTF was stopped. Previously frozen fecal communities from several discordant pairs were each transplanted into gnotobiotic mice. The combination of Malawian diet and kwashiorkor microbiome produced marked weight loss in recipient mice, accompanied by perturbations in amino acid, carbohydrate, and intermediary metabolism that were only transiently ameliorated with RUTF. These findings implicate the gut microbiome as a causal factor in kwashiorkor.

Journal ArticleDOI
TL;DR: Data are insufficient to show that any intervention enhances recovery or diminishes long-term sequelae postconcussion; practice recommendations are presented for preparticipation counseling, management of suspected concussion, and management of diagnosed concussion.
Abstract: Objective: To update the 1997 American Academy of Neurology (AAN) practice parameter regarding sports concussion, focusing on 4 questions: 1) What factors increase/decrease concussion risk? 2) What diagnostic tools identify those with concussion and those at increased risk for severe/prolonged early impairments, neurologic catastrophe, or chronic neurobehavioral impairment? 3) What clinical factors identify those at increased risk for severe/prolonged early postconcussion impairments, neurologic catastrophe, recurrent concussions, or chronic neurobehavioral impairment? 4) What interventions enhance recovery, reduce recurrent concussion risk, or diminish long-term sequelae? The complete guideline on which this summary is based is available as an online data supplement to this article. Methods: We systematically reviewed the literature from 1955 to June 2012 for pertinent evidence. We assessed evidence for quality and synthesized into conclusions using a modified Grading of Recommendations Assessment, Development and Evaluation process. We used a modified Delphi process to develop recommendations. Results: Specific risk factors can increase or decrease concussion risk. Diagnostic tools to help identify individuals with concussion include graded symptom checklists, the Standardized Assessment of Concussion, neuropsychological assessments, and the Balance Error Scoring System. Ongoing clinical symptoms, concussion history, and younger age identify those at risk for postconcussion impairments. Risk factors for recurrent concussion include history of multiple concussions, particularly within 10 days after initial concussion. Risk factors for chronic neurobehavioral impairment include concussion exposure and APOE e4 genotype. Data are insufficient to show that any intervention enhances recovery or diminishes long-term sequelae postconcussion. Practice recommendations are presented for preparticipation counseling, management of suspected concussion, and management of diagnosed concussion.

Journal ArticleDOI
Ron Do1, Cristen J. Willer2, Ellen M. Schmidt2, Sebanti Sengupta2, +263 more (83 institutions)
TL;DR: It is suggested that triglyceride-rich lipoproteins causally influence risk for CAD, and the strength of a polymorphism's effect on triglyceride levels is correlated with the magnitude of its effect on CAD risk.
Abstract: Triglycerides are transported in plasma by specific triglyceride-rich lipoproteins; in epidemiological studies, increased triglyceride levels correlate with higher risk for coronary artery disease (CAD). However, it is unclear whether this association reflects causal processes. We used 185 common variants recently mapped for plasma lipids (P < 5 × 10⁻⁸ for each) to examine the role of triglycerides in risk for CAD. First, we highlight loci associated with both low-density lipoprotein cholesterol (LDL-C) and triglyceride levels, and we show that the direction and magnitude of the associations with both traits are factors in determining CAD risk. Second, we consider loci with only a strong association with triglycerides and show that these loci are also associated with CAD. Finally, in a model accounting for effects on LDL-C and/or high-density lipoprotein cholesterol (HDL-C) levels, the strength of a polymorphism's effect on triglyceride levels is correlated with the magnitude of its effect on CAD risk. These results suggest that triglyceride-rich lipoproteins causally influence risk for CAD.
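The final claim is, in essence, a regression across variants of CAD effect sizes on triglyceride effect sizes. A sketch on synthetic betas (the proportionality constant and noise level are assumed here; the paper used its 185 mapped variants):

```python
import numpy as np

# Across variants, regress the CAD log-odds-ratio on the triglyceride effect.
rng = np.random.default_rng(1)
beta_tg = rng.normal(0, 0.05, 185)                   # per-allele TG effects
beta_cad = 0.4 * beta_tg + rng.normal(0, 0.01, 185)  # built-in proportionality

slope, intercept = np.polyfit(beta_tg, beta_cad, 1)
print(f"slope = {slope:.2f}")  # a positive slope is the dose-response pattern
```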

Journal ArticleDOI
01 Jun 2013-Spine
TL;DR: In this article, the authors evaluated correlations between spinopelvic parameters and health-related quality of life (HRQOL) scores in patients with spinal deformity and found that spinopelvic parameters provide a more complete assessment of the sagittal plane.
Abstract: Study design: Prospective multicenter study evaluating operative (OP) versus nonoperative (NONOP) treatment for adult spinal deformity (ASD). Objective: Evaluate correlations between spinopelvic parameters and health-related quality of life (HRQOL) scores in patients with ASD. Summary of background data: Sagittal spinal deformity is commonly defined by an increased sagittal vertical axis (SVA); however, SVA alone may underestimate the severity of the deformity. Spinopelvic parameters provide a more complete assessment of the sagittal plane, but only limited data are available that correlate spinopelvic parameters with disability. Methods: Baseline demographic, radiographical, and HRQOL data were obtained for all patients enrolled in a multicenter consecutive database. Inclusion criteria were: age more than 18 years and radiographical diagnosis of ASD. Radiographical evaluation was conducted on the frontal and lateral planes, and HRQOL questionnaires (Oswestry Disability Index [ODI], Scoliosis Research Society-22r, and Short Form [SF]-12) were completed. Radiographical parameters demonstrating the highest correlation with HRQOL values were evaluated to determine thresholds predictive of ODI more than 40. Results: Four hundred ninety-two consecutive patients with ASD (mean age, 51.9 yr) were enrolled. Patients from the OP group (n = 178) were older (55 vs. 50.1 yr, P …). Conclusion: ASD is a disabling condition. Prospective analysis of consecutively enrolled patients with ASD demonstrated that PT and PI−LL combined with SVA can predict patient disability and provide a guide for patient assessment for appropriate therapeutic decision making. Threshold values for severe disability (ODI > 40) included: PT 22° or more, SVA 47 mm or more, and PI−LL 11° or more.
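The closing threshold values can be restated as a small decision helper. A sketch that merely encodes the cut-offs quoted in the abstract (not a validated clinical tool):

```python
def severe_disability_flags(pt_deg: float, sva_mm: float, pi_minus_ll_deg: float) -> dict:
    """Flag the abstract's thresholds predictive of severe disability (ODI > 40)."""
    return {
        "PT >= 22 deg": pt_deg >= 22,
        "SVA >= 47 mm": sva_mm >= 47,
        "PI-LL >= 11 deg": pi_minus_ll_deg >= 11,
    }

print(severe_disability_flags(pt_deg=25, sva_mm=60, pi_minus_ll_deg=8))
```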

Book ChapterDOI
TL;DR: Moral Foundations Theory (MFT), as discussed by the authors, was created to answer questions such as: Where does morality come from? Why are moral judgments often so similar across cultures, yet sometimes so variable? Is morality one thing, or many?
Abstract: Where does morality come from? Why are moral judgments often so similar across cultures, yet sometimes so variable? Is morality one thing, or many? Moral Foundations Theory (MFT) was created to answer these questions. In this chapter, we describe the origins, assumptions, and current conceptualization of the theory and detail the empirical findings that MFT has made possible, both within social psychology and beyond. Looking toward the future, we embrace several critiques of the theory and specify five criteria for determining what should be considered a foundation of human morality. Finally, we suggest a variety of future directions for MFT and moral psychology.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that the notion of value has been overly simplified and narrowed to focus on economic returns and propose a more complex perspective of the value that stakeholders seek as well as new ways to measure it.
Abstract: This paper argues that the notion of value has been overly simplified and narrowed to focus on economic returns. Stakeholder theory provides an appropriate lens for considering a more complex perspective of the value that stakeholders seek as well as new ways to measure it. We develop a four-factor perspective for defining value that includes, but extends beyond, the economic value stakeholders seek. To highlight its distinctiveness, we compare this perspective to three other popular performance perspectives. Recommendations are made regarding performance measurement for both academic researchers and practitioners. The stakeholder perspective on value offered in this paper draws attention to those factors that are most closely associated with building more value for stakeholders, and in so doing, allows academics to better measure it and enhances managerial ability to create it.

Journal ArticleDOI
TL;DR: In this pilot study, essential tremor improved in 15 patients treated with MRI-guided focused ultrasound thalamotomy, and large, randomized, controlled trials will be required to assess the procedure's efficacy and safety.
Abstract: Background: Recent advances have enabled delivery of high-intensity focused ultrasound through the intact human cranium with magnetic resonance imaging (MRI) guidance. This preliminary study investigates the use of transcranial MRI-guided focused ultrasound thalamotomy for the treatment of essential tremor. Methods: From February 2011 through December 2011, in an open-label, uncontrolled study, we used transcranial MRI-guided focused ultrasound to target the unilateral ventral intermediate nucleus of the thalamus in 15 patients with severe, medication-refractory essential tremor. We recorded all safety data and measured the effectiveness of tremor suppression using the Clinical Rating Scale for Tremor to calculate the total score (ranging from 0 to 160), hand subscore (primary outcome, ranging from 0 to 32), and disability subscore (ranging from 0 to 32), with higher scores indicating worse tremor. We assessed the patients' perceptions of treatment efficacy with the Quality of Life in Essential Tremor Quest...

Journal ArticleDOI
TL;DR: The need for sound ecological science has escalated alongside the rise of the information age and “big data” across all sectors of society, as discussed by the authors, presenting unprecedented opportunities for advancing science and informing resource management through data-intensive approaches.
Abstract: The need for sound ecological science has escalated alongside the rise of the information age and “big data” across all sectors of society. Big data generally refer to massive volumes of data not readily handled by the usual data tools and practices and present unprecedented opportunities for advancing science and informing resource management through data-intensive approaches. The era of big data need not be propelled only by “big science” – the term used to describe large-scale efforts that have had mixed success in the individual-driven culture of ecology. Collectively, ecologists already have big data to bolster the scientific effort – a large volume of distributed, high-value information – but many simply fail to contribute. We encourage ecologists to join the larger scientific community in global initiatives to address major scientific and societal problems by bringing their distributed data to the table and harnessing its collective power. The scientists who contribute such information will be at the forefront of socially relevant science – but will they be ecologists?

Journal ArticleDOI
TL;DR: In this article, a review of recent research on the drivers, feedbacks, and impacts of global desertification is presented, motivated by the increasing need to improve global food production and to sustainably manage ecosystems in the context of climate change.

Journal ArticleDOI
TL;DR: In this paper, the authors focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations.
Abstract: Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward. Copyright © 2013 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors give a detailed description of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson.
Abstract: A detailed description is reported of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson. The data sample corresponds to integrated luminosities up to 5.1 inverse femtobarns at √s = 7 TeV, and up to 5.3 inverse femtobarns at √s = 8 TeV. The results for five Higgs boson decay modes γγ, ZZ, WW, ττ, and bb, which show a combined local significance of 5 standard deviations near 125 GeV, are reviewed. A fit to the invariant mass of the two high resolution channels, γγ and ZZ → 4ℓ, gives a mass estimate of 125.3 ± 0.4 (stat) ± 0.5 (syst) GeV. The measurements are interpreted in the context of the standard model Lagrangian for the scalar Higgs field interacting with fermions and vector bosons. The measured values of the corresponding couplings are compared to the standard model predictions. The hypothesis of custodial symmetry is tested through the measurement of the ratio of the couplings to the W and Z bosons. All the results are consistent, within their uncertainties, with the expectations for a standard model Higgs boson.
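Two bits of arithmetic are implicit in the abstract: the statistical and systematic mass uncertainties combine in quadrature, and a 5-sigma local significance corresponds to a tiny one-sided p-value. A quick sketch:

```python
import math
from scipy.stats import norm

stat, syst = 0.4, 0.5                      # GeV, from the mass fit above
total = math.hypot(stat, syst)             # quadrature sum
print(f"m_H = 125.3 +/- {total:.2f} GeV (stat and syst combined)")

p_value = norm.sf(5.0)                     # one-sided tail beyond 5 sigma
print(f"5 sigma local significance -> p = {p_value:.1e}")  # ~2.9e-7
```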

Journal ArticleDOI
TL;DR: Pretend play has been claimed to be crucial to children's healthy development as discussed by the authors, and the evidence for this position versus two alternatives: pretend play is one of many routes to positive developments (equifinality), and pretend play is an epiphenomenon of other factors that drive development.
Abstract: Pretend play has been claimed to be crucial to children’s healthy development. Here we examine evidence for this position versus 2 alternatives: Pretend play is 1 of many routes to positive developments (equifinality), and pretend play is an epiphenomenon of other factors that drive development. Evidence from several domains is considered. For language, narrative, and emotion regulation, the research conducted to date is consistent with all 3 positions but insufficient to draw conclusions. For executive function and social skills, existing research leans against the crucial causal position but is insufficient to differentiate the other 2. For reasoning, equifinality is definitely supported, ruling out a crucially causal position but still leaving open the possibility that pretend play is epiphenomenal. For problem solving, there is no compelling evidence that pretend play helps or is even a correlate. For creativity, intelligence, conservation, and theory of mind, inconsistent correlational results from sound studies and nonreplication with masked experimenters are problematic for a causal position, and some good studies favor an epiphenomenon position in which child, adult, and environment characteristics that go along with play are the true causal agents. We end by considering epiphenomenalism more deeply and discussing implications for preschool settings and further research in this domain. Our take-away message is that existing evidence does not support strong causal claims about the unique importance of pretend play for development and that much more and better research is essential for clarifying its possible role.