
Showing papers from the University of Alabama published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4, +2519 more (695 institutions)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
Abstract: Deutsche Forschungsgemeinschaft (DFG) [KA1265/5-1, KA1265/5-2, KE757/71, KE757/7-2, KE757/7-3, KE757/11-1.]; International Max Planck Research School for Astronomy and Astrophysics at the Universities of Bonn and Cologne (IMPRS Bonn/Cologne); Estonian Research Council [IUT26-2]; European Regional Development Fund [TK133]; Australian Research Council Future Fellowship [FT150100024]; NSF CAREER grant [AST-1149491]

832 citations


Journal ArticleDOI
TL;DR: Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development, and cognitive decline is noticeable from initial LOAD stages, suggesting an early memory deficit associated with the primary disease factors.
Abstract: Multifactorial mechanisms underlying late-onset Alzheimer's disease (LOAD) are poorly characterized from an integrative perspective. Here spatiotemporal alterations in brain amyloid-β deposition, metabolism, vascular, functional activity at rest, structural properties, cognitive integrity and peripheral proteins levels are characterized in relation to LOAD progression. We analyse over 7,700 brain images and tens of plasma and cerebrospinal fluid biomarkers from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Through a multifactorial data-driven analysis, we obtain dynamic LOAD-abnormality indices for all biomarkers, and a tentative temporal ordering of disease progression. Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development. Cognitive decline is noticeable from initial LOAD stages, suggesting early memory deficit associated with the primary disease factors. High abnormality levels are also observed for specific proteins associated with the vascular system's integrity. Although still subjected to the sensitivity of the algorithms and biomarkers employed, our results might contribute to the development of preventive therapeutic interventions.

786 citations


Journal ArticleDOI
Lourens Poorter1, Frans Bongers1, T. Mitchell Aide2, Angelica M. Almeyda Zambrano3, Patricia Balvanera4, Justin M. Becknell5, Vanessa K. Boukili6, Pedro H. S. Brancalion7, Eben N. Broadbent3, Robin L. Chazdon6, Dylan Craven8, Dylan Craven9, Jarcilene S. Almeida-Cortez10, George A. L. Cabral10, Ben H. J. de Jong, Julie S. Denslow11, Daisy H. Dent12, Daisy H. Dent9, Saara J. DeWalt13, Juan Manuel Dupuy, Sandra M. Durán14, Mário M. Espírito-Santo, María C. Fandiño, Ricardo Gomes César7, Jefferson S. Hall9, José Luis Hernández-Stefanoni, Catarina C. Jakovac1, Catarina C. Jakovac15, André Braga Junqueira15, André Braga Junqueira1, Deborah K. Kennard16, Susan G. Letcher17, Juan Carlos Licona, Madelon Lohbeck1, Madelon Lohbeck18, Erika Marin-Spiotta19, Miguel Martínez-Ramos4, Paulo Eduardo dos Santos Massoca15, Jorge A. Meave4, Rita C. G. Mesquita15, Francisco Mora4, Rodrigo Muñoz4, Robert Muscarella20, Robert Muscarella21, Yule Roberta Ferreira Nunes, Susana Ochoa-Gaona, Alexandre Adalardo de Oliveira7, Edith Orihuela-Belmonte, Marielos Peña-Claros1, Eduardo A. Pérez-García4, Daniel Piotto, Jennifer S. Powers22, Jorge Rodríguez-Velázquez4, I. Eunice Romero-Pérez4, Jorge Ruiz23, Jorge Ruiz24, Juan Saldarriaga, Arturo Sanchez-Azofeifa14, Naomi B. Schwartz20, Marc K. Steininger, Nathan G. Swenson25, Marisol Toledo, María Uriarte20, Michiel van Breugel26, Michiel van Breugel9, Michiel van Breugel27, Hans van der Wal28, Maria das Dores Magalhães Veloso, Hans F. M. Vester29, Alberto Vicentini15, Ima Célia Guimarães Vieira30, Tony Vizcarra Bentos15, G. Bruce Williamson31, G. Bruce Williamson15, Danaë M. A. Rozendaal1, Danaë M. A. Rozendaal32, Danaë M. A. Rozendaal6 
11 Feb 2016-Nature
TL;DR: A biomass recovery map of Latin America is presented, which illustrates geographical and climatic variation in carbon sequestration potential during forest regrowth and will support policies to minimize forest loss in areas where biomass resilience is naturally low and promote forest regeneration and restoration in humid tropical lowland areas with high biomass resilience.
Abstract: Land-use change occurs nowhere more rapidly than in the tropics, where the imbalance between deforestation and forest regrowth has large consequences for the global carbon cycle. However, considerable uncertainty remains about the rate of biomass recovery in secondary forests, and how these rates are influenced by climate, landscape, and prior land use. Here we analyse aboveground biomass recovery during secondary succession in 45 forest sites and about 1,500 forest plots covering the major environmental gradients in the Neotropics. The studied secondary forests are highly productive and resilient. Aboveground biomass recovery after 20 years was on average 122 megagrams per hectare (Mg ha(-1)), corresponding to a net carbon uptake of 3.05 Mg C ha(-1) yr(-1), 11 times the uptake rate of old-growth forests. Aboveground biomass stocks took a median time of 66 years to recover to 90% of old-growth values. Aboveground biomass recovery after 20 years varied 11.3-fold (from 20 to 225 Mg ha(-1)) across sites, and this recovery increased with water availability (higher local rainfall and lower climatic water deficit). We present a biomass recovery map of Latin America, which illustrates geographical and climatic variation in carbon sequestration potential during forest regrowth. The map will support policies to minimize forest loss in areas where biomass resilience is naturally low (such as seasonally dry forest regions) and promote forest regeneration and restoration in humid tropical lowland areas with high biomass resilience.
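The abstract's headline uptake figure can be checked with a short sketch. This is not from the paper; it assumes the conventional ~50% carbon fraction of dry biomass, which the summary does not state.

```python
CARBON_FRACTION = 0.5  # assumed Mg C per Mg dry biomass (conventional value)

def net_carbon_uptake(biomass_mg_ha: float, years: float,
                      carbon_fraction: float = CARBON_FRACTION) -> float:
    """Mean net carbon uptake (Mg C ha^-1 yr^-1) from biomass accumulated over `years`."""
    return biomass_mg_ha * carbon_fraction / years

# 122 Mg ha^-1 of aboveground biomass accumulated over 20 years
uptake = net_carbon_uptake(122.0, 20.0)
print(round(uptake, 2))  # 3.05, matching the abstract
```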

724 citations


Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam, +2283 more (141 institutions)
TL;DR: Combined fits to CMS UE proton–proton data at 7 TeV and to UE proton–antiproton data from the CDF experiment at lower sqrt(s) are used to study the UE models and constrain their parameters, thereby providing improved predictions for proton–proton collisions at 13 TeV.
Abstract: New sets of parameters ("tunes") for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at sqrt(s) = 7 TeV and to UE data from the CDF experiment at lower sqrt(s), are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to "minimum bias" (MB) events, multijet, and Drell-Yan (q q-bar to Z / gamma* to lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.

686 citations


Journal ArticleDOI
TL;DR: Propensity score matching (PSM) has become a popular technique for estimating average treatment effects (ATEs) in accounting research, but studies often oversell the capabilities of PSM, fail to disclose important design choices, and/or implement PSM in a theoretically inconsistent manner.
Abstract: Propensity score matching (PSM) has become a popular technique for estimating average treatment effects (ATEs) in accounting research. In this study, we discuss the usefulness and limitations of PSM relative to more traditional multiple regression (MR) analysis. We discuss several PSM design choices and review the use of PSM in 86 articles in leading accounting journals from 2008-2014. We document a significant increase in the use of PSM from 0 studies in 2008 to 26 studies in 2014. However, studies often oversell the capabilities of PSM, fail to disclose important design choices, and/or implement PSM in a theoretically inconsistent manner. We then empirically illustrate complications associated with PSM in three accounting research settings. We first demonstrate that when the treatment is not binary, PSM tends to confine analyses to a subsample of observations where the effect size is likely to be smallest. We also show that seemingly innocuous design choices greatly influence sample composition and estimates of the ATE. We conclude with suggestions for future research considering the use of matching methods.
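As a toy illustration of the matching step that PSM builds on (not the paper's implementation), greedy 1:1 nearest-neighbor matching on pre-computed propensity scores can be sketched as below. The caliper is exactly the kind of seemingly innocuous design choice the study shows can greatly influence sample composition.

```python
def nn_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on the propensity score,
    without replacement. Returns (treated_idx, control_idx) pairs whose
    score distance is within the caliper. Scores are taken as given; in
    practice they would come from, e.g., a logistic regression."""
    pairs, available = [], dict(enumerate(controls))
    for i, score in enumerate(treated):
        if not available:
            break
        j, c = min(available.items(), key=lambda kv: abs(kv[1] - score))
        if abs(c - score) <= caliper:
            pairs.append((i, j))
            del available[j]  # match without replacement
    return pairs

print(nn_match([0.30, 0.70], [0.28, 0.90, 0.72]))  # [(0, 0), (1, 2)]
```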

666 citations


Journal ArticleDOI
TL;DR: The promises and challenges of these genome scan methods are reviewed, including correcting for the confounding influence of a species’ demographic history, biases caused by missing aspects of the genome, matching scales of environmental data with population structure, and other statistical considerations.
Abstract: Uncovering the genetic and evolutionary basis of local adaptation is a major focus of evolutionary biology. The recent development of cost-effective methods for obtaining high-quality genome-scale data makes it possible to identify some of the loci responsible for adaptive differences among populations. Two basic approaches for identifying putatively locally adaptive loci have been developed and are broadly used: one that identifies loci with unusually high genetic differentiation among populations (differentiation outlier methods) and one that searches for correlations between local population allele frequencies and local environments (genetic-environment association methods). Here, we review the promises and challenges of these genome scan methods, including correcting for the confounding influence of a species’ demographic history, biases caused by missing aspects of the genome, matching scales of environmental data with population structure, and other statistical considerations. In each case, ...

627 citations


Journal ArticleDOI
M. G. Aartsen1, K. Abraham2, Markus Ackermann, Jenni Adams3, +313 more (49 institutions)
TL;DR: In this paper, the data are well described by an isotropic, unbroken power-law flux with a normalization at 100 TeV neutrino energy of (0.90 +0.30/-0.27) × 10^-18 GeV^-1 cm^-2 s^-1 sr^-1 and a hard spectral index of γ = 2.13 ± 0.13.
Abstract: The IceCube Collaboration has previously discovered a high-energy astrophysical neutrino flux using neutrino events with interaction vertices contained within the instrumented volume of the IceCube detector. We present a complementary measurement using charged-current muon neutrino events where the interaction vertex can be outside this volume. As a consequence of the large muon range, the effective area is significantly larger, but the field of view is restricted to the Northern Hemisphere. IceCube data from 2009 through 2015 have been analyzed using a likelihood approach based on the reconstructed muon energy and zenith angle. At the highest neutrino energies, between 194 TeV and 7.8 PeV, a significant astrophysical contribution is observed, excluding a purely atmospheric origin of these events at 5.6σ significance. The data are well described by an isotropic, unbroken power-law flux with a normalization at 100 TeV neutrino energy of (0.90 +0.30/-0.27) × 10^-18 GeV^-1 cm^-2 s^-1 sr^-1 and a hard spectral index of γ = 2.13 ± 0.13. The observed spectrum is harder than in previous IceCube analyses with lower energy thresholds, which may indicate a break of unknown origin in the astrophysical neutrino spectrum. The highest-energy event observed has a reconstructed muon energy of (4.5 ± 1.2) PeV, which implies a probability of less than 0.005% for this event to be of atmospheric origin. Analyzing the arrival directions of all events with reconstructed muon energies above 200 TeV, no correlation with known γ-ray sources was found. Using the high statistics of atmospheric neutrinos, we report the current best constraints on a prompt atmospheric muon neutrino flux originating from charmed meson decays, which is below 1.06 in units of the flux normalization of the model in Enberg et al.
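The quoted best-fit spectrum is a single power law, Phi(E) = Phi0 · (E / 100 TeV)^(-γ), which can be evaluated directly. A minimal sketch using the central values from the abstract (uncertainties ignored):

```python
PHI0 = 0.90e-18   # GeV^-1 cm^-2 s^-1 sr^-1 at the 100 TeV pivot (+0.30/-0.27)
GAMMA = 2.13      # best-fit spectral index (± 0.13)

def flux(energy_gev: float) -> float:
    """Differential astrophysical muon-neutrino flux, GeV^-1 cm^-2 s^-1 sr^-1."""
    return PHI0 * (energy_gev / 1.0e5) ** (-GAMMA)

print(flux(1.0e5))  # 9e-19 at the 100 TeV pivot, i.e. Phi0 itself
```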

503 citations


Journal ArticleDOI
TL;DR: The design parameters for different applications of various pure and composite hydrogels based on cellulose, chitin, or chitosan, including applications as controlled and targeted drug delivery systems, improved tissue engineering scaffolds, wound dressings, water purification sorbents, and others are compared.

473 citations


Journal ArticleDOI
TL;DR: It is shown that sponges are a reservoir of exceptional microbial diversity and major contributors to the total microbial diversity of the world's oceans, and a model of independent assembly and evolution in symbiont communities across the entire host phylum is supported.
Abstract: Sponges (phylum Porifera) are early-diverging metazoa renowned for establishing complex microbial symbioses. Here we present a global Porifera microbiome survey, set out to establish the ecological and evolutionary drivers of these host-microbe interactions. We show that sponges are a reservoir of exceptional microbial diversity and major contributors to the total microbial diversity of the world's oceans. Little commonality in species composition or structure is evident across the phylum, although symbiont communities are characterized by specialists and generalists rather than opportunists. Core sponge microbiomes are stable and characterized by generalist symbionts exhibiting amensal and/or commensal interactions. Symbionts that are phylogenetically unique to sponges do not disproportionally contribute to the core microbiome, and host phylogeny impacts complexity rather than composition of the symbiont community. Our findings support a model of independent assembly and evolution in symbiont communities across the entire host phylum, with convergent forces resulting in analogous community organization and interactions.

456 citations


Journal ArticleDOI
TL;DR: Reduction in treatment exposure was associated with reduced late mortality among survivors of acute lymphoblastic leukemia and Wilms' tumor and the strategy of lowering therapeutic exposure has contributed to an observed decline inLate mortality among 5-year survivors of childhood cancer.
Abstract: Background: Among patients in whom childhood cancer was diagnosed in the 1970s and 1980s, 18% of those who survived for 5 years died within the subsequent 25 years. In recent decades, cancer treatments have been modified with the goal of reducing life-threatening late effects. Methods: We evaluated late mortality among 34,033 patients in the Childhood Cancer Survivor Study cohort who survived at least 5 years after childhood cancer (i.e., cancer diagnosed before the age of 21 years) for which treatment was initiated during the period from 1970 through 1999. The median follow-up was 21 years (range, 5 to 38). We evaluated demographic and disease factors that were associated with death from health-related causes (i.e., conditions that exclude recurrence or progression of the original cancer and external causes but include the late effects of cancer therapy) using cumulative incidence and piecewise exponential models to estimate relative rates and 95% confidence intervals. Results: Of the 3958 deaths that occurred...

Journal ArticleDOI
Robin L. Chazdon1, Robin L. Chazdon2, Eben N. Broadbent3, Danaë M. A. Rozendaal4, Danaë M. A. Rozendaal5, Danaë M. A. Rozendaal1, Frans Bongers5, Angelica M. Almeyda Zambrano3, T. Mitchell Aide6, Patricia Balvanera7, Justin M. Becknell8, Vanessa K. Boukili1, Pedro H. S. Brancalion9, Dylan Craven10, Dylan Craven11, Jarcilene S. Almeida-Cortez12, George A. L. Cabral12, Ben de Jong, Julie S. Denslow13, Daisy H. Dent10, Daisy H. Dent14, Saara J. DeWalt15, Juan Manuel Dupuy, Sandra M. Durán16, Mário M. Espírito-Santo, María C. Fandiño, Ricardo Gomes César9, Jefferson S. Hall10, José Luis Hernández-Stefanoni, Catarina C. Jakovac5, Catarina C. Jakovac17, André Braga Junqueira17, André Braga Junqueira5, Deborah K. Kennard18, Susan G. Letcher19, Madelon Lohbeck5, Madelon Lohbeck20, Miguel Martínez-Ramos7, Paulo Eduardo dos Santos Massoca17, Jorge A. Meave7, Rita C. G. Mesquita17, Francisco Mora7, Rodrigo Muñoz7, Robert Muscarella21, Robert Muscarella22, Yule Roberta Ferreira Nunes, Susana Ochoa-Gaona, Edith Orihuela-Belmonte, Marielos Peña-Claros5, Eduardo A. Pérez-García7, Daniel Piotto, Jennifer S. Powers23, Jorge Rodríguez-Velázquez7, Isabel Eunice Romero-Pérez7, Jorge Ruiz24, Jorge Ruiz25, Juan Saldarriaga, Arturo Sanchez-Azofeifa16, Naomi B. Schwartz22, Marc K. Steininger26, Nathan G. Swenson26, María Uriarte22, Michiel van Breugel27, Michiel van Breugel10, Michiel van Breugel28, Hans van der Wal29, Hans van der Wal30, Maria das Dores Magalhães Veloso, Hans F. M. Vester, Ima Célia Guimarães Vieira31, Tony Vizcarra Bentos17, G. Bruce Williamson32, G. Bruce Williamson17, Lourens Poorter5 
TL;DR: This study estimates the age and spatial extent of lowland second-growth forests in the Latin American tropics and model their potential aboveground carbon accumulation over four decades to guide national-level forest-based carbon mitigation plans.
Abstract: Regrowth of tropical secondary forests following complete or nearly complete removal of forest vegetation actively stores carbon in aboveground biomass, partially counterbalancing carbon emissions from deforestation, forest degradation, burning of fossil fuels, and other anthropogenic sources. We estimate the age and spatial extent of lowland second-growth forests in the Latin American tropics and model their potential aboveground carbon accumulation over four decades. Our model shows that, in 2008, second-growth forests (1 to 60 years old) covered 2.4 million km2 of land (28.1% of the total study area). Over 40 years, these lands can potentially accumulate a total aboveground carbon stock of 8.48 Pg C (petagrams of carbon) in aboveground biomass via low-cost natural regeneration or assisted regeneration, corresponding to a total CO2 sequestration of 31.09 Pg CO2. This total is equivalent to carbon emissions from fossil fuel use and industrial processes in all of Latin America and the Caribbean from 1993 to 2014. Ten countries account for 95% of this carbon storage potential, led by Brazil, Colombia, Mexico, and Venezuela. We model future land-use scenarios to guide national carbon mitigation policies. Permitting natural regeneration on 40% of lowland pastures potentially stores an additional 2.0 Pg C over 40 years. Our study provides information and maps to guide national-level forest-based carbon mitigation plans on the basis of estimated rates of natural regeneration and pasture abandonment. Coupled with avoided deforestation and sustainable forest management, natural regeneration of second-growth forests provides a low-cost mechanism that yields a high carbon sequestration potential with multiple benefits for biodiversity and ecosystem services.
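The abstract's conversion from carbon stock to CO2 sequestration follows from the molar-mass ratio of CO2 to C (44/12); a one-line check, not taken from the paper:

```python
CO2_PER_C = 44.0 / 12.0  # molar-mass ratio of CO2 to elemental carbon

def carbon_to_co2(pg_c: float) -> float:
    """Convert a carbon stock (Pg C) to the equivalent mass of CO2 (Pg CO2)."""
    return pg_c * CO2_PER_C

print(round(carbon_to_co2(8.48), 2))  # 31.09 Pg CO2, as in the abstract
```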

Journal ArticleDOI
TL;DR: In this article, the authors present an overview of the literature on residential demand response systems, load-scheduling techniques, and the latest ICT that supports residential DR applications, and highlight and analyze the challenges with regard to the residential DR of smart grid.
Abstract: Advances in information and communication technologies (ICT) enable a great opportunity to develop the residential demand response that is relevant in smart grid applications. Demand response (DR) aims to manage the required demand to match the available energy resources without adding new generation capacity. Expanding the DR to cover the residential sector in addition to the industrial and commercial sectors gives rise to a wide range of challenges. This study presents an overview of the literature on residential DR systems, load-scheduling techniques, and the latest ICT that supports residential DR applications. Furthermore, challenges are highlighted and analyzed, which are likely to become relevant research topics with regard to the residential DR of smart grid. The literature review shows that most DR schemes suffer from an externality problem that involves the effect of high-level customer consumption on the price rates of other customers, especially during peak period. A recommendation for using adaptive multi-consumption level pricing scheme is presented to overcome this challenge.

Journal ArticleDOI
TL;DR: New immunotherapeutic agents, such as nivolumab and pembrolizumab, for patients with metastatic NSCLC are discussed, based on improved overall survival rates, higher response rates, longer duration of response, and fewer adverse events when compared with docetaxel therapy.
Abstract: These NCCN Guidelines Insights focus on recent updates in the 2016 NCCN Guidelines for Non-Small Cell Lung Cancer (NSCLC; Versions 1-4). These NCCN Guidelines Insights will discuss new immunotherapeutic agents, such as nivolumab and pembrolizumab, for patients with metastatic NSCLC. For the 2016 update, the NCCN panel recommends immune checkpoint inhibitors as preferred agents (in the absence of contraindications) for second-line and beyond (subsequent) therapy in patients with metastatic NSCLC (both squamous and nonsquamous histologies). Nivolumab and pembrolizumab are preferred based on improved overall survival rates, higher response rates, longer duration of response, and fewer adverse events when compared with docetaxel therapy.

Journal ArticleDOI
TL;DR: In this article, the authors examine relationships among context, student engagement, and adjustment, and provide a short overview of the papers in this special issue highlighting their theoretical frameworks, methodologies, and analytical techniques by which many of the challenges outlined in this introduction are addressed.

Journal ArticleDOI
TL;DR: It is concluded that genome scans based on RADseq data alone, while useful for studies of neutral genetic variation and genetic population structure, will likely miss many loci under selection in studies of local adaptation.
Abstract: Understanding how and why populations evolve is of fundamental importance to molecular ecology. Restriction site-associated DNA sequencing (RADseq), a popular reduced representation method, has ushered in a new era of genome-scale research for assessing population structure, hybridization, demographic history, phylogeography and migration. RADseq has also been widely used to conduct genome scans to detect loci involved in adaptive divergence among natural populations. Here, we examine the capacity of those RADseq-based genome scan studies to detect loci involved in local adaptation. To understand what proportion of the genome is missed by RADseq studies, we developed a simple model using different numbers of RAD-tags, genome sizes and extents of linkage disequilibrium (length of haplotype blocks). Under the best-case modelling scenario, we found that RADseq using six- or eight-base pair cutting restriction enzymes would fail to sample many regions of the genome, especially for species with short linkage disequilibrium. We then surveyed recent studies that have used RADseq for genome scans and found that the median density of markers across these studies was 4.08 RAD-tag markers per megabase (one marker per 245 kb). The length of linkage disequilibrium for many species is one to three orders of magnitude less than density of the typical recent RADseq study. Thus, we conclude that genome scans based on RADseq data alone, while useful for studies of neutral genetic variation and genetic population structure, will likely miss many loci under selection in studies of local adaptation.
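The abstract's coverage argument lends itself to a back-of-the-envelope model: the fraction of a genome lying within one linkage-disequilibrium (LD) block of a RAD-tag. This sketch assumes uniformly spaced, non-overlapping markers and uses illustrative numbers, not the paper's simulations.

```python
def genome_fraction_tagged(n_tags: int, ld_bp: float, genome_bp: float) -> float:
    """Fraction of the genome within +/- ld_bp of a marker
    (uniform spacing, non-overlapping windows)."""
    return min(1.0, n_tags * 2 * ld_bp / genome_bp)

# Median study density of ~4.08 tags/Mb -> 4080 tags in a 1 Gb genome;
# assume haplotype blocks (LD) of 10 kb on each side of a tag.
frac = genome_fraction_tagged(4080, ld_bp=10_000, genome_bp=1.0e9)
print(round(frac, 3))  # 0.082 -> ~92% of the genome is never sampled
```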

Journal ArticleDOI
TL;DR: Major discussion topics this year included multigene testing, risk management recommendations for less common genetic mutations, and salpingectomy for ovarian cancer risk reduction.
Abstract: The NCCN Guidelines for Genetic/Familial High-Risk Assessment: Breast and Ovarian provide recommendations for genetic testing and counseling and risk assessment and management for hereditary cancer syndromes. Guidelines focus on syndromes associated with an increased risk of breast and/or ovarian cancer and are intended to assist with clinical and shared decision-making. These NCCN Guidelines Insights summarize major discussion points of the 2015 NCCN Genetic/Familial High-Risk Assessment: Breast and Ovarian panel meeting. Major discussion topics this year included multigene testing, risk management recommendations for less common genetic mutations, and salpingectomy for ovarian cancer risk reduction. The panel also discussed revisions to genetic testing criteria that take into account ovarian cancer histology and personal history of pancreatic cancer.

Proceedings ArticleDOI
TL;DR: In this paper, a 3D sequentially coupled finite element (FE) model was developed to investigate the thermomechanical responses in the selective laser melting (SLM) process, and the model was applied to test different scanning strategies and evaluate their effects on part temperature, stress and deformation.
Abstract: Selective laser melting (SLM) has emerged as one of the primary metal additive manufacturing technologies used for many applications in various industries such as the medical and aerospace sectors. However, defects such as part distortion and delamination resulting from process-induced residual stresses are still one of the key challenges that hinder widespread adoption of SLM. Among process parameters, the laser beam scanning path affects the thermomechanical behavior of the build part, and thus altering the scanning pattern may be a possible strategy to reduce residual stresses and deformations by influencing the distribution of heat intensity input. In this study, a 3D sequentially coupled finite element (FE) model was developed to investigate the thermomechanical responses in the SLM process. The model was applied to test different scanning strategies and evaluate their effects on part temperature, stress, and deformation. The major results are summarized as follows. (1) Among all cases tested, the out-in scanning pattern has the maximum stresses along the X and Y directions, while the 45° inclined line scanning may reduce residual stresses in both directions. (2) Large directional stress differences can be generated by the horizontal line scanning strategy. (3) X and Y directional stress concentrations are shown around the edge of the deposited layers and at the interface between the deposited layers and the substrate for all cases. (4) The 45° inclined line scanning case also has a smaller build-direction deformation than the other cases.

Journal ArticleDOI
23 Feb 2016-PeerJ
TL;DR: Contrary to long held beliefs that the orb web is the crowning achievement of spider evolution, ancestral state reconstructions of web type support a phylogenetically ancient origin of the orbweb, and diversification analyses show that the mostly ground-dwelling, web-less RTA clade diversified faster than orb weavers.
Abstract: Spiders (Order Araneae) are massively abundant generalist arthropod predators that are found in nearly every ecosystem on the planet and have persisted for over 380 million years. Spiders have long served as evolutionary models for studying complex mating and web spinning behaviors, key innovation and adaptive radiation hypotheses, and have been inspiration for important theories like sexual selection by female choice. Unfortunately, past major attempts to reconstruct spider phylogeny typically employing the "usual suspect" genes have been unable to produce a well-supported phylogenetic framework for the entire order. To further resolve spider evolutionary relationships we have assembled a transcriptome-based data set comprising 70 ingroup spider taxa. Using maximum likelihood and shortcut coalescence-based approaches, we analyze eight data sets, the largest of which contains 3,398 gene regions and 696,652 amino acid sites forming the largest phylogenomic analysis of spider relationships produced to date. Contrary to long held beliefs that the orb web is the crowning achievement of spider evolution, ancestral state reconstructions of web type support a phylogenetically ancient origin of the orb web, and diversification analyses show that the mostly ground-dwelling, web-less RTA clade diversified faster than orb weavers. Consistent with molecular dating estimates we report herein, this may reflect a major increase in biomass of non-flying insects during the Cretaceous Terrestrial Revolution 125-90 million years ago favoring diversification of spiders that feed on cursorial rather than flying prey. Our results also have major implications for our understanding of spider systematics. Phylogenomic analyses corroborate several well-accepted high level groupings: Opisthothele, Mygalomorphae, Atypoidina, Avicularoidea, Theraphosoidina, Araneomorphae, Entelegynae, Araneoidea, the RTA clade, Dionycha and the Lycosoidea. 
In contrast, our results challenge the monophyly of Eresoidea, Orbiculariae, and Deinopoidea. The composition of the major paleocribellate and neocribellate clades, the basal divisions of Araneomorphae, appears to be falsified. Traditional Haplogynae is in need of revision, as our findings appear to support the newly conceived concept of Synspermiata. The sister pairing of filistatids with hypochilids implies that some peculiar features of each family may in fact be synapomorphic for the pair. Leptonetids are now seen as a possible sister group to the Entelegynae, illustrating possible intermediates in the evolution of the more complex entelegyne genitalic condition, spinning organs, and respiratory organs.

Journal ArticleDOI
TL;DR: In this article, the as-cast microstructures and 1050°C oxidation behaviors of a series of arc-melted Al_x(NiCoCrFe)_{100-x} HEAs, where x = 8, 10, 12, 15, 20, and 30 (at.

Journal ArticleDOI
TL;DR: This large cohort study found that despite similar causes and severity of ALF among patients referred to specialty centers from 1998 to 2013, the proportion of patients listed for liver transplantation decreased, and survival improved both among those who did not receive a transplant and among those who did.
Abstract: Whether changes have occurred in the causes of acute liver failure (ALF), its management, or the survival of patients with the condition with or without liver transplantation is not known. This lar...

Journal ArticleDOI
M. G. Aartsen1, K. Abraham2, Markus Ackermann, Jenni Adams3  +295 moreInstitutions (47)
TL;DR: New exclusion limits are placed on the parameter space of the 3+1 model, in which muon antineutrinos experience a strong Mikheyev-Smirnov-Wolfenstein-resonant oscillation.
Abstract: The IceCube neutrino telescope at the South Pole has measured the atmospheric muon neutrino spectrum as a function of zenith angle and energy in the approximate 320 GeV to 20 TeV range, to search for the oscillation signatures of light sterile neutrinos. No evidence for anomalous ν_{μ} or ν̄_{μ} disappearance is observed in either of two independently developed analyses, each using one year of atmospheric neutrino data. New exclusion limits are placed on the parameter space of the 3+1 model, in which muon antineutrinos experience a strong Mikheyev-Smirnov-Wolfenstein-resonant oscillation. The exclusion limits extend to sin^{2}2θ_{24}≤0.02 at Δm^{2}∼0.3 eV^{2} at the 90% confidence level. The allowed region from global analysis of appearance experiments, including LSND and MiniBooNE, is excluded at approximately the 99% confidence level for the global best-fit value of |U_{e4}|^{2}.
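The oscillation signature searched for above can be illustrated with the standard two-flavor vacuum survival probability. This is a rough sketch only: the paper's actual 3+1 signal involves matter-enhanced (MSW-resonant) oscillations, not this vacuum formula, and the baseline and energy values below are hypothetical choices near the quoted exclusion limit.

```python
import math

def survival_probability(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Two-flavor vacuum survival probability P(nu_mu -> nu_mu).

    The factor 1.27 converts eV^2 * km / GeV into the dimensionless
    oscillation phase Delta m^2 L / (4E) in natural units.
    """
    phase = 1.27 * dm2_ev2 * L_km / E_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Hypothetical values near the paper's limit: sin^2(2*theta_24) = 0.02,
# Delta m^2 = 0.3 eV^2, an Earth-diameter baseline, a 1 TeV neutrino.
p = survival_probability(0.02, 0.3, 12_700.0, 1_000.0)
```

For these illustrative values the disappearance effect is at most the 2% set by sin²2θ, which is why percent-level sterile mixings require large statistics to constrain.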

Proceedings ArticleDOI
TL;DR: In this article, a viewport-adaptive 360-degree video streaming system is proposed to reduce the bandwidth waste, while still providing an immersive experience, by preparing multiple video representations, which differ not only by their bit-rate, but also by the qualities of different scene regions.
Abstract: The delivery and display of 360-degree videos on Head-Mounted Displays (HMDs) presents many technical challenges. 360-degree videos are ultra-high-resolution spherical videos that contain an omnidirectional view of the scene, yet only a portion of this scene is displayed on the HMD. Moreover, HMDs need to respond to head movements within 10 ms, which prevents the server from sending only the displayed video portion based on client feedback. To reduce bandwidth waste while still providing an immersive experience, a viewport-adaptive 360-degree video streaming system is proposed. The server prepares multiple video representations, which differ not only in their bit-rate but also in the qualities of different scene regions. The client chooses a representation for the next segment such that its bit-rate fits the available throughput and a full-quality region matches its viewing direction. We investigate the impact of various spherical-to-plane projections and quality arrangements on the video quality displayed to the user, showing that the cube map layout offers the best quality for the given bit-rate budget. An evaluation with a dataset of users navigating 360-degree videos demonstrates that segments need to be short enough to enable frequent view switches.
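The client-side selection rule described above can be sketched minimally. Everything below is an assumption of this sketch, not the authors' implementation: the `Representation` fields, the idea of a single full-quality region center per representation, and the great-circle distance tie-break.

```python
import math
from dataclasses import dataclass

@dataclass
class Representation:
    bitrate_kbps: float    # total bit-rate of this encoding
    qec_yaw_deg: float     # center of the full-quality region (hypothetical field)
    qec_pitch_deg: float

def angular_distance(yaw1, pitch1, yaw2, pitch2):
    """Great-circle distance in degrees between two viewing directions."""
    y1, p1, y2, p2 = map(math.radians, (yaw1, pitch1, yaw2, pitch2))
    cos_d = (math.sin(p1) * math.sin(p2)
             + math.cos(p1) * math.cos(p2) * math.cos(y1 - y2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_d))))

def choose_representation(reps, throughput_kbps, head_yaw, head_pitch):
    """Among representations whose bit-rate fits the throughput, pick the one
    whose full-quality region is closest to the predicted viewing direction."""
    feasible = [r for r in reps if r.bitrate_kbps <= throughput_kbps]
    if not feasible:                 # nothing fits: fall back to the lowest bit-rate
        feasible = [min(reps, key=lambda r: r.bitrate_kbps)]
    return min(feasible, key=lambda r: angular_distance(
        r.qec_yaw_deg, r.qec_pitch_deg, head_yaw, head_pitch))
```

The two-stage filter (throughput first, viewport second) mirrors the abstract's statement that the chosen segment must both fit the available throughput and place its full-quality region over the viewer's direction.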

Journal ArticleDOI
TL;DR: Strong support is found for the internal validity of the sluggish cognitive tempo (SCT) construct, and preliminary support for its external validity, although evidence remains insufficient to describe SCT in diagnostic terms.
Abstract: Objective: To conduct the first meta-analysis evaluating the internal and external validity of the sluggish cognitive tempo (SCT) construct as related to or distinct from attention-deficit/hyperactivity disorder (ADHD) and as associated with functional impairment and neuropsychological functioning. Method: Electronic databases were searched through September 2015 for studies examining the factor structure and/or correlates of SCT in children or adults. The search procedures identified 73 papers. The core SCT behaviors included across studies, as well as factor loadings and reliability estimates, were reviewed to evaluate internal validity. Pooled correlation effect sizes using random effects models were used to evaluate SCT in relation to external validity domains (i.e., demographics, other psychopathologies, functional impairment, and neuropsychological functioning). Results: Strong support was found for the internal validity of the SCT construct. Specifically, across factor analytic studies including more than 19,000 individuals, 13 SCT items loaded consistently on an SCT factor as opposed to an ADHD factor. Findings also support the reliability (i.e., internal consistency, test–retest reliability, interrater reliability) of SCT. In terms of external validity, there is some indication that SCT may increase with age (r = 0.11) and be associated with lower socioeconomic status (r = 0.10). Modest (potentially negligible) support was found for SCT symptoms being higher in males than females in children (r = 0.05) but not in adults. SCT is more strongly associated with ADHD inattention (r = 0.63 in children, r = 0.72 in adults) than with ADHD hyperactivity-impulsivity (r = 0.32 in children, r = 0.46 in adults), and it likewise appears that SCT is more strongly associated with internalizing symptoms than with externalizing symptoms. SCT is associated with significant global, social, and academic impairment (r = 0.38–0.44).
Effects for neuropsychological functioning are mixed, although there is initial support for SCT being associated with processing speed, sustained attention, and metacognitive deficits. Conclusion: This meta-analytic review provides strong support for the internal validity of SCT and preliminary support for the external validity of SCT. In terms of diagnostic validity, there is currently not enough evidence to describe SCT in diagnostic terms. Key directions for future research are discussed, including evaluating the conceptualization of SCT as a transdiagnostic construct and the need for longitudinal research.
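The "pooled correlation effect sizes using random effects models" mentioned in the Method can be illustrated generically. The sketch below is a textbook Fisher-z / DerSimonian-Laird pooling routine, not the authors' analysis code, and all variable names are my own.

```python
import math

def pooled_correlation_random_effects(correlations, ns):
    """Pool Pearson correlations with a DerSimonian-Laird random-effects model.

    Each r is Fisher z-transformed (sampling variance 1/(n-3)); the
    between-study variance tau^2 is estimated from Cochran's Q; the
    weighted mean z is then back-transformed to a correlation.
    """
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in correlations]
    vs = [1.0 / (n - 3) for n in ns]
    w = [1.0 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    df = len(zs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if df > 0 else 0.0  # DL estimate, truncated at 0
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return (math.exp(2 * z_re) - 1) / (math.exp(2 * z_re) + 1)  # inverse Fisher z
```

The random-effects weighting shrinks the influence of very large studies when between-study heterogeneity (tau²) is nonzero, which is the usual rationale for this model in meta-analyses like the one above.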

Journal ArticleDOI
TL;DR: In this paper, the authors developed a multi-scale modeling methodology for fast prediction of part distortion by integrating a micro-scale laser scan model, a meso-scale layer hatch model, and a macro-scale part model.

Journal ArticleDOI
TL;DR: In this article, the wavelet-based image segmentation and evaluation (WISE) method was applied to 11 images obtained from multi-epoch Very Long Baseline Array (VLBA) observations made in January-August 2007 at 43 GHz (λ = 7 mm).
Abstract: Context. Very long baseline interferometry (VLBI) imaging of radio emission from extragalactic jets provides a unique probe of physical mechanisms governing the launching, acceleration, and collimation of relativistic outflows. Aims. VLBI imaging of the jet in the nearby active galaxy M 87 enables morphological and kinematic studies to be done on linear scales down to ~100 Schwarzschild radii (R_s). Methods. The two-dimensional structure and kinematics of the jet in M 87 (NGC 4486) have been studied by applying the wavelet-based image segmentation and evaluation (WISE) method to 11 images obtained from multi-epoch Very Long Baseline Array (VLBA) observations made in January-August 2007 at 43 GHz (λ = 7 mm). Results. The WISE analysis recovers a detailed two-dimensional velocity field in the jet in M 87 at sub-parsec scales. The observed evolution of the flow velocity with distance from the jet base can be explained in the framework of MHD jet acceleration and Poynting flux conversion. A linear acceleration regime is observed up to z_obs ~ 2 mas. The acceleration is reduced at larger scales, which is consistent with saturation of Poynting flux conversion. Stacked cross correlation analysis of the images reveals a pronounced stratification of the flow. The flow consists of a slow, mildly relativistic layer (moving at β ~ 0.5c), associated either with instability pattern speed or an outer wind, and a fast, accelerating stream line (with β ~ 0.92, corresponding to a bulk Lorentz factor γ ~ 2.5). A systematic difference of the apparent speeds in the northern and southern limbs of the jet is detected, providing evidence for jet rotation. The angular velocity of the magnetic field line associated with this rotation suggests that the jet in M 87 is launched in the inner part of the disk, at a distance r_0 ~ 5 R_s from the central engine. Conclusions.
The combined results of the analysis imply that MHD acceleration and conversion of Poynting flux to kinetic energy play the dominant roles in collimation and acceleration of the flow in M 87.
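The quoted kinematic quantities follow from standard special-relativistic formulas; for instance, the reported β ~ 0.92 for the fast stream line indeed corresponds to the quoted bulk Lorentz factor γ ~ 2.5. A small sketch (the viewing-angle value in the superluminal example is hypothetical, not from the paper):

```python
import math

def lorentz_factor(beta):
    """Bulk Lorentz factor gamma = 1 / sqrt(1 - beta^2), beta in units of c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

def apparent_speed(beta, theta_deg):
    """Apparent transverse speed beta_app = beta sin(theta) / (1 - beta cos(theta));
    it can exceed 1 (superluminal motion) for fast jets at small viewing angles."""
    th = math.radians(theta_deg)
    return beta * math.sin(th) / (1.0 - beta * math.cos(th))

# The fast stream line reported above: beta ~ 0.92 gives gamma ~ 2.55,
# consistent with the quoted gamma ~ 2.5.
gamma_fast = lorentz_factor(0.92)
```

The apparent-speed formula is why VLBI proper motions ("apparent speeds" in the abstract) must be de-projected before being interpreted as intrinsic flow velocities.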

Journal ArticleDOI
TL;DR: This paper presents how a learning system should be designed to learn the energy consumption model of HVACs, how to integrate the learning mechanism with optimization techniques to generate optimal demand response policies, and how a data structure should be designed to store and capture current home appliance behaviors properly.
Abstract: This paper focuses on developing an interdisciplinary mechanism that combines machine learning, optimization, and data structure design to build a demand response and home energy management system that can meet the needs of real-life conditions. The loads of major home appliances are divided into three categories: 1) fixed loads; 2) regulatable loads; and 3) deferrable loads, based on which a decoupled demand response mechanism is proposed for optimal energy management of the three categories of loads. A learning-based demand response strategy is developed for regulatable loads, with a special focus on home heating, ventilation, and air conditioning (HVAC) systems. This paper presents how a learning system should be designed to learn the energy consumption model of HVACs, how to integrate the learning mechanism with optimization techniques to generate optimal demand response policies, and how a data structure should be designed to store and capture current home appliance behaviors properly. This paper investigates how the integrative and learning-based home energy management system behaves in a demand response framework. Case studies are conducted through an integrative simulation approach that combines a home energy simulator and MATLAB for demand response evaluation.
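One way to picture the learn-then-optimize loop described above is a deliberately toy version: the first-order thermal model, the 3 kW power draw, and the exhaustive on/off search below are all assumptions of this sketch, not the paper's system.

```python
from itertools import product

def learn_model(temps, controls):
    """Estimate a first-order model T[t+1] = T[t] + drift + gain*u[t] from
    logged (temperature, on/off) data: drift is the mean temperature change
    with the HVAC off, gain the extra change when it is on."""
    off = [temps[t + 1] - temps[t] for t in range(len(controls)) if controls[t] == 0]
    on = [temps[t + 1] - temps[t] for t in range(len(controls)) if controls[t] == 1]
    drift = sum(off) / len(off)
    gain = sum(on) / len(on) - drift
    return drift, gain

def best_schedule(t0, drift, gain, prices, t_min, t_max, power_kw=3.0):
    """Exhaustively pick the cheapest on/off schedule keeping the predicted
    temperature inside [t_min, t_max] at every step (fine for short horizons)."""
    best, best_cost = None, float("inf")
    for u in product((0, 1), repeat=len(prices)):
        temp, ok, cost = t0, True, 0.0
        for step, price in zip(u, prices):
            temp += drift + gain * step
            cost += power_kw * step * price
            if not (t_min <= temp <= t_max):
                ok = False
                break
        if ok and cost < best_cost:
            best, best_cost = u, cost
    return best, best_cost
```

The split mirrors the paper's architecture at a toy scale: a learned consumption/thermal model feeds an optimizer that outputs a demand response policy for the regulatable HVAC load.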

Journal ArticleDOI
TL;DR: Issues related to domestic HIV care in the US are examined, targeted interventions that address weak points in the continuum are discussed, domestic and international policies that help shape and direct HIV care strategies are reviewed, and the review concludes with recommendations and future directions for HIV providers and policymakers.
Abstract: The HIV care continuum is a framework that models the dynamic stages of HIV care. The continuum consists of five main steps, which, at the population level, are depicted cross-sectionally as the HIV treatment cascade. These steps include diagnosis, linkage to care (LTC), retention in care (RiC), adherence to antiretroviral therapy (ART), and viral suppression. Although the HIV treatment cascade is represented as a linear, unidirectional framework, persons living with HIV (PLWH) often experience the care continuum in a less streamlined fashion, skip steps altogether, or even exit the continuum for a period of time and regress to an earlier stage. The proportion of PLWH decreases at each successive step of the cascade, beginning with an estimated 86% who are diagnosed and dropping dramatically to approximately 30% of PLWH who are virally suppressed in the United States (US). In this current issues review, we describe each step in the cascade, discuss targeted interventions that address weak points in the continuum, review domestic and international policies that help shape and direct HIV care strategies, and conclude with recommendations and future directions for HIV providers and policymakers. While we primarily examine issues related to domestic HIV care in the US, we also discuss international applications of the continuum in order to provide broader context.

Journal ArticleDOI
TL;DR: The problem of finding a fixed point of a nonexpansive operator (i.e., $x^*=Tx^*$) abstracts many problems in numerical linear algebra, optimization, and other areas of data science, as discussed by the authors.
Abstract: Finding a fixed point to a nonexpansive operator, i.e., $x^*=Tx^*$, abstracts many problems in numerical linear algebra, optimization, and other areas of data science. To solve fixed-point problems...
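A common concrete route to solving $x^* = Tx^*$ for a nonexpansive $T$ is the Krasnosel'skii-Mann averaged iteration. The sketch below is a generic textbook instance, not the paper's method: it applies the iteration to a composition of two line projections, whose fixed point is the lines' intersection.

```python
def km_iterate(T, x0, lam=0.5, tol=1e-10, max_iter=10_000):
    """Krasnosel'skii-Mann iteration x_{k+1} = (1 - lam) x_k + lam T(x_k).

    For lam in (0, 1) it converges to a fixed point of any nonexpansive
    operator T that has one (plain iteration of T need not converge)."""
    x = x0
    for _ in range(max_iter):
        tx = T(x)
        x_next = tuple((1 - lam) * xi + lam * ti for xi, ti in zip(x, tx))
        if max(abs(a - b) for a, b in zip(x, x_next)) < tol:
            return x_next
        x = x_next
    return x

# Example operator: project onto the line y = 0, then onto the line y = x.
# A composition of projections is nonexpansive; its fixed point here is the
# intersection of the two lines, (0, 0).
def T(p):
    x, _ = p                    # projection onto y = 0 keeps (x, 0)
    return (x / 2, x / 2)       # projection of (x, 0) onto y = x

sol = km_iterate(T, (3.0, 4.0))
```

The averaging step with lam is what distinguishes this scheme from naive repeated application of T, and it is the basic building block behind many of the operator-splitting methods the abstract alludes to.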

Journal ArticleDOI
Benjamin W. Abbott1, Jeremy B. Jones1, Edward A. G. Schuur2, F. Stuart Chapin1, William B. Bowden3, M. Syndonia Bret-Harte1, Howard E. Epstein4, Mike D. Flannigan5, Tamara K. Harms1, Teresa N. Hollingsworth6, Michelle C. Mack2, A. David McGuire7, Susan M. Natali8, Adrian V. Rocha9, Suzanne E. Tank5, Merritt R. Turetsky10, Jorien E. Vonk11, Kimberly P. Wickland7, George R. Aiken7, Heather D. Alexander12, Rainer M. W. Amon13, Brian W. Benscoter14, Yves Bergeron15, Kevin Bishop16, Olivier Blarquez17, Ben Bond-Lamberty18, Amy L. Breen1, Ishi Buffam19, Yihua Cai20, Christopher Carcaillet21, Sean K. Carey22, Jing M. Chen23, Han Y. H. Chen24, Torben R. Christensen25, Lee W. Cooper26, J. Hans C. Cornelissen11, William J. de Groot27, Thomas H. DeLuca28, Ellen Dorrepaal29, Ned Fetcher30, Jacques C. Finlay31, Bruce C. Forbes, Nancy H. F. French32, Sylvie Gauthier27, Martin P. Girardin27, Scott J. Goetz8, Johann G. Goldammer33, Laura Gough34, Paul Grogan35, Laodong Guo36, Philip E. Higuera37, Larry D. Hinzman1, Feng Sheng Hu38, Gustaf Hugelius39, Elchin Jafarov40, Randi Jandt1, Jill F. Johnstone41, Jan Karlsson29, Eric S. Kasischke, Gerhard Kattner42, Ryan C. Kelly, Frida Keuper43, George W. Kling44, Pirkko Kortelainen45, Jari Kouki46, Peter Kuhry39, Hjalmar Laudon16, Isabelle Laurion15, Robie W. Macdonald47, Paul J. Mann48, Pertti J. Martikainen46, James W. McClelland49, Ulf Molau50, Steven F. Oberbauer14, David Olefeldt5, David Paré27, Marc-André Parisien27, Serge Payette51, Changhui Peng52, Oleg S. Pokrovsky53, Edward B. Rastetter54, Peter A. Raymond55, Martha K. Raynolds1, Guillermo Rein56, James F. Reynolds57, Martin D. Robards, Brendan M. Rogers8, Christina Schaedel2, Kevin Schaefer40, Inger Kappel Schmidt58, Anatoly Shvidenko, Jasper Sky, Robert G. M. Spencer14, Gregory Starr59, Robert G. Striegl7, Roman Teisserenc60, Lars J. Tranvik61, Tarmo Virtanen, Jeffrey M. Welker62, Sergei Zimov63 
University of Alaska Fairbanks1, Northern Arizona University2, University of Vermont3, University of Virginia4, University of Alberta5, United States Department of Agriculture6, United States Geological Survey7, Woods Hole Oceanographic Institution8, University of Notre Dame9, University of Guelph10, VU University Amsterdam11, Mississippi State University12, University of North Texas13, Florida State University14, Université du Québec15, Swedish University of Agricultural Sciences16, McGill University17, United States Department of Energy18, University of Cincinnati19, Xiamen University20, École Normale Supérieure21, McMaster University22, University of Toronto23, Lakehead University24, Aarhus University25, University of Maryland Center for Environmental Science26, Natural Resources Canada27, University of Washington28, Umeå University29, Wilkes University30, University of Minnesota31, Michigan Technological University32, Max Planck Society33, University System of Maryland34, Queen's University35, University of Wisconsin–Milwaukee36, University of Montana System37, University of Illinois at Chicago38, Stockholm University39, University of Colorado Boulder40, University of Saskatchewan41, Alfred Wegener Institute for Polar and Marine Research42, Institut national de la recherche agronomique43, University of Michigan44, Finnish Environment Institute45, University of Eastern Finland46, Fisheries and Oceans Canada47, Northumbria University48, University of Texas at Austin49, University of Gothenburg50, Laval University51, Northwest A&F University52, Tomsk State University53, Marine Biological Laboratory54, Yale University55, Imperial College London56, Duke University57, University of Copenhagen58, University of Alabama59, Centre national de la recherche scientifique60, Uppsala University61, University of Alaska Anchorage62, Russian Academy of Sciences63
TL;DR: As the permafrost region warms, its large organic carbon pool will be increasingly vulnerable to decomposition, combustion, and hydrologic export as mentioned in this paper, and models predict that some portion of this release w...
Abstract: As the permafrost region warms, its large organic carbon pool will be increasingly vulnerable to decomposition, combustion, and hydrologic export. Models predict that some portion of this release w ...