
Showing papers by "University of Graz", published in 2018


Journal ArticleDOI
Lorenzo Galluzzi1, Lorenzo Galluzzi2, Ilio Vitale3, Stuart A. Aaronson4, +183 more · Institutions (111)
TL;DR: The Nomenclature Committee on Cell Death (NCCD) has formulated guidelines for the definition and interpretation of cell death from morphological, biochemical, and functional perspectives.
Abstract: Over the past decade, the Nomenclature Committee on Cell Death (NCCD) has formulated guidelines for the definition and interpretation of cell death from morphological, biochemical, and functional perspectives. Since the field continues to expand and novel mechanisms that orchestrate multiple cell death pathways are unveiled, we propose an updated classification of cell death subroutines focusing on mechanistic and essential (as opposed to correlative and dispensable) aspects of the process. As we provide molecularly oriented definitions of terms including intrinsic apoptosis, extrinsic apoptosis, mitochondrial permeability transition (MPT)-driven necrosis, necroptosis, ferroptosis, pyroptosis, parthanatos, entotic cell death, NETotic cell death, lysosome-dependent cell death, autophagy-dependent cell death, immunogenic cell death, cellular senescence, and mitotic catastrophe, we discuss the utility of neologisms that refer to highly specialized instances of these processes. The mission of the NCCD is to provide a widely accepted nomenclature on cell death in support of the continued development of the field.

3,301 citations


Journal ArticleDOI
TL;DR: In this paper, a variational network approach for the reconstruction of accelerated multi-coil MR data is proposed and evaluated on a clinical knee imaging protocol for different acceleration factors and sampling patterns, using retrospectively and prospectively undersampled data.
Abstract: PURPOSE To allow fast and high-quality reconstruction of clinical accelerated multi-coil MR data by learning a variational network that combines the mathematical structure of variational models with deep learning. THEORY AND METHODS Generalized compressed sensing reconstruction formulated as a variational model is embedded in an unrolled gradient descent scheme. All parameters of this formulation, including the prior model defined by filter kernels and activation functions as well as the data term weights, are learned during an offline training procedure. The learned model can then be applied online to previously unseen data. RESULTS The variational network approach is evaluated on a clinical knee imaging protocol for different acceleration factors and sampling patterns using retrospectively and prospectively undersampled data. The variational network reconstructions outperform standard reconstruction algorithms, verified by quantitative error measures and a clinical reader study for regular sampling and acceleration factor 4. CONCLUSION Variational network reconstructions preserve the natural appearance of MR images as well as pathologies that were not included in the training data set. Due to its high computational performance, that is, reconstruction time of 193 ms on a single graphics card, and the omission of parameter tuning once the network is trained, this new approach to image reconstruction can easily be integrated into clinical workflow. Magn Reson Med 79:3055-3071, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
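The unrolled gradient descent scheme described in the abstract can be sketched in a toy single-coil setting. Note the simplifications: the paper learns the prior (filter kernels and activation functions) and the data-term weights offline, whereas this illustration substitutes a fixed hand-crafted soft-threshold prior and fixed step sizes, all of which are assumptions.

```python
import numpy as np

def unrolled_gradient_descent(y, mask, n_steps=8, lam=1.0, step=0.1):
    """Toy unrolled gradient scheme for undersampled single-coil MRI.

    y    : undersampled k-space measurements (2D complex array)
    mask : binary sampling mask, same shape as y
    In the paper's variational network, the prior gradient and the
    data-term weight lam are learned; a crude fixed prior stands in here.
    """
    x = np.fft.ifft2(y)                      # zero-filled initial guess
    for _ in range(n_steps):
        # data-consistency gradient: A^H (A x - y) with A = mask * FFT
        residual = mask * np.fft.fft2(x) - y
        grad_data = np.fft.ifft2(mask * residual)
        # stand-in prior gradient: pull small-magnitude values toward zero
        grad_prior = np.where(np.abs(x) < 0.01, x, 0)
        x = x - step * (lam * grad_data + grad_prior)
    return x

# usage: reconstruct a toy 32x32 image from 50% sampled k-space
rng = np.random.default_rng(0)
img = rng.random((32, 32))
mask = rng.random((32, 32)) < 0.5
y = mask * np.fft.fft2(img)
recon = unrolled_gradient_descent(y, mask)
```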

1,111 citations


Journal ArticleDOI
TL;DR: The task force recommends an early imaging test in patients with suspected LVV, with ultrasound and MRI being the first choices in GCA and TAK, respectively; these are the first EULAR recommendations providing up-to-date guidance for the role of imaging in the diagnosis and monitoring of patients with (suspected) LVV.
Abstract: To develop evidence-based recommendations for the use of imaging modalities in primary large vessel vasculitis (LVV) including giant cell arteritis (GCA) and Takayasu arteritis (TAK). European League Against Rheumatism (EULAR) standardised operating procedures were followed. A systematic literature review was conducted to retrieve data on the role of imaging modalities including ultrasound, MRI, CT and [18F]-fluorodeoxyglucose positron emission tomography (PET) in LVV. Based on evidence and expert opinion, the task force consisting of 20 physicians, healthcare professionals and patients from 10 EULAR countries developed recommendations, with consensus obtained through voting. The final level of agreement was voted anonymously. A total of 12 recommendations have been formulated. The task force recommends an early imaging test in patients with suspected LVV, with ultrasound and MRI being the first choices in GCA and TAK, respectively. CT or PET may be used alternatively. In case the diagnosis is still in question after clinical examination and imaging, additional investigations including temporal artery biopsy and/or additional imaging are required. In patients with a suspected flare, imaging might help to better assess disease activity. The frequency and choice of imaging modalities for long-term monitoring of structural damage remains an individual decision; close monitoring for aortic aneurysms should be conducted in patients at risk for this complication. All imaging should be performed by a trained specialist using appropriate operational procedures and settings. These are the first EULAR recommendations providing up-to-date guidance for the role of imaging in the diagnosis and monitoring of patients with (suspected) LVV.

669 citations


Journal ArticleDOI
TL;DR: While gingival health and gingivitis have many clinical features, case definitions are primarily predicated on the presence or absence of bleeding on probing, which creates differences in the way in which a "case" of gingival health or gingivitis is defined for clinical practice as opposed to epidemiologically in population prevalence surveys.
Abstract: Periodontal health is defined by absence of clinically detectable inflammation. There is a biological level of immune surveillance that is consistent with clinical gingival health and homeostasis. Clinical gingival health may be found in a periodontium that is intact, i.e. without clinical attachment loss or bone loss, and on a reduced periodontium in either a non-periodontitis patient (e.g. in patients with some form of gingival recession or following crown lengthening surgery) or in a patient with a history of periodontitis who is currently periodontally stable. Clinical gingival health can be restored following treatment of gingivitis and periodontitis. However, the treated and stable periodontitis patient with current gingival health remains at increased risk of recurrent periodontitis, and accordingly, must be closely monitored. Two broad categories of gingival diseases include non-dental plaque biofilm-induced gingival diseases and dental plaque-induced gingivitis. Non-dental plaque biofilm-induced gingival diseases include a variety of conditions that are not caused by plaque and usually do not resolve following plaque removal. Such lesions may be manifestations of a systemic condition or may be localized to the oral cavity. Dental plaque-induced gingivitis has a variety of clinical signs and symptoms, and both local predisposing factors and systemic modifying factors can affect its extent, severity, and progression. Dental plaque-induced gingivitis may arise on an intact periodontium or on a reduced periodontium in either a non-periodontitis patient or in a currently stable "periodontitis patient" i.e. successfully treated, in whom clinical inflammation has been eliminated (or substantially reduced). A periodontitis patient with gingival inflammation remains a periodontitis patient (Figure 1), and comprehensive risk assessment and management are imperative to ensure early prevention and/or treatment of recurrent/progressive periodontitis. 
Precision dental medicine defines a patient-centered approach to care, and therefore, creates differences in the way in which a "case" of gingival health or gingivitis is defined for clinical practice as opposed to epidemiologically in population prevalence surveys. Thus, case definitions of gingival health and gingivitis are presented for both purposes. While gingival health and gingivitis have many clinical features, case definitions are primarily predicated on presence or absence of bleeding on probing. Here we classify gingival health and gingival diseases/conditions, along with a summary table of diagnostic features for defining health and gingivitis in various clinical situations.

573 citations


Journal ArticleDOI
26 Jan 2018-Science
TL;DR: Evidence that increased intake of the polyamine spermidine appears to reproduce many of the healthful effects of caloric restriction is reviewed, and its cellular actions, which include enhancement of autophagy and protein deacetylation, are explained.
Abstract: Interventions that delay aging and protect from age-associated disease are slowly approaching clinical implementation. Such interventions include caloric restriction mimetics, which are defined as agents that mimic the beneficial effects of dietary restriction while limiting its detrimental effects. One such agent, the natural polyamine spermidine, has prominent cardioprotective and neuroprotective effects and stimulates anticancer immunosurveillance in rodent models. Moreover, dietary polyamine uptake correlates with reduced cardiovascular and cancer-related mortality in human epidemiological studies. Spermidine preserves mitochondrial function, exhibits anti-inflammatory properties, and prevents stem cell senescence. Mechanistically, it shares the molecular pathways engaged by other caloric restriction mimetics: It induces protein deacetylation and depends on functional autophagy. Because spermidine is already present in daily human nutrition, clinical trials aiming at increasing the uptake of this polyamine appear feasible.

542 citations


Journal ArticleDOI
24 Dec 2018
TL;DR: The authors conducted preregistered replications of 28 classic and contemporary published findings, with protocols that were peer reviewed in advance, to examine variation in effect magnitudes across samples and settings, and found that very little heterogeneity was attributable to the order in which the tasks were performed or whether the tasks were administered in the lab versus online.
Abstract: We conducted preregistered replications of 28 classic and contemporary published findings, with protocols that were peer reviewed in advance, to examine variation in effect magnitudes across samples and settings. Each protocol was administered to approximately half of 125 samples that comprised 15,305 participants from 36 countries and territories. Using the conventional criterion of statistical significance (p < .05), we found that 15 (54%) of the replications provided evidence of a statistically significant effect in the same direction as the original finding. With a strict significance criterion (p < .0001), 14 (50%) of the replications still provided such evidence, a reflection of the extremely high-powered design. Seven (25%) of the replications yielded effect sizes larger than the original ones, and 21 (75%) yielded effect sizes smaller than the original ones. The median comparable Cohen’s ds were 0.60 for the original findings and 0.15 for the replications. The effect sizes were small (< 0.20) in 16 of the replications (57%), and 9 effects (32%) were in the direction opposite the direction of the original effect. Across settings, the Q statistic indicated significant heterogeneity in 11 (39%) of the replication effects, and most of those were among the findings with the largest overall effect sizes; only 1 effect that was near zero in the aggregate showed significant heterogeneity according to this measure. Only 1 effect had a tau value greater than .20, an indication of moderate heterogeneity. Eight others had tau values near or slightly above .10, an indication of slight heterogeneity. Moderation tests indicated that very little heterogeneity was attributable to the order in which the tasks were performed or whether the tasks were administered in lab versus online. 
Exploratory comparisons revealed little heterogeneity between Western, educated, industrialized, rich, and democratic (WEIRD) cultures and less WEIRD cultures (i.e., cultures with relatively high and low WEIRDness scores, respectively). Cumulatively, variability in the observed effect sizes was attributable more to the effect being studied than to the sample or setting in which it was studied.
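The heterogeneity measures reported above (Cochran's Q and tau) can be computed with a standard DerSimonian-Laird estimator. This is a generic meta-analytic sketch, not the paper's actual analysis pipeline, and the helper name is hypothetical:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Cochran's Q and the DerSimonian-Laird tau across samples.

    effects   : per-sample effect sizes (e.g., Cohen's d)
    variances : their sampling variances
    """
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)      # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)     # fixed-effect mean
    q = np.sum(w * (effects - mu_fe) ** 2)      # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-sample variance
    return q, np.sqrt(tau2)
```

With identical effects across samples, Q and tau are zero; heterogeneous effects yield tau > 0, the pattern the moderation analyses probed.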

495 citations


Journal ArticleDOI
TL;DR: A whole-brain network associated with high-creative ability comprised of cortical hubs within default, salience, and executive systems—intrinsic functional networks that tend to work in opposition is identified, suggesting that highly creative people are characterized by the ability to simultaneously engage these large-scale brain networks.
Abstract: People’s ability to think creatively is a primary means of technological and cultural progress, yet the neural architecture of the highly creative brain remains largely undefined. Here, we employed a recently developed method in functional brain imaging analysis—connectome-based predictive modeling—to identify a brain network associated with high-creative ability, using functional magnetic resonance imaging (fMRI) data acquired from 163 participants engaged in a classic divergent thinking task. At the behavioral level, we found a strong correlation between creative thinking ability and self-reported creative behavior and accomplishment in the arts and sciences (r = 0.54). At the neural level, we found a pattern of functional brain connectivity related to high-creative thinking ability consisting of frontal and parietal regions within default, salience, and executive brain systems. In a leave-one-out cross-validation analysis, we show that this neural model can reliably predict the creative quality of ideas generated by novel participants within the sample. Furthermore, in a series of external validation analyses using data from two independent task fMRI samples and a large task-free resting-state fMRI sample, we demonstrate robust prediction of individual creative thinking ability from the same pattern of brain connectivity. The findings thus reveal a whole-brain network associated with high-creative ability comprised of cortical hubs within default, salience, and executive systems—intrinsic functional networks that tend to work in opposition—suggesting that highly creative people are characterized by the ability to simultaneously engage these large-scale brain networks.
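The leave-one-out cross-validation logic behind connectome-based predictive modeling can be sketched as follows. This is a deliberate simplification: real CPM first selects edges correlated with behavior in the training fold; here a single summed-connectivity feature and all variable names are illustrative assumptions.

```python
import numpy as np

def loo_predict(features, scores):
    """Leave-one-out prediction: for each held-out participant, fit a
    linear model on everyone else's summed connectivity strength and
    predict the held-out (e.g., creativity) score."""
    n = len(scores)
    preds = np.empty(n)
    strength = features.sum(axis=1)          # summed connectivity per person
    for i in range(n):
        keep = np.arange(n) != i             # training fold: all but i
        slope, intercept = np.polyfit(strength[keep], scores[keep], 1)
        preds[i] = slope * strength[i] + intercept
    return preds
```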

490 citations


Journal ArticleDOI
TL;DR: The review introduces a systematic classification of the cascades according to the number of enzymes in the linear sequence and differentiates between cascades involving exclusively enzymes and combinations of enzymes with non-natural catalysts or chemical steps.
Abstract: The review compiles artificial cascades involving enzymes with a focus on the last 10 years. A cascade is defined as the combination of at least two reaction steps in a single reaction vessel without isolation of the intermediates, whereby at least one step is catalyzed by an enzyme. Additionally, cascades performed in vivo and in vitro are discussed separately, whereby in vivo cascades are defined here as cascades relying on cofactor recycling by the metabolism or on a metabolite from the living organism. The review introduces a systematic classification of the cascades according to the number of enzymes in the linear sequence and differentiates between cascades involving exclusively enzymes and combinations of enzymes with non-natural catalysts or chemical steps. Since the number of examples involving two enzymes is predominant, the two enzyme cascades are further subdivided according to the number, order, and type of redox steps. Furthermore, this classification differentiates between cascades where al...

420 citations


Journal ArticleDOI
TL;DR: This group of MS researchers and clinicians with varied expertise took stock of the current state of the field, and identified several important practical and theoretical challenges, including key knowledge gaps and methodologic limitations related to understanding and measurement of cognitive deficits, and development of effective treatments.
Abstract: Cognitive decline is recognized as a prevalent and debilitating symptom of multiple sclerosis (MS), especially deficits in episodic memory and processing speed. The field aims to (1) incorporate cognitive assessment into standard clinical care and clinical trials, (2) utilize state-of-the-art neuroimaging to more thoroughly understand neural bases of cognitive deficits, and (3) develop effective, evidence-based, clinically feasible interventions to prevent or treat cognitive dysfunction, which are lacking. There are obstacles to these goals. Our group of MS researchers and clinicians with varied expertise took stock of the current state of the field, and we identify several important practical and theoretical challenges, including key knowledge gaps and methodologic limitations related to (1) understanding and measurement of cognitive deficits, (2) neuroimaging of neural bases and correlates of deficits, and (3) development of effective treatments. This is not a comprehensive review of the extensive literature, but instead a statement of guidelines and priorities for the field. For instance, we provide recommendations for improving the scientific basis and methodologic rigor for cognitive rehabilitation research. Toward this end, we call for multidisciplinary collaborations toward development of biologically based theoretical models of cognition capable of empirical validation and evidence-based refinement, providing the scientific context for effective treatment discovery.

389 citations


Journal ArticleDOI
TL;DR: The understanding of the complex relationship between recovery and performance has significantly increased through research, and some important issues for future investigations are also elaborated.
Abstract: The relationship between recovery and fatigue and its impact on performance has attracted the interest of sport science for many years. An adequate balance between stress (training and competition load, other life demands) and recovery is essential for athletes to achieve continuous high-level performance. Research has focused on the examination of physiological and psychological recovery strategies to compensate external and internal training and competition loads. A systematic monitoring of recovery and the subsequent implementation of recovery routines aims at maximizing performance and preventing negative developments such as underrecovery, nonfunctional overreaching, the overtraining syndrome, injuries, or illnesses. Due to the inter- and intraindividual variability of responses to training, competition, and recovery strategies, a diverse set of expertise is required to address the multifaceted phenomena of recovery, performance, and their interactions to transfer knowledge from sport science to sport practice. For this purpose, a symposium on Recovery and Performance was organized at the Technical University Munich Science and Study Center Raitenhaslach (Germany) in September 2016. Various international experts from many disciplines and research areas gathered to discuss and share their knowledge of recovery for performance enhancement in a variety of settings. The results of this meeting are outlined in this consensus statement that provides central definitions, theoretical frameworks, and practical implications as a synopsis of the current knowledge of recovery and performance. While our understanding of the complex relationship between recovery and performance has significantly increased through research, some important issues for future investigations are also elaborated.

313 citations


Journal ArticleDOI
TL;DR: A panel of synthetic antimicrobial and antibiofilm peptides (SAAPs) with enhanced antimicrobial activities compared to the parent peptide, human antimicrobial peptide LL-37, was developed; the data demonstrate that SAAP-148 is a promising drug candidate in the battle against antibiotic-resistant bacteria that pose a great threat to human health.
Abstract: Development of novel antimicrobial agents is a top priority in the fight against multidrug-resistant (MDR) and persistent bacteria. We developed a panel of synthetic antimicrobial and antibiofilm peptides (SAAPs) with enhanced antimicrobial activities compared to the parent peptide, human antimicrobial peptide LL-37. Our lead peptide SAAP-148 was more efficient in killing bacteria under physiological conditions in vitro than many known preclinical- and clinical-phase antimicrobial peptides. SAAP-148 killed MDR pathogens without inducing resistance, prevented biofilm formation, and eliminated established biofilms and persister cells. A single 4-hour treatment with hypromellose ointment containing SAAP-148 completely eradicated acute and established, biofilm-associated infections with methicillin-resistant Staphylococcus aureus and MDR Acinetobacter baumannii from wounded ex vivo human skin and murine skin in vivo. Together, these data demonstrate that SAAP-148 is a promising drug candidate in the battle against antibiotic-resistant bacteria that pose a great threat to human health.

Journal ArticleDOI
TL;DR: A typology of four reasons for using storylines to represent uncertainty in physical aspects of climate change is introduced, including improving risk awareness by framing risk in an event-oriented rather than a probabilistic manner, which corresponds more directly to how people perceive and respond to risk.
Abstract: As climate change research becomes increasingly applied, the need for actionable information is growing rapidly. A key aspect of this requirement is the representation of uncertainties. The conventional approach to representing uncertainty in physical aspects of climate change is probabilistic, based on ensembles of climate model simulations. In the face of deep uncertainties, the known limitations of this approach are becoming increasingly apparent. An alternative is thus emerging which may be called a ‘storyline’ approach. We define a storyline as a physically self-consistent unfolding of past events, or of plausible future events or pathways. No a priori probability of the storyline is assessed; emphasis is placed instead on understanding the driving factors involved, and the plausibility of those factors. We introduce a typology of four reasons for using storylines to represent uncertainty in physical aspects of climate change: (i) improving risk awareness by framing risk in an event-oriented rather than a probabilistic manner, which corresponds more directly to how people perceive and respond to risk; (ii) strengthening decision-making by allowing one to work backward from a particular vulnerability or decision point, combining climate change information with other relevant factors to address compound risk and develop appropriate stress tests; (iii) providing a physical basis for partitioning uncertainty, thereby allowing the use of more credible regional models in a conditioned manner and (iv) exploring the boundaries of plausibility, thereby guarding against false precision and surprise. Storylines also offer a powerful way of linking physical with human aspects of climate change.

Journal ArticleDOI
Abstract: Division of Angiology, Heart and Vessel Department, Lausanne University Hospital, Ch du Mont-Paisible 18, 1011 Lausanne, Switzerland; Department of Cardiology, Dupuytren University Hospital, and, Inserm 1098, Tropical Neuroepidemiology, School of Medicine, 2 avenue martin Luther-King 87042 Limoges cedex, France; Department of Clinical and Experimental Medicine, University of Insubria, Via Ravasi 2, 21100 Varese, Italy; Internal and Cardiovascular Medicine Stroke Unit, University of Perugia, S. Andrea delle Fratte, 06156 Perugia, Italy; Department of Vascular Medicine, Klinikum Darmstadt GmbH, Grafenstraße 9, 64283 Darmstadt, Germany; Center for Thrombosis and Hemostasis, University Medical Center Mainz, Langenbeckstr. 1, 55131 Mainz, Germany; Department of Vascular Medicine, Academic Medical Center, Meibergdreef 9, 1105 AZ, Amsterdam, The Netherlands; Cardiology and Vascular Medicine, Toulon Hospital Centre, 54 Rue Henri Sainte-Claire Deville, 83100 Toulon, France; Assistance PubliqueHôpitaux de Paris, Saint-Louis Hospital, Internal Medicine and Vascular Disease Unit and Groupe Francophone on Thrombosis and Cancer, Paris 7 Diderot University, Sorbonne Paris Cité, 1, Avenue Claude Vellefaux, 75010 Paris, France; Department of Cardiology, Democritus University of Thrace, Greece; Cardiovascular Diseases, University of Bologna, Via Albertoni 15, 40138 Bologna, Italy; Department of Cardiovascular Sciences, Vascular Medicine Unit, University of Padua, Via Nicol o Giustiniani, 2, 35121 Padua, Italy; Division of Angiology and Hemostasis, Department of Medical Specialties, Geneva University Hospital, Rue Gabrielle Perret-Gentil 4, 1205 Geneva, Switzerland; Department of Pulmonary Circulation and Thromboembolic Diseases, Medical Center for Postgraduate Education, ul Plocka 26, 01-138, Warszawa, Otwock, Poland; Department of Cardiology, Athens Medical School, Profiti elia 24, 14575 Athens, Greece; and Division of Angiology, Medical University Graz, Graz, Austria

Journal ArticleDOI
Symen Ligthart1, Ahmad Vaez2, Urmo Võsa3, Maria G. Stathopoulou4, +283 more · Institutions (97)
TL;DR: In this article, the authors performed two genome-wide association studies (GWASs), on HapMap and 1000 Genomes imputed data, of circulating amounts of CRP by using data from 88 studies comprising 204,402 European individuals.
Abstract: C-reactive protein (CRP) is a sensitive biomarker of chronic low-grade inflammation and is associated with multiple complex diseases. The genetic determinants of chronic inflammation remain largely unknown, and the causal role of CRP in several clinical outcomes is debated. We performed two genome-wide association studies (GWASs), on HapMap and 1000 Genomes imputed data, of circulating amounts of CRP by using data from 88 studies comprising 204,402 European individuals. Additionally, we performed in silico functional analyses and Mendelian randomization analyses with several clinical outcomes. The GWAS meta-analyses of CRP revealed 58 distinct genetic loci (p < 5 × 10−8). After adjustment for body mass index in the regression analysis, the associations at all except three loci remained. The lead variants at the distinct loci explained up to 7.0% of the variance in circulating amounts of CRP. We identified 66 gene sets that were organized in two substantially correlated clusters, one mainly composed of immune pathways and the other characterized by metabolic pathways in the liver. Mendelian randomization analyses revealed a causal protective effect of CRP on schizophrenia and a risk-increasing effect on bipolar disorder. Our findings provide further insights into the biology of inflammation and could lead to interventions for treating inflammation and its clinical consequences.

Journal ArticleDOI
TL;DR: Systematic vitamin D food fortification is an effective approach to improve vitamin D status in the general population and this has already been introduced by countries such as the US, Canada, India, and Finland.
Abstract: Vitamin D deficiency can lead to musculoskeletal diseases such as rickets and osteomalacia, but vitamin D supplementation may also prevent extraskeletal diseases such as respiratory tract infections, asthma exacerbations, pregnancy complications and premature deaths. Vitamin D has a unique metabolism as it is mainly obtained through synthesis in the skin under the influence of sunlight (i.e., ultraviolet-B radiation) whereas intake by nutrition traditionally plays a relatively minor role. Dietary guidelines for vitamin D are based on a consensus that serum 25-hydroxyvitamin D (25[OH]D) concentrations are used to assess vitamin D status, with the recommended target concentrations ranging from ≥25 to ≥50 nmol/L (≥10-≥20 ng/mL), corresponding to a daily vitamin D intake of 10 to 20 μg (400-800 international units). Most populations fail to meet these recommended dietary vitamin D requirements. In Europe, 25(OH)D concentrations <30 nmol/L (12 ng/mL) and <50 nmol/L (20 ng/mL) are present in 13.0 and 40.4% of the general population, respectively. This substantial gap between officially recommended dietary reference intakes for vitamin D and the high prevalence of vitamin D deficiency in the general population requires action from health authorities. Promotion of a healthier lifestyle with more outdoor activities and optimal nutrition are definitely warranted but will not erase vitamin D deficiency and must, in the case of sunlight exposure, be well balanced with regard to potential adverse effects such as skin cancer. Intake of vitamin D supplements is limited by relatively poor adherence (in particular in individuals with low-socioeconomic status) and potential for overdosing. Systematic vitamin D food fortification is, however, an effective approach to improve vitamin D status in the general population, and this has already been introduced by countries such as the US, Canada, India, and Finland. 
Recent advances in our knowledge on the safety of vitamin D treatment, the dose-response relationship of vitamin D intake and 25(OH)D levels, as well as data on the effectiveness of vitamin D fortification in countries such as Finland provide a solid basis to introduce and modify vitamin D food fortification in order to improve public health with this likewise cost-effective approach.
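The unit equivalences running through the abstract (nmol/L vs. ng/mL for serum 25(OH)D, micrograms vs. international units for intake) can be made explicit. The rounded factor of 2.5 follows the abstract's own correspondences (10 ng/mL = 25 nmol/L); function names are illustrative:

```python
def ng_per_ml_to_nmol_per_l(ng_ml):
    """Convert serum 25(OH)D from ng/mL to nmol/L using the rounded
    factor 2.5 implied by the abstract (10 ng/mL = 25 nmol/L)."""
    return ng_ml * 2.5

def micrograms_to_iu(ug):
    """Convert a daily vitamin D dose from micrograms to international
    units (1 microgram = 40 IU, so 10-20 micrograms = 400-800 IU)."""
    return ug * 40
```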

Journal ArticleDOI
TL;DR: This work extends the existing literature by performing various tests of the efficiency of several cryptocurrencies and additionally links efficiency to measures of liquidity, finding that cryptocurrencies become less predictable, and hence more efficient, as liquidity increases.
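One crude predictability check of the kind used in efficiency studies is the lag-1 autocorrelation of daily returns; values near zero are consistent with weak-form efficiency. This is an illustrative stand-in, not the paper's actual battery of efficiency tests:

```python
import numpy as np

def lag1_autocorrelation(returns):
    """Lag-1 autocorrelation of a return series; magnitudes near zero
    suggest returns are not linearly predictable from the prior day."""
    r = np.asarray(returns, float)
    r = r - r.mean()                       # demean the series
    return float(np.sum(r[:-1] * r[1:]) / np.sum(r * r))
```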


Journal ArticleDOI
TL;DR: The findings indicate that e-cigarettes are a potential source of exposure to toxic metals (Cr, Ni, and Pb), and to metals that are toxic when inhaled (Mn and Zn).
Abstract: Background: Electronic cigarettes (e-cigarettes) generate an aerosol by heating a solution (e-liquid) with a metallic coil. Whether metals are transferred from the coil to the aerosol is unknown. O...

Journal ArticleDOI
TL;DR: Long-term statin treatment is remarkably safe with a low risk of clinically relevant adverse effects as defined above, and the established cardiovascular benefits of statin therapy far outweigh the risk of adverse effects.
Abstract: Aims To objectively appraise evidence for possible adverse effects of long-term statin therapy on glucose homeostasis, cognitive, renal and hepatic function, and risk for haemorrhagic stroke or cataract.

Journal ArticleDOI
31 Jan 2018
TL;DR: The authors assess the skill of current-generation global atmospheric chemistry models in simulating the observed present-day tropospheric ozone distribution, variability, and trends using a range of model evaluation techniques, and demonstrate that global chemistry models are broadly skillful in capturing the spatio-temporal variations of tropospheric ozone over the seasonal cycle, for extreme pollution episodes, and for changes over interannual to decadal periods.
Abstract: The goal of the Tropospheric Ozone Assessment Report (TOAR) is to provide the research community with an up-to-date scientific assessment of tropospheric ozone, from the surface to the tropopause. While a suite of observations provides significant information on the spatial and temporal distribution of tropospheric ozone, observational gaps make it necessary to use global atmospheric chemistry models to synthesize our understanding of the processes and variables that control tropospheric ozone abundance and its variability. Models facilitate the interpretation of the observations and allow us to make projections of future tropospheric ozone and trace gas distributions for different anthropogenic or natural perturbations. This paper assesses the skill of current-generation global atmospheric chemistry models in simulating the observed present-day tropospheric ozone distribution, variability, and trends. Drawing upon the results of recent international multi-model intercomparisons and using a range of model evaluation techniques, we demonstrate that global chemistry models are broadly skillful in capturing the spatio-temporal variations of tropospheric ozone over the seasonal cycle, for extreme pollution episodes, and changes over interannual to decadal periods. However, models are consistently biased high in the northern hemisphere and biased low in the southern hemisphere, throughout the depth of the troposphere, and are unable to replicate particular metrics that define the longer term trends in tropospheric ozone as derived from some background sites. When the models compare unfavorably against observations, we discuss the potential causes of model biases and propose directions for future developments, including improved evaluations that may be able to better diagnose the root cause of the model-observation disparity. 
Overall, model results should be approached critically, including determining whether the model performance is acceptable for the problem being addressed, whether biases can be tolerated or corrected, whether the model is appropriately constituted, and whether there is a way to satisfactorily quantify the uncertainty.
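Model-observation comparisons of the kind described in this assessment typically rest on simple aggregate skill metrics. As a minimal illustrative sketch (the function names and all numbers below are made up for illustration and do not come from the assessment), mean bias and normalized mean bias of simulated versus observed ozone can be computed as:

```python
def mean_bias(model, obs):
    """Mean bias (MB): average model-minus-observation difference,
    in the units of the data (e.g. ppb for surface ozone)."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def normalized_mean_bias(model, obs):
    """Normalized mean bias (NMB): total bias relative to the total
    observed amount (dimensionless; multiply by 100 for percent)."""
    return sum(m - o for m, o in zip(model, obs)) / sum(obs)

# Toy example: a model biased high against three ozone observations
model = [52.0, 48.0, 60.0]   # simulated ozone, ppb (made-up values)
obs = [50.0, 50.0, 50.0]     # observed ozone, ppb (made-up values)
```

For these toy values the mean bias is 10/3 ≈ 3.3 ppb and the normalized mean bias is 10/150 ≈ 6.7%; a systematic hemispheric bias of the kind discussed above would show up as a consistently positive (or negative) sign of these metrics across stations.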

Book
01 Jan 2018
TL;DR: This book presents the main approaches, including statistical downscaling, bias correction, and weather generators, along with their underlying assumptions, skill, and limitations, together with the relevant user context and technical background.
Abstract: Statistical downscaling and bias correction are becoming standard tools in climate impact studies. This book provides a comprehensive reference to widely-used approaches, and additionally covers the relevant user context and technical background, as well as a synthesis and guidelines for practitioners. It presents the main approaches including statistical downscaling, bias correction and weather generators, along with their underlying assumptions, skill and limitations. Relevant background information on user needs and observational and climate model uncertainties is complemented by concise introductions to the most important concepts in statistical and dynamical modelling. A substantial part is dedicated to the evaluation of regional climate projections and their value in different user contexts. Detailed guidelines for the application of downscaling and the use of downscaled information in practice complete the volume. Its modular approach makes the book accessible for developers and practitioners, graduate students and experienced researchers, as well as impact modellers and decision makers.
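One widely used bias-correction technique covered by such texts is empirical quantile mapping: each model value is assigned its quantile within the model's historical distribution, then replaced by the value at the same quantile of the observed distribution. The sketch below is a minimal illustration under that assumption; the function name and synthetic data are invented here, not taken from the book:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_new):
    """Empirical quantile mapping: look up each new model value's
    quantile in the historical model distribution, then return the
    value at that quantile of the observed distribution."""
    model_sorted = np.sort(np.asarray(model_hist))
    q = np.searchsorted(model_sorted, model_new) / len(model_sorted)
    return np.quantile(np.asarray(obs_hist), np.clip(q, 0.0, 1.0))

# Toy example: a model that is uniformly 2 degrees too warm
obs_hist = np.linspace(10.0, 20.0, 101)   # synthetic observations
model_hist = obs_hist + 2.0               # synthetic biased model output
corrected = quantile_map(model_hist, obs_hist, model_hist)
```

After correction, the mean of the corrected series is pulled back toward the observed mean; in practice such corrections are applied per season or per month, and their key assumption, that the model bias is stationary under climate change, is one of the limitations such books discuss.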

Book ChapterDOI
27 Aug 2018
TL;DR: Explainable AI is not a new field; the evolution of formal reasoning architectures to incorporate principled probabilistic reasoning helped address the capture and use of uncertain knowledge.
Abstract: Explainable AI is not a new field. Since at least the early exploitation of C.S. Peirce’s abductive reasoning in expert systems of the 1980s, there were reasoning architectures to support an explanation function for complex AI systems, including applications in medical diagnosis, complex multi-component design, and reasoning about the real world. So explainability is at least as old as early AI, and a natural consequence of the design of AI systems. While early expert systems consisted of handcrafted knowledge bases that enabled reasoning over narrowly well-defined domains (e.g., INTERNIST, MYCIN), such systems had no learning capabilities and had only primitive uncertainty handling. But the evolution of formal reasoning architectures to incorporate principled probabilistic reasoning helped address the capture and use of uncertain knowledge.
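The abductive reasoning style mentioned above can be sketched in a few lines: given causal rules from hypotheses to findings, abduction selects the hypotheses that would explain an observed finding. The toy knowledge base below is purely illustrative and is not taken from INTERNIST, MYCIN, or any real system:

```python
# Hypothetical toy knowledge base: hypothesis -> findings it would cause
RULES = {
    "flu": {"fever", "cough"},
    "allergy": {"sneezing", "cough"},
}

def abduce(finding):
    """Return, in sorted order, every hypothesis whose rule set
    covers the observed finding (i.e., would explain it)."""
    return sorted(h for h, findings in RULES.items() if finding in findings)
```

Here `abduce("cough")` yields both candidate explanations while `abduce("fever")` yields only one; the explanation function of an early expert system was essentially the trace of which rules licensed such a conclusion.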

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the observations and physical mechanisms behind this eruptive activity, with a view to making an assessment of the current capability of forecasting these events for space weather risk and impact mitigation.
Abstract: Coronal mass ejections (CMEs) were discovered in the early 1970s when space-borne coronagraphs revealed that eruptions of plasma are ejected from the Sun. Today, it is known that the Sun produces eruptive flares, filament eruptions, coronal mass ejections, and failed eruptions, all thought to be due to a release of energy stored in the coronal magnetic field during its drastic reconfiguration. This review discusses the observations and physical mechanisms behind this eruptive activity, with a view to making an assessment of the current capability of forecasting these events for space weather risk and impact mitigation. Whilst a wealth of observations exists, and detailed models have been developed, there still exists a need to draw these approaches together. In particular, more realistic models are encouraged in order to assess the full range of complexity of the solar atmosphere and the criteria under which an eruption forms. From the observational side, a more detailed understanding of the role of photospheric flows and reconnection is needed in order to identify the evolutionary path that ultimately means a magnetic structure will erupt.

Journal ArticleDOI
Martin Koller1
TL;DR: Traditional processing techniques for production of PHA-based medical devices, such as melt-spinning, melt extrusion, or solvent evaporation, and emerging processing techniques like 3D-printing, computer-aided wet-spinning, laser perforation, and electrospinning are described.
Abstract: Polyhydroxyalkanoates (PHA) are bio-based microbial biopolyesters; their stiffness, elasticity, crystallinity, and degradability are tunable by the monomeric composition, selection of microbial production strain, substrates, process parameters during production, and post-synthetic processing; they thus represent biological alternatives to diverse technomers of petrochemical origin. This, together with the fact that their monomeric and oligomeric in vivo degradation products do not exert any toxic or otherwise negative effect on living cells or tissue of humans or animals, makes them highly attractive for various applications in the medical field. This article provides an overview of PHA application in the therapeutic, surgical and tissue engineering area, and reviews strategies to produce PHA at purity levels high enough to be used in vivo. Tested applications of differently composed PHA and advanced follow-up products as carrier materials for controlled in vivo release of anti-cancer drugs or antibiotics, as scaffolds for tissue engineering, as guidance conduits for nerve repair or as enhanced sutures, implants or meshes are discussed from both a biotechnological and a material-scientific perspective. The article also describes the use of traditional processing techniques for production of PHA-based medical devices, such as melt-spinning, melt extrusion, or solvent evaporation, and emerging processing techniques like 3D-printing, computer-aided wet-spinning, laser perforation, and electrospinning.

Journal ArticleDOI
TL;DR: The results suggest that the ptau burden in the isocortex is comparable across all analyzed ptau sites when using a quantitative approach, while levels of ptau at Tyr18 or Thr231 in the transentorhinal region differ between Braak stages.
Abstract: Alzheimer’s disease is characterized by accumulation of amyloid plaques and tau aggregates in several cortical brain regions. Tau phosphorylation causes formation of neurofibrillary tangles and neuropil threads. Phosphorylation at tau Ser202/Thr205 is well characterized since labeling of this site is used to assign Braak stage based on occurrence of neurofibrillary tangles. Only little is known about the spatial and temporal phosphorylation profile of other phosphorylated tau (ptau) sites. Here, we investigate total tau and ptau at residues Tyr18, Ser199, Ser202/Thr205, Thr231, Ser262, Ser396, Ser422 as well as amyloid-β plaques in human brain tissue of AD patients and controls. Allo- and isocortical brain regions were evaluated applying rater-independent automated quantification based on digital image analysis. We found that the level of ptau at several residues, like Ser199, Ser202/Thr205, and Ser422 was similar in healthy controls and Braak stages I to IV but was increased in Braak stage V/VI throughout the entire isocortex and transentorhinal cortex. Quantification of ThioS-stained plaques showed a similar pattern. Only tau phosphorylation at Tyr18 and Thr231 was already significantly increased in the transentorhinal region at Braak stage III/IV and hence showed a progressive increase with increasing Braak stages. Additionally, the increase in phosphorylation relative to controls was highest at Tyr18, Thr231 and Ser199. By contrast, Ser396 tau and Ser262 tau showed only a weak phosphorylation in all analyzed brain regions and only minor progression. Our results suggest that the ptau burden in the isocortex is comparable between all analyzed ptau sites when using a quantitative approach while levels of ptau at Tyr18 or Thr231 in the transentorhinal region are different between all Braak stages. Hence these sites could be crucial in the pathogenesis of AD already at early stages and therefore represent putative novel therapeutic targets.

Journal ArticleDOI
TL;DR: In this paper, first-principle results for quark, gluon, and meson 1PI correlation functions were obtained by solving their functional renormalization group equations in a systematic vertex expansion, aiming at apparent convergence.
Abstract: We present nonperturbative first-principle results for quark, gluon, and meson 1PI correlation functions of two-flavor Landau-gauge QCD in the vacuum. These correlation functions carry the full information about the theory. They are obtained by solving their functional renormalization group equations in a systematic vertex expansion, aiming at apparent convergence. This work represents a crucial prerequisite for quantitative first-principle studies of the QCD phase diagram and the hadron spectrum within this framework. In particular, we have computed the gluon, ghost, quark, and scalar-pseudoscalar meson propagators, as well as gluon, ghost-gluon, quark-gluon, quark, quark-meson, and meson interactions. Our results stress the crucial importance of the quantitatively correct running of different vertices in the semiperturbative regime for describing the phenomena and scales of confinement and spontaneous chiral symmetry breaking without phenomenological input.
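The functional renormalization group equations referred to in this abstract are, in standard notation, based on the Wetterich flow equation for the scale-dependent effective action; the form below is the generic textbook equation, not a result specific to this paper:

```latex
% Wetterich equation: flow of the effective average action \Gamma_k,
% with RG "time" t = \ln(k/\Lambda) and infrared regulator R_k
\partial_t \Gamma_k
  \;=\; \frac{1}{2}\,
  \mathrm{Tr}\!\left[\left(\Gamma_k^{(2)} + R_k\right)^{-1}
  \partial_t R_k\right]
```

Here $\Gamma_k^{(2)}$ is the second functional derivative of the effective action and the trace runs over momenta as well as field and internal indices; the systematic vertex expansion mentioned in the abstract truncates the infinite tower of coupled equations obtained by functionally differentiating this flow with respect to the fields.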

Journal ArticleDOI
TL;DR: These results substantiate the clinical significance of considering diffuse midline glioma, H3 K27M-mutant, as a distinct entity corresponding to WHO grade IV, carrying a universally fatal prognosis.
Abstract: Background The novel entity of "diffuse midline glioma, H3 K27M-mutant" has been defined in the 2016 revision of the World Health Organization (WHO) classification of tumors of the central nervous system (CNS). Tumors of this entity arise in CNS midline structures of predominantly pediatric patients and are associated with an overall dismal prognosis. They are defined by K27M mutations in H3F3A or HIST1H3B/C, encoding for histone 3 variants H3.3 and H3.1, respectively, which are considered hallmark events driving gliomagenesis. Methods Here, we characterized 85 centrally reviewed diffuse gliomas on midline locations enrolled in the nationwide pediatric German HIT-HGG registry regarding tumor site, histone 3 mutational status, WHO grade, age, sex, and extent of tumor resection. Results We found 56 H3.3 K27M-mutant tumors (66%), 6 H3.1 K27M-mutant tumors (7%), and 23 H3-wildtype tumors (27%). H3 K27M-mutant gliomas shared an aggressive clinical course independent of their anatomic location. Multivariate regression analysis confirmed the significant impact of the H3 K27M mutation as the only independent parameter predictive of overall survival (P = 0.009). In H3 K27M-mutant tumors, neither anatomic midline location nor histopathological grading nor extent of tumor resection had an influence on survival. Conclusion These results substantiate the clinical significance of considering diffuse midline glioma, H3 K27M-mutant, as a distinct entity corresponding to WHO grade IV, carrying a universally fatal prognosis.

Journal ArticleDOI
TL;DR: The archaeome remains mysterious, and many questions about its potential pathogenicity, function, and structural interactions with the host and other microorganisms remain open.

Journal ArticleDOI
TL;DR: Continuous apixaban is safe and effective with respect to bleeding, stroke, and cognitive function in patients at risk of stroke undergoing atrial fibrillation ablation.
Abstract: Aims It is recommended to perform atrial fibrillation ablation with continuous anticoagulation. Continuous apixaban has not been tested. Methods and results We compared continuous apixaban (5 mg b.i.d.) to vitamin K antagonists (VKA, international normalized ratio 2-3) in atrial fibrillation patients at risk of stroke in a prospective, open, multi-centre study with blinded outcome assessment. Primary outcome was a composite of death, stroke, or bleeding (Bleeding Academic Research Consortium 2-5). A high-resolution brain magnetic resonance imaging (MRI) sub-study quantified acute brain lesions. Cognitive function was assessed by Montreal Cognitive Assessment (MoCA) at baseline and at end of follow-up. Overall, 674 patients (median age 64 years, 33% female, 42% non-paroxysmal atrial fibrillation, 49 sites) were randomized; 633 received study drug and underwent ablation; 335 underwent MRI (25 sites, 323 analysable scans). The primary outcome was observed in 22/318 patients randomized to apixaban, and in 23/315 randomized to VKA {difference -0.38% [90% confidence interval (CI) -4.0%, 3.3%], non-inferiority P = 0.0002 at the pre-specified absolute margin of 0.075}, including 2 (0.3%) deaths, 2 (0.3%) strokes, and 24 (3.8%) ISTH major bleeds. Acute small brain lesions were found in a similar number of patients in each arm [apixaban 44/162 (27.2%); VKA 40/161 (24.8%); P = 0.64]. Cognitive function increased at the end of follow-up (median 1 MoCA unit; P = 0.005) without differences between study groups. Conclusions Continuous apixaban is safe and effective in patients undergoing atrial fibrillation ablation at risk of stroke with respect to bleeding, stroke, and cognitive function. Further research is needed to reduce ablation-related acute brain lesions.
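The non-inferiority claim can be illustrated by recomputing the risk difference from the event counts reported above. The simple Wald interval below is only a sketch: the trial's own analysis may have used a different interval method, so the bounds will not exactly reproduce the CI quoted in the abstract, but the point estimate and the margin check are the same idea:

```python
import math

def risk_difference_ci(events1, n1, events2, n2, z=1.645):
    """Risk difference (group 1 minus group 2) with a two-sided 90%
    Wald confidence interval (z = 1.645 for 90% coverage)."""
    p1, p2 = events1 / n1, events2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Primary-outcome counts from the trial: 22/318 (apixaban) vs 23/315 (VKA)
diff, lo, hi = risk_difference_ci(22, 318, 23, 315)

# Non-inferiority at the pre-specified absolute margin of 0.075 requires
# the upper confidence bound to stay below the margin
noninferior = hi < 0.075
```

The computed point estimate is about -0.38%, matching the difference quoted in the abstract, and the upper Wald bound lies well below the 7.5% margin, consistent with the reported non-inferiority conclusion.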

Journal ArticleDOI
TL;DR: An interdisciplinary approach is presented by combining results from materials science, microbiology, mineralogy, and hydrochemistry to stimulate the development of novel and sustainable materials and mitigation strategies for MICC.