
Showing papers by "University of Luxembourg" published in 2014


Journal ArticleDOI
TL;DR: LCZ696 was superior to enalapril in reducing the risks of death and of hospitalization for heart failure and decreased the symptoms and physical limitations of heart failure.
Abstract: Background We compared the angiotensin receptor–neprilysin inhibitor LCZ696 with enalapril in patients who had heart failure with a reduced ejection fraction. In previous studies, enalapril improved survival in such patients. Methods In this double-blind trial, we randomly assigned 8442 patients with class II, III, or IV heart failure and an ejection fraction of 40% or less to receive either LCZ696 (at a dose of 200 mg twice daily) or enalapril (at a dose of 10 mg twice daily), in addition to recommended therapy. The primary outcome was a composite of death from cardiovascular causes or hospitalization for heart failure, but the trial was designed to detect a difference in the rates of death from cardiovascular causes. Results The trial was stopped early, according to prespecified rules, after a median follow-up of 27 months, because the boundary for an overwhelming benefit with LCZ696 had been crossed. At the time of study closure, the primary outcome had occurred in 914 patients (21.8%) in the LCZ696 group and 1117 patients (26.5%) in the enalapril group (hazard ratio in the LCZ696 group, 0.80; 95% confidence interval [CI], 0.73 to 0.87; P<0.001). A total of 711 patients (17.0%) receiving LCZ696 and 835 patients (19.8%) receiving enalapril died (hazard ratio for death from any cause, 0.84; 95% CI, 0.76 to 0.93; P<0.001); of these patients, 558 (13.3%) and 693 (16.5%), respectively, died from cardiovascular causes (hazard ratio, 0.80; 95% CI, 0.71 to 0.89; P<0.001). As compared with enalapril, LCZ696 also reduced the risk of hospitalization for heart failure by 21% (P<0.001) and decreased the symptoms and physical limitations of heart failure (P = 0.001). The LCZ696 group had higher proportions of patients with hypotension and nonserious angioedema but lower proportions with renal impairment, hyperkalemia, and cough than the enalapril group.
Conclusions LCZ696 was superior to enalapril in reducing the risks of death and of hospitalization for heart failure. (Funded by Novartis; PARADIGM-HF ClinicalTrials.gov number, NCT01035255.)
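The reported event rates permit a quick back-of-the-envelope estimate of the absolute risk reduction and number needed to treat; this small illustration uses only the rates quoted above and is not part of the trial report:

```python
# Absolute risk reduction (ARR) and number needed to treat (NNT),
# computed from the primary-outcome rates reported above.
enalapril_rate = 0.265   # 26.5% primary-outcome rate, enalapril group
lcz696_rate = 0.218      # 21.8% primary-outcome rate, LCZ696 group

arr = enalapril_rate - lcz696_rate   # absolute risk reduction
nnt = 1 / arr                        # patients treated per event avoided

print(f"ARR = {arr:.3f}, NNT ≈ {nnt:.0f}")  # ARR = 0.047, NNT ≈ 21
```

So roughly one primary-outcome event was avoided for every 21 patients treated with LCZ696 instead of enalapril over the trial's median follow-up.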

4,727 citations


Posted Content
TL;DR: Software-Defined Networking (SDN) as discussed by the authors is an emerging paradigm that promises to change this state of affairs, by breaking vertical integration, separating the network's control logic from the underlying routers and switches, promoting (logical) centralization of network control, and introducing the ability to program the network.
Abstract: Software-Defined Networking (SDN) is an emerging paradigm that promises to change this state of affairs, by breaking vertical integration, separating the network's control logic from the underlying routers and switches, promoting (logical) centralization of network control, and introducing the ability to program the network. The separation of concerns introduced between the definition of network policies, their implementation in switching hardware, and the forwarding of traffic, is key to the desired flexibility: by breaking the network control problem into tractable pieces, SDN makes it easier to create and introduce new abstractions in networking, simplifying network management and facilitating network evolution. In this paper we present a comprehensive survey on SDN. We start by introducing the motivation for SDN, explain its main concepts and how it differs from traditional networking, its roots, and the standardization activities regarding this novel paradigm. Next, we present the key building blocks of an SDN infrastructure using a bottom-up, layered approach. We provide an in-depth analysis of the hardware infrastructure, southbound and northbound APIs, network virtualization layers, network operating systems (SDN controllers), network programming languages, and network applications. We also look at cross-layer problems such as debugging and troubleshooting. In an effort to anticipate the future evolution of this new paradigm, we discuss the main ongoing research efforts and challenges of SDN. In particular, we address the design of switches and control platforms -- with a focus on aspects such as resiliency, scalability, performance, security and dependability -- as well as new opportunities for carrier transport networks and cloud providers. Last but not least, we analyze the position of SDN as a key enabler of a software-defined environment.

1,968 citations


Proceedings ArticleDOI
09 Jun 2014
TL;DR: FlowDroid is presented, a novel and highly precise static taint analysis for Android applications that successfully finds leaks in a subset of 500 apps from Google Play and about 1,000 malware apps from the VirusShare project.
Abstract: Today's smartphones are a ubiquitous source of private and confidential data. At the same time, smartphone users are plagued by carelessly programmed apps that leak important data by accident, and by malicious apps that exploit their given privileges to copy such data intentionally. While existing static taint-analysis approaches have the potential of detecting such data leaks ahead of time, all approaches for Android use a number of coarse-grain approximations that can yield high numbers of missed leaks and false alarms. In this work we thus present FlowDroid, a novel and highly precise static taint analysis for Android applications. A precise model of Android's lifecycle allows the analysis to properly handle callbacks invoked by the Android framework, while context, flow, field and object-sensitivity allows the analysis to reduce the number of false alarms. Novel on-demand algorithms help FlowDroid maintain high efficiency and precision at the same time. We also propose DroidBench, an open test suite for evaluating the effectiveness and accuracy of taint-analysis tools specifically for Android apps. As we show through a set of experiments using SecuriBench Micro, DroidBench, and a set of well-known Android test applications, FlowDroid finds a very high fraction of data leaks while keeping the rate of false positives low. On DroidBench, FlowDroid achieves 93% recall and 86% precision, greatly outperforming the commercial tools IBM AppScan Source and Fortify SCA. FlowDroid successfully finds leaks in a subset of 500 apps from Google Play and about 1,000 malware apps from the VirusShare project.
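The precision and recall figures reported for DroidBench are simple functions of true/false positive and negative counts; the counts below are hypothetical, chosen only to illustrate the formulas (the actual DroidBench counts are not given here):

```python
# Precision/recall arithmetic as used in taint-analysis benchmarks.
# tp/fp/fn values are made up for illustration.
tp, fp, fn = 26, 4, 2   # true positives, false positives, false negatives

precision = tp / (tp + fp)   # fraction of reported leaks that are real
recall = tp / (tp + fn)      # fraction of real leaks that are reported

print(f"precision = {precision:.2f}, recall = {recall:.2f}")
```

With these counts, precision is about 0.87 and recall about 0.93, in the same range as the 86%/93% reported for FlowDroid on DroidBench.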

1,730 citations


Journal ArticleDOI
TL;DR: The results indicate that a number of core symptoms of Internet addiction appear relevant for diagnosis, which both assimilates Internet addiction to other addictive disorders and differentiates them, implying a conceptualisation as a syndrome with similar etiology and components, but different expressions of addiction.
Abstract: In the last decade, Internet usage has grown tremendously on a global scale. The increasing popularity and frequency of Internet use has led to an increasing number of reports highlighting the potential negative consequences of overuse. Over the last decade, research into Internet addiction has proliferated. This paper reviews the existing 68 epidemiological studies of Internet addiction that (i) contain quantitative empirical data, (ii) have been published after 2000, (iii) include an analysis relating to Internet addiction, (iv) include a minimum of 1000 participants, and (v) provide a full-text article published in English using the database Web of Science. Assessment tools and conceptualisations, prevalence, and associated factors in adolescents and adults are scrutinised. The results reveal the following. First, no gold standard of Internet addiction classification exists, as 21 different assessment instruments have been identified. They adopt official criteria for substance use disorders or pathological gambling, no or few criteria relevant for an addiction diagnosis, time spent online, or resulting problems. Second, reported prevalence rates differ as a consequence of different assessment tools and cut-offs, ranging from 0.8% in Italy to 26.7% in Hong Kong. Third, Internet addiction is associated with a number of sociodemographic, Internet use, and psychosocial factors, as well as comorbid symptoms and disorders in adolescents and adults. The results indicate that a number of core symptoms (i.e., compulsive use, negative outcomes and salience) appear relevant for diagnosis, which both assimilates Internet addiction to other addictive disorders and differentiates them, implying a conceptualisation as a syndrome with similar etiology and components, but different expressions of addiction. Limitations include the exclusion of studies with smaller sample sizes and studies focusing on specific online behaviours.
Conclusively, there is a need for nosological precision so that ultimately those in need can be helped by translating the scientific evidence established in the context of Internet addiction into actual clinical practice.

974 citations


Journal ArticleDOI
TL;DR: An overview of SWIPT systems with a particular focus on the hardware realization of rectenna circuits and practical techniques that achieveSWIPT in the domains of time, power, antennas, and space is provided.
Abstract: Energy harvesting for wireless communication networks is a new paradigm that allows terminals to recharge their batteries from external energy sources in the surrounding environment. A promising energy harvesting technology is wireless power transfer where terminals harvest energy from electromagnetic radiation. Thereby, the energy may be harvested opportunistically from ambient electromagnetic sources or from sources that intentionally transmit electromagnetic energy for energy harvesting purposes. A particularly interesting and challenging scenario arises when sources perform simultaneous wireless information and power transfer (SWIPT), as strong signals not only increase power transfer but also interference. This article provides an overview of SWIPT systems with a particular focus on the hardware realization of rectenna circuits and practical techniques that achieve SWIPT in the domains of time, power, antennas, and space. The article also discusses the benefits of a potential integration of SWIPT technologies in modern communication networks in the context of resource allocation and cooperative cognitive radio networks.

870 citations


Book
01 Jan 2014
TL;DR: A book-length argument on the "weakness" of education, developed through chapters on creativity, communication, teaching, learning, emancipation, democracy, and virtuosity, and closing with an interview with Gert Biesta by Philip Winter.
Abstract: Prologue: On the Weakness of Education. Chapter One: Creativity. Chapter Two: Communication. Chapter Three: Teaching. Chapter Four: Learning. Chapter Five: Emancipation. Chapter Six: Democracy. Chapter Seven: Virtuosity. Epilogue: For a Pedagogy of the Event. Appendix: Coming into the World, Uniqueness, and the Beautiful Risk of Education: An Interview with Gert Biesta by Philip Winter.

515 citations


Journal ArticleDOI
TL;DR: This paper provides guidelines on how to carry out and properly analyse randomized algorithms applied to solve software engineering tasks, with a particular focus on software testing, which is by far the most frequent application area of randomized algorithms within software engineering.
Abstract: Randomized algorithms are widely used to address many types of software engineering problems, especially in the area of software verification and validation with a strong emphasis on test automation. However, randomized algorithms are affected by chance and so require the use of appropriate statistical tests to be properly analysed in a sound manner. This paper features a systematic review regarding recent publications in 2009 and 2010 showing that, overall, empirical analyses involving randomized algorithms in software engineering tend to not properly account for the random nature of these algorithms. Many of the novel techniques presented clearly appear promising, but the lack of soundness in their empirical evaluations casts unfortunate doubts on their actual usefulness. In software engineering, although there are guidelines on how to carry out empirical analyses involving human subjects, those guidelines are not directly and fully applicable to randomized algorithms. Furthermore, many of the textbooks on statistical analysis are written from the viewpoints of social and natural sciences, which present different challenges from randomized algorithms. To address the questionable overall quality of the empirical analyses reported in the systematic review, this paper provides guidelines on how to carry out and properly analyse randomized algorithms applied to solve software engineering tasks, with a particular focus on software testing, which is by far the most frequent application area of randomized algorithms within software engineering. Copyright © 2012 John Wiley & Sons, Ltd.
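In the spirit of the guidelines above (non-parametric tests and effect sizes for comparing randomized algorithms), here is a minimal sketch comparing two hypothetical randomized test-generation tools using the Vargha-Delaney Â12 effect size and a permutation test; the tool names and coverage numbers are made up:

```python
import random

def a12(x, y):
    """Vargha-Delaney A-hat-12: probability that a random draw from x
    beats a random draw from y (0.5 means no effect)."""
    wins = sum((xi > yi) + 0.5 * (xi == yi) for xi in x for yi in y)
    return wins / (len(x) * len(y))

def perm_test(x, y, rounds=2000, seed=0):
    """Two-sided permutation test on the difference in means."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(rounds):
        rng.shuffle(pooled)
        a, b = pooled[:len(x)], pooled[len(x):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / rounds

# Ten runs per tool (hypothetical branch-coverage percentages).
tool_a = [71, 74, 68, 77, 73, 75, 70, 72, 76, 74]
tool_b = [66, 69, 64, 70, 68, 67, 65, 71, 66, 68]

print(f"A12 = {a12(tool_a, tool_b):.2f}, p = {perm_test(tool_a, tool_b):.4f}")
```

Here Â12 = 0.94, i.e., a run of tool_a beats a run of tool_b about 94% of the time, and the permutation p-value is small, so the difference is unlikely to be due to the tools' randomness alone.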

510 citations


Journal ArticleDOI
01 Jan 2014-Database
TL;DR: The COMPARTMENTS resource is presented, which integrates all sources listed above as well as the results of automatic text mining, and all localization evidence is mapped onto common protein identifiers and Gene Ontology terms.
Abstract: Information on protein subcellular localization is important to understand the cellular functions of proteins. Currently, such information is manually curated from the literature, obtained from high-throughput microscopy-based screens, and predicted from primary sequence. To get a comprehensive view of the localization of a protein, it is thus necessary to consult multiple databases and prediction tools. To address this, we present the COMPARTMENTS resource, which integrates all sources listed above as well as the results of automatic text mining. The resource is automatically kept up to date with source databases, and all localization evidence is mapped onto common protein identifiers and Gene Ontology terms. To improve comparability, we further assign confidence scores to the localization evidence based on the type and source of that evidence. Finally, we visualize the unified localization evidence for a protein on a schematic cell to provide a simple overview. Database URL: http://compartments.jensenlab.org
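The merge step described above (evidence from several channels mapped to common identifiers, each with a confidence score) can be sketched as a simple per-(protein, term) max-merge; the records and scores below are invented purely for illustration and are not COMPARTMENTS data:

```python
# Hypothetical evidence records: (protein, GO term, channel, confidence 1-5).
evidence = [
    ("SNCA", "GO:0005737", "knowledge", 5),    # curated: cytoplasm
    ("SNCA", "GO:0005737", "text_mining", 3),  # weaker evidence, same term
    ("SNCA", "GO:0005634", "prediction", 2),   # predicted: nucleus
]

# Keep, per (protein, term), the highest-confidence piece of evidence.
unified = {}
for protein, term, channel, conf in evidence:
    key = (protein, term)
    if conf > unified.get(key, (None, 0))[1]:
        unified[key] = (channel, conf)

print(unified[("SNCA", "GO:0005737")])  # ('knowledge', 5)
```

A real integration would keep all channels side by side rather than discarding the weaker ones; the max-merge here only illustrates mapping heterogeneous evidence onto one identifier space.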

459 citations


Journal ArticleDOI
TL;DR: Exome-sequencing data is analyzed of 356 trios with the "classical" epileptic encephalopathies, infantile spasms and Lennox Gastaut syndrome, finding suggestive evidence for a role of three additional genes, and supporting a prominent role for de novo mutations in epilepsy.
Abstract: Emerging evidence indicates that epileptic encephalopathies are genetically highly heterogeneous, underscoring the need for large cohorts of well-characterized individuals to further define the genetic landscape. Through a collaboration between two consortia (EuroEPINOMICS and Epi4K/EPGP), we analyzed exome-sequencing data of 356 trios with the “classical” epileptic encephalopathies, infantile spasms and Lennox Gastaut syndrome, including 264 trios previously analyzed by the Epi4K/EPGP consortium. In this expanded cohort, we find 429 de novo mutations, including de novo mutations in DNM1 in five individuals and de novo mutations in GABBR2, FASN, and RYR3 in two individuals each. Unlike previous studies, this cohort is sufficiently large to show a significant excess of de novo mutations in epileptic encephalopathy probands compared to the general population using a likelihood analysis (p = 8.2 × 10−4), supporting a prominent role for de novo mutations in epileptic encephalopathies. We bring statistical evidence that mutations in DNM1 cause epileptic encephalopathy, find suggestive evidence for a role of three additional genes, and show that at least 12% of analyzed individuals have an identifiable causal de novo mutation. Strikingly, 75% of mutations in these probands are predicted to disrupt a protein involved in regulating synaptic transmission, and there is a significant enrichment of de novo mutations in genes in this pathway in the entire cohort as well. These findings emphasize an important role for synaptic dysregulation in epileptic encephalopathies, above and beyond that caused by ion channel dysfunction.

365 citations


Proceedings ArticleDOI
03 Nov 2014
TL;DR: In this paper, the authors present an efficient method to deanonymize Bitcoin users by linking user pseudonyms to the IP addresses where the transactions are generated, and show that the natural countermeasure of using Tor or other anonymity services can be cut off by abusing anti-DoS countermeasures of the Bitcoin network.
Abstract: Bitcoin is a digital currency which relies on a distributed set of miners to mint coins and on a peer-to-peer network to broadcast transactions. The identities of Bitcoin users are hidden behind pseudonyms (public keys), which are recommended to be changed frequently in order to increase transaction unlinkability. We present an efficient method to deanonymize Bitcoin users that links user pseudonyms to the IP addresses where the transactions are generated. Our techniques work for the most common and most challenging scenario, in which users are behind the NATs or firewalls of their ISPs. They make it possible to link transactions of a user behind a NAT and to distinguish connections and transactions of different users behind the same NAT. We also show that the natural countermeasure of using Tor or other anonymity services can be cut off by abusing the anti-DoS countermeasures of the Bitcoin network. Our attacks require only a few machines and have been experimentally verified. The estimated success rate is between 11% and 60%, depending on how stealthy an attacker wants to be. We propose several countermeasures to mitigate these new attacks.
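The deanonymization idea rests on the observation that a client's set of entry nodes acts as a fingerprint: a transaction first relayed by nodes overlapping that set can be attributed to the client. A heavily simplified toy simulation (all peer and client names are made up; the real attack handles churn, noise, and many more peers):

```python
# Toy entry-node fingerprinting (illustrative only).
clients = {
    "client_A": {"n1", "n2", "n3", "n4"},
    "client_B": {"n3", "n5", "n6", "n7"},
    "client_C": {"n8", "n9", "n10", "n11"},
}

def match(first_relayers, clients):
    """Attribute a transaction to the client whose entry-node set
    best overlaps the nodes observed relaying it first."""
    return max(clients, key=lambda c: len(clients[c] & first_relayers))

# Attacker observed the transaction first from n2, n3 and n4.
print(match({"n2", "n3", "n4"}, clients))  # client_A
```

In the paper's setting, this matching is what lets an attacker distinguish different users even behind the same NAT, since each user maintains their own set of entry connections.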

357 citations


Journal ArticleDOI
TL;DR: This paper proposes a new secure outsourcing algorithm for (variable-exponent, variable-base) exponentiation modulo a prime in the two untrusted program model and proposes the first efficient outsource-secure algorithm for simultaneous modular exponentiations.
Abstract: With the rapid development of cloud services, the techniques for securely outsourcing the prohibitively expensive computations to untrusted servers are getting more and more attention in the scientific community. Exponentiations modulo a large prime have been considered the most expensive operations in discrete-logarithm-based cryptographic protocols, and they may be burdensome for the resource-limited devices such as RFID tags or smartcards. Therefore, it is important to present an efficient method to securely outsource such operations to (untrusted) cloud servers. In this paper, we propose a new secure outsourcing algorithm for (variable-exponent, variable-base) exponentiation modulo a prime in the two untrusted program model. Compared with the state-of-the-art algorithm, the proposed algorithm is superior in both efficiency and checkability. Based on this algorithm, we show how to achieve outsource-secure Cramer-Shoup encryptions and Schnorr signatures. We then propose the first efficient outsource-secure algorithm for simultaneous modular exponentiations. Finally, we provide the experimental evaluation that demonstrates the efficiency and effectiveness of the proposed outsourcing algorithms and schemes.
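The general flavor of outsourcing u^a mod p in a two-server model can be illustrated by a drastically simplified additive split of the secret exponent; this is only a sketch of the idea, not the paper's actual algorithm, which also blinds the base and adds checkability test queries:

```python
import random

p = 1_000_003                    # small prime, for illustration only
u = random.randrange(2, p)       # base
a = random.randrange(2, p - 1)   # secret exponent

# Client splits the exponent additively; each untrusted server sees
# only one share, which by itself reveals nothing about a's value.
a1 = random.randrange(1, a)
a2 = a - a1

r1 = pow(u, a1, p)   # computed by server 1
r2 = pow(u, a2, p)   # computed by server 2

# Client recombines with a single cheap multiplication:
# u^(a1+a2) = u^a (mod p).
result = (r1 * r2) % p
assert result == pow(u, a, p)
print("outsourced exponentiation verified")
```

The client's work drops to one multiplication plus share generation; the paper's contribution lies in doing this securely and checkably against malicious servers, which this sketch does not capture.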

Journal ArticleDOI
TL;DR: Information manipulation such as copying and erasing has thermodynamic implications; the authors develop a unified framework describing the thermodynamics of information processing and suggest that their analysis may be useful for understanding biological sensing.
Abstract: Information manipulation such as copying and erasing has associated thermodynamic implications. Scientists develop a unified framework describing the thermodynamics of information processing, suggesting that their analyses might be useful for biological sensing.
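A concrete anchor for the thermodynamic cost of erasure is the Landauer bound of k_B·T·ln 2 per erased bit; evaluating it at room temperature is standard physics, not a result of this paper:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # room temperature, K

# Minimum heat dissipated per erased bit (Landauer bound).
landauer_joules = k_B * T * math.log(2)
print(f"{landauer_joules:.3e} J per bit")  # ~2.871e-21 J
```

Frameworks like the one summarized above generalize such bounds to copying and measurement, which is what makes them relevant to sensing in biology.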

Journal ArticleDOI
TL;DR: The findings suggest that immune system processes (link to 6p21.3) and possibly lysosomal and autophagy pathways (link to 11q14) are potentially involved in FTD.
Abstract: Summary Background Frontotemporal dementia (FTD) is a complex disorder characterised by a broad range of clinical manifestations, differential pathological signatures, and genetic variability. Mutations in three genes (MAPT, GRN, and C9orf72) have been associated with FTD. We sought to identify novel genetic risk loci associated with the disorder. Methods We did a two-stage genome-wide association study on clinical FTD, analysing samples from 3526 patients with FTD and 9402 healthy controls. To reduce genetic heterogeneity, all participants were of European ancestry. In the discovery phase (samples from 2154 patients with FTD and 4308 controls), we did separate association analyses for each FTD subtype (behavioural variant FTD, semantic dementia, progressive non-fluent aphasia, and FTD overlapping with motor neuron disease [FTD-MND]), followed by a meta-analysis of the entire dataset. We carried forward replication of the novel suggestive loci in an independent sample series (samples from 1372 patients and 5094 controls) and then did joint-phase and brain expression and methylation quantitative trait loci analyses for the associated (p < 5 × 10−8) single-nucleotide polymorphisms. Findings We identified novel associations exceeding the genome-wide significance threshold (p < 5 × 10−8). Combined (joint) analyses of discovery and replication phases showed genome-wide significant association at the 6p21.3 HLA locus (immune system) for rs9268877 (p=1·05 × 10−8; odds ratio=1·204 [95% CI 1·11–1·30]), rs9268856 (p=5·51 × 10−9; 0·809 [0·76–0·86]) and rs1980493 (p=1·57 × 10−8; 0·775 [0·69–0·86]) in the entire cohort. We also identified a potential novel locus at 11q14, encompassing RAB38/CTSC (the transcripts of which are related to lysosomal biology), for the behavioural FTD subtype, for which joint analyses showed suggestive association for rs302668 (p=2·44 × 10−7; 0·814 [0·71–0·92]).
Analysis of expression and methylation quantitative trait loci data suggested that these loci might affect expression and methylation in cis. Interpretation Our findings suggest that immune system processes (link to 6p21.3) and possibly lysosomal and autophagy pathways (link to 11q14) are potentially involved in FTD. Our findings need to be replicated to better define the association of the newly identified loci with disease and to shed light on the pathomechanisms contributing to FTD. Funding The National Institute of Neurological Disorders and Stroke and National Institute on Aging, the Wellcome/MRC Centre on Parkinson's disease, Alzheimer's Research UK, and Texas Tech University Health Sciences Center.

Proceedings ArticleDOI
21 Jul 2014
TL;DR: The UL HPC facility and the services deployed on it form a complex computing system whose scale makes it challenging to manage; this paper covers the technical and administrative aspects of managing such an infrastructure.
Abstract: The intensive growth of processing power, data storage and transmission capabilities has revolutionized many aspects of science. These resources are essential to achieve high-quality results in many application areas. In this context, the University of Luxembourg (UL) has operated a High Performance Computing (HPC) facility and the related storage since 2007, run by a very small team. Bridging computing and storage is a requirement of the UL service; the reasons are both legal (certain data may not move) and performance related. Nowadays, people from the three faculties and the two interdisciplinary centres within the UL are users of this facility. More specifically, key research priorities such as Systems Biomedicine (by LCSB) and Security, Reliability & Trust (by SnT) require access to such HPC facilities in order to function in an adequate environment. The management of HPC solutions is a complex enterprise and a constant area for discussion and improvement. The UL HPC facility and the services deployed on it form a complex computing system to manage because of their scale: at the time of writing, the facility consists of 150 servers, 368 nodes (3880 computing cores) and 1996 TB of shared storage, all configured, monitored and operated by only three people using advanced IT automation solutions based on Puppet [1], FAI [2] and Capistrano [3]. This paper covers all aspects of managing such a complex infrastructure, whether technical or administrative. Most design choices and implemented approaches have been motivated by several years of experience in addressing research needs, mainly in the HPC area but also in complementary services (typically Web-based). In this context, we have tried to address many technological issues in a flexible and convenient way.
This experience report may be of interest for other research centres and universities, in the public or private sector, looking for good (if not best) practices in cluster architecture and management.

Journal ArticleDOI
TL;DR: In this article, the authors examined the psychometric properties of short scales (with three items) and single-item measures for two core motivational-affective constructs (i.e., academic anxiety and academic self-concept) by conducting systematic comparisons with corresponding long scales across school subjects and within different subject domains.

Journal ArticleDOI
TL;DR: This paper presents the current state of the art on attack and defense modeling approaches that are based on directed acyclic graphs (DAGs), and proposes a taxonomy of the described formalisms.

Journal ArticleDOI
TL;DR: ESC guidelines on diabetes, pre-diabetes, and cardiovascular diseases developed in collaboration with the EASD - summary.
Abstract: ESC guidelines on diabetes, pre-diabetes, and cardiovascular diseases developed in collaboration with the EASD - summary.

Journal ArticleDOI
TL;DR: A multiuser multiple-input single-output interference channel where the receivers are characterized by both quality-of-service (QoS) and radio-frequency (RF) energy harvesting (EH) constraints is considered.
Abstract: We consider a multiuser multiple-input single-output interference channel where the receivers are characterized by both quality-of-service (QoS) and radio-frequency (RF) energy harvesting (EH) constraints. We consider the power splitting RF-EH technique where each receiver divides the received signal into two parts: a) for information decoding and b) for battery charging. The minimum required power that supports both the QoS and the RF-EH constraints is formulated as an optimization problem that incorporates the transmitted power and the beamforming design at each transmitter as well as the power splitting ratio at each receiver. We consider both the cases of fixed beamforming and when the beamforming design is incorporated into the optimization problem. For fixed beamforming we study three standard beamforming schemes, the zero-forcing (ZF), the regularized zero-forcing (RZF) and the maximum ratio transmission (MRT); a hybrid scheme, MRT-ZF, comprised of a linear combination of MRT and ZF beamforming is also examined. The optimal solution for ZF beamforming is derived in closed-form, while optimization algorithms based on second-order cone programming are developed for MRT, RZF and MRT-ZF beamforming to solve the problem. In addition, the joint-optimization of beamforming and power allocation is studied using semidefinite programming (SDP) with the aid of rank relaxation.
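The power-splitting model described above has a simple closed form: a fraction ρ of the received power feeds the energy harvester and the remaining 1−ρ feeds the decoder. A numerical sketch with made-up link parameters (none of these values come from the paper):

```python
# Power-splitting SWIPT receiver (illustrative parameter values).
P_tx = 1.0      # transmit power, W
h_gain = 1e-3   # channel power gain |h|^2 (made up)
rho = 0.6       # power-splitting ratio routed to the energy harvester
eta = 0.5       # RF-to-DC conversion efficiency
noise = 1e-7    # receiver noise power, W

P_rx = P_tx * h_gain
harvested = eta * rho * P_rx       # power available for battery charging
snr = (1 - rho) * P_rx / noise     # SNR of the information-decoding branch

print(f"harvested = {harvested:.1e} W, SNR = {snr:.0f}")
```

The trade-off the optimization problem resolves is visible directly: raising ρ increases the harvested power but lowers the decoding SNR, so ρ must be chosen jointly with the transmit powers and beamformers.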

Journal ArticleDOI
TL;DR: In this paper, neutrophil tissue factor (TF) was found to be implicated in the thrombotic diathesis in AAV, and TF expression was assessed by immunoblotting and confocal microscopy.
Abstract: Objectives Antineutrophil cytoplasmic antibody (ANCA) associated vasculitis (AAV) is characterised by neutrophil activation. An elevated prevalence of venous thromboembolic events has been reported in AAV. Because of the critical role of neutrophils in inflammation associated thrombosis, we asked whether neutrophil tissue factor (TF) may be implicated in the thrombotic diathesis in AAV. Methods Neutrophils from four patients and sera from 17 patients with ANCA associated vasculitis with active disease and remission were studied. TF expression was assessed by immunoblotting and confocal microscopy. Circulating DNA levels were evaluated. TF expressing microparticles (MPs) were measured by flow cytometry and thrombin–antithrombin complex levels by ELISA. Results Peripheral blood neutrophils from four patients with active disease expressed elevated TF levels and released TF expressing neutrophil extracellular traps (NETs) and MPs. TF positive NETs were released by neutrophils isolated from the bronchoalveolar lavage and were detected in nasal and renal biopsy specimens. Elevated levels of circulating DNA and TF expressing neutrophil derived MPs were further observed in sera from patients with active disease. Induction of remission attenuated the aforementioned effects. Control neutrophils treated with sera from patients with active disease released TF bearing NETs and MPs which were abolished after IgG depletion. Treatment of control neutrophils with isolated IgG from sera from patients with active disease also resulted in the release of TF bearing NETs. TF implication in MP dependent thrombin generation was demonstrated by antibody neutralisation studies. Conclusions Expression of TF in NETs and neutrophil derived MPs proposes a novel mechanism for the induction of thrombosis and inflammation in active AAV.

Journal ArticleDOI
TL;DR: A computationally tractable, comprehensive molecular interaction map of Parkinson's disease that integrates pathways implicated in PD pathogenesis such as synaptic and mitochondrial dysfunction, impaired protein degradation, alpha-synuclein pathobiology and neuroinflammation is introduced.
Abstract: Parkinson's disease (PD) is a major neurodegenerative chronic disease, most likely caused by a complex interplay of genetic and environmental factors. Information on various aspects of PD pathogenesis is rapidly increasing and needs to be efficiently organized, so that the resulting data is available for exploration and analysis. Here we introduce a computationally tractable, comprehensive molecular interaction map of PD. This map integrates pathways implicated in PD pathogenesis such as synaptic and mitochondrial dysfunction, impaired protein degradation, alpha-synuclein pathobiology and neuroinflammation. We also present bioinformatics tools for the analysis, enrichment and annotation of the map, allowing the research community to open new avenues in PD research. The PD map is accessible at http://minerva.uni.lu/pd_map.

Journal ArticleDOI
TL;DR: This work presents fastcore, a generic algorithm for reconstructing context-specific metabolic network models from global genome-wide metabolic network models such as Recon X, and shows that a minimal consistent reconstruction can be defined via a set of sparse modes of the global network.
Abstract: Systemic approaches to the study of a biological cell or tissue rely increasingly on the use of context-specific metabolic network models. The reconstruction of such a model from high-throughput data can routinely involve large numbers of tests under different conditions and extensive parameter tuning, which calls for fast algorithms. We present fastcore, a generic algorithm for reconstructing context-specific metabolic network models from global genome-wide metabolic network models such as Recon X. fastcore takes as input a core set of reactions that are known to be active in the context of interest (e.g., cell or tissue), and it searches for a flux consistent subnetwork of the global network that contains all reactions from the core set and a minimal set of additional reactions. Our key observation is that a minimal consistent reconstruction can be defined via a set of sparse modes of the global network, and fastcore iteratively computes such a set via a series of linear programs. Experiments on liver data demonstrate speedups of several orders of magnitude, and significantly more compact reconstructions, over a rival method. Given its simplicity and its excellent performance, fastcore can form the backbone of many future metabolic network reconstruction algorithms.
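The core notion here is flux consistency: a reaction is consistent if some steady-state flux distribution (S v = 0, within bounds) sends non-negligible flux through it. A naive sketch of that check on a toy network, one linear program per reaction via scipy, is shown below. This illustrates the concept only, not the fastcore algorithm itself, whose contribution is computing a consistent subnetwork with far fewer LPs.

```python
import numpy as np
from scipy.optimize import linprog

def flux_consistent(S, lb, ub, eps=1e-4):
    """Indices of reactions able to carry flux >= eps at steady state
    (S v = 0): one LP per reaction and direction, maximising the flux
    through that reaction."""
    m, n = S.shape
    consistent = set()
    for i in range(n):
        for sign in (+1, -1):
            c = np.zeros(n)
            c[i] = -sign          # linprog minimises, so this maximises sign * v_i
            res = linprog(c, A_eq=S, b_eq=np.zeros(m),
                          bounds=list(zip(lb, ub)), method="highs")
            if res.status == 0 and sign * res.x[i] >= eps:
                consistent.add(i)
                break
    return sorted(consistent)

# Toy network: r0 imports A, r1 converts A -> B, r2 exports B,
# r3 converts B -> C but nothing consumes C, so r3 is blocked.
S = np.array([[ 1, -1,  0,  0],   # metabolite A
              [ 0,  1, -1, -1],   # metabolite B
              [ 0,  0,  0,  1]])  # metabolite C (dead end)
print(flux_consistent(S, lb=[0, 0, 0, 0], ub=[10, 10, 10, 10]))  # [0, 1, 2]
```

The dead-end reaction r3 is excluded because steady state forces its flux to zero; fastcore's input "core set" plays the role of reactions whose consistency must be preserved in the extracted subnetwork.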

Journal ArticleDOI
TL;DR: In this paper, the order-disorder transition in kesterite Cu2ZnSnSe4 (CZTSe), an interesting material for solar cells, has been investigated by spectrophotometry, photoluminescence (PL), and Raman spectroscopy.
Abstract: The order-disorder transition in kesterite Cu2ZnSnSe4 (CZTSe), an interesting material for solar cells, has been investigated by spectrophotometry, photoluminescence (PL), and Raman spectroscopy. Like Cu2ZnSnS4, CZTSe is prone to disorder by Cu-Zn exchanges depending on temperature. Absorption measurements have been used to monitor the changes in band gap energy (Eg) of solar cell grade thin films as a function of the annealing temperature. We show that ordering can increase Eg by 110 meV as compared to fully disordered material. Kinetics simulations show that Eg can be used as an order parameter and that the critical temperature for the CZTSe order-disorder transition is 200 ± 20 °C. Ordering was found to increase the correlation length of the crystal; on the other hand, apart from the change in Eg, it did not influence the PL signal of the CZTSe.

Journal ArticleDOI
TL;DR: A detailed solution to tackle the weighted max-min fair multigroup multicast problem under per-antenna power constraints is derived, and robust per-antenna constrained multigroup multicast beamforming solutions are proposed.
Abstract: A multiantenna transmitter that conveys independent sets of common data to distinct groups of users is considered. This model is known as physical layer multicasting to multiple cochannel groups. In this context, the practical constraint of a maximum permitted power level radiated by each antenna is addressed. The per-antenna power constrained system is optimized in a maximum fairness sense with respect to predetermined quality of service weights. In other words, the worst scaled user is boosted by maximizing its weighted signal-to-interference plus noise ratio. A detailed solution to tackle the weighted max-min fair multigroup multicast problem under per-antenna power constraints is therefore derived. The implications of the novel constraints are investigated via prominent applications and paradigms. What is more, robust per-antenna constrained multigroup multicast beamforming solutions are proposed. Finally, an extensive performance evaluation quantifies the gains of the proposed algorithm over existing solutions and exhibits its accuracy over per-antenna power constrained systems.
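With notation assumed here (precoder w_k for group k, channel h_i and noise power σ_i² of user i in group G_k, QoS weight γ_i, per-antenna limit P_n over N antennas), the weighted max-min fair problem described above is conventionally written as:

```latex
\max_{\{\mathbf{w}_k\}} \; \min_{k}\; \min_{i \in \mathcal{G}_k}\;
\frac{1}{\gamma_i}\,
\frac{|\mathbf{w}_k^{H}\mathbf{h}_i|^{2}}
     {\sum_{l \neq k} |\mathbf{w}_l^{H}\mathbf{h}_i|^{2} + \sigma_i^{2}}
\quad \text{s.t.} \quad
\Big[\textstyle\sum_{k} \mathbf{w}_k \mathbf{w}_k^{H}\Big]_{nn} \le P_n,
\quad n = 1,\dots,N,
```

where the n-th diagonal entry of the sum of precoder covariances is the average power radiated by antenna n. The per-antenna constraints replace the single sum-power constraint of the classical formulation, which is what makes the problem treated here harder.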

Journal ArticleDOI
TL;DR: In this study, using the fluctuation theorem, universal features of efficiency fluctuations are identified and it is found that the Carnot efficiency is, surprisingly, the least likely in the long time limit.
Abstract: Carnot efficiency is the highest theoretically possible efficiency that a heat engine can have. Verley et al. use the fluctuation theorem to show that the Carnot value is the least likely efficiency in the long time limit.

Posted Content
TL;DR: In this article, the authors analyzed the determinants of the choice of location of international students and developed a small theoretical model that identifies the various factors associated with the attraction of migrants as well as the costs of moving abroad.
Abstract: This paper analyzes the determinants of the choice of location of international students. Building on the documented trends in international migration of students, we develop a small theoretical model that identifies the various factors associated with the attraction of migrants as well as the costs of moving abroad. Using new data capturing the number of students from a large set of origin countries studying in a set of 13 OECD countries, we assess the importance of the various factors identified in the theory. We find support for a significant network effect in the migration of students, a result so far undocumented in the literature. We also find a significant role for cost factors such as housing prices and for attractiveness variables such as the reported quality of universities. In contrast, we do not find an important role for registration fees.
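A typical empirical specification consistent with this kind of model (the variable names here are illustrative, not the paper's) regresses bilateral student flows on network, cost, and attractiveness terms:

```latex
\ln M_{od} = \beta_0
+ \beta_1 \ln \mathrm{Network}_{od}
+ \beta_2 \ln \mathrm{HousingCost}_{d}
+ \beta_3 \ln \mathrm{UnivQuality}_{d}
+ \beta_4 \ln \mathrm{Fees}_{d}
+ \varepsilon_{od},
```

where M_{od} is the number of students from origin o studying in destination d. The reported findings correspond to β1 > 0 (network effect), β2 < 0 (housing costs deter), β3 > 0 (university quality attracts), and β4 not significantly different from zero (registration fees play no important role).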

Journal ArticleDOI
22 Apr 2014-PLOS ONE
TL;DR: The reported primer sets allow minimally biased assessment of eukaryotic diversity in different microbial ecosystems and were used to amplify 18S rDNA sequences from isolates and from a range of environmental samples.
Abstract: High-throughput sequencing of ribosomal RNA gene (rDNA) amplicons has opened up the door to large-scale comparative studies of microbial community structures. The short reads currently produced by massively parallel sequencing technologies make the choice of sequencing region crucial for accurate phylogenetic assignments. While for 16S rDNA, relevant regions have been well described, no truly systematic design of 18S rDNA primers aimed at resolving eukaryotic diversity has yet been reported. Here we used 31,862 18S rDNA sequences to design a set of broad-taxonomic range degenerate PCR primers. We simulated the phylogenetic information that each candidate primer pair would retrieve using paired- or single-end reads of various lengths, representing different sequencing technologies. Primer pairs targeting the V4 region performed best, allowing discrimination with paired-end reads as short as 150 bp (with 75% accuracy at genus level). The conditions for PCR amplification were optimised for one of these primer pairs and this was used to amplify 18S rDNA sequences from isolates as well as from a range of environmental samples which were then Illumina sequenced and analysed, revealing good concordance between expected and observed results. In summary, the reported primer sets will allow minimally biased assessment of eukaryotic diversity in different microbial ecosystems.
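Degenerate primers encode several exact sequences at once via IUPAC ambiguity codes. A minimal sketch of the two basic operations such a primer-design simulation needs — matching a degenerate primer against a template site and computing its degeneracy — is given below; the primer in the example is made up for illustration, not one of the paper's V4 primers.

```python
# IUPAC degenerate nucleotide codes -> the bases each code matches
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
         "K": "GT", "M": "AC", "B": "CGT", "D": "AGT",
         "H": "ACT", "V": "ACG", "N": "ACGT"}

def primer_matches(primer, site, max_mismatches=0):
    """True if a degenerate primer matches a same-length template
    site, allowing up to max_mismatches mismatching positions."""
    if len(primer) != len(site):
        return False
    mismatches = sum(base not in IUPAC[code]
                     for code, base in zip(primer, site))
    return mismatches <= max_mismatches

def degeneracy(primer):
    """Number of distinct exact sequences the degenerate primer encodes."""
    n = 1
    for code in primer:
        n *= len(IUPAC[code])
    return n

print(primer_matches("GTRYCA", "GTACCA"))  # True: R covers A, Y covers C
print(degeneracy("GTRYCA"))                # 4 (= 2 * 2 choices at R and Y)
```

Scanning candidate primers against a reference alignment with functions like these is the kind of in-silico step that underlies the paper's simulation of retrieved phylogenetic information per primer pair.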

Journal ArticleDOI
TL;DR: In this paper, Nitsche's method is used to couple non-conforming two- and three-dimensional non-uniform rational B-spline (NURBS) patches in the context of isogeometric analysis.
Abstract: We present a Nitsche-based method to couple non-conforming two- and three-dimensional non-uniform rational B-spline (NURBS) patches in the context of isogeometric analysis. We present results for linear elastostatics in two and three dimensions. The method can deal with surface-surface or volume-volume coupling, and we show how it can be used to handle heterogeneities such as inclusions. We also present preliminary results on modal analysis. This simple coupling method has the potential to increase the applicability of NURBS-based isogeometric analysis for practical applications.
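For reference, one common symmetric variant of Nitsche's interface coupling (notation assumed here, not taken from the paper) augments the elasticity bilinear form with consistency, symmetry, and penalty terms on the patch interface Γ:

```latex
a(\mathbf{u},\mathbf{v}) =
\sum_{i=1}^{2} \int_{\Omega_i}
  \boldsymbol{\sigma}(\mathbf{u}) : \boldsymbol{\varepsilon}(\mathbf{v})\, d\Omega
- \int_{\Gamma} \{\boldsymbol{\sigma}(\mathbf{u})\,\mathbf{n}\} \cdot [\![\mathbf{v}]\!]\, d\Gamma
- \int_{\Gamma} [\![\mathbf{u}]\!] \cdot \{\boldsymbol{\sigma}(\mathbf{v})\,\mathbf{n}\}\, d\Gamma
+ \frac{\gamma}{h} \int_{\Gamma} [\![\mathbf{u}]\!] \cdot [\![\mathbf{v}]\!]\, d\Gamma,
```

where {·} denotes the average and [[·]] the jump across Γ, h is a characteristic element size, and γ is a stabilisation parameter chosen large enough to restore coercivity. Unlike mortar or Lagrange-multiplier coupling, this adds no extra unknowns, which is why the approach stays simple for non-conforming NURBS patches.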

Journal ArticleDOI
TL;DR: This paper proposes and studies three schemes that enable joint information and energy cooperation between the primary and secondary systems, and reveals that the power splitting scheme can achieve a larger rate region than the time splitting scheme when the efficiency of the energy transfer is sufficiently large.
Abstract: Cooperation between the primary and secondary systems can improve the spectrum efficiency in cognitive radio networks. The key idea is that the secondary system helps to boost the primary system's performance by relaying, and, in return, the primary system provides more opportunities for the secondary system to access the spectrum. In contrast to most of existing works that only consider information cooperation, this paper studies joint information and energy cooperation between the two systems, i.e., the primary transmitter sends information for relaying and feeds the secondary system with energy as well. This is particularly useful when the secondary transmitter has good channel quality to the primary receiver but is energy constrained. We propose and study three schemes that enable this cooperation. First, we assume there exists an ideal backhaul between the two systems for information and energy transfer. We then consider two wireless information and energy transfer schemes from the primary transmitter to the secondary transmitter using power splitting and time splitting energy harvesting techniques, respectively. For each scheme, the optimal and zero-forcing solutions are derived. Simulation results demonstrate promising performance gain for both systems due to the additional energy cooperation. It is also revealed that the power splitting scheme can achieve larger rate region than the time splitting scheme when the efficiency of the energy transfer is sufficiently large.
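For the power splitting scheme, a standard textbook model (assumed here; the paper's actual schemes additionally optimise the transmit precoders) splits the power received at the secondary transmitter: a fraction ρ is harvested with conversion efficiency η and the remaining 1 − ρ feeds the information decoder.

```python
import math

def power_splitting(P_tx, h_gain, rho, eta=0.5, noise=1e-9):
    """Power-splitting receiver model: fraction rho of the received
    power goes to the energy harvester (efficiency eta), fraction
    1 - rho to the decoder.  Returns (rate in bits/s/Hz,
    harvested power in watts)."""
    P_rx = P_tx * h_gain                             # received signal power
    rate = math.log2(1 + (1 - rho) * P_rx / noise)   # decoding SNR uses 1 - rho
    harvested = eta * rho * P_rx                     # harvested power uses rho
    return rate, harvested

# Trade-off: a larger rho harvests more energy but lowers the decoding rate.
rate, energy = power_splitting(P_tx=1.0, h_gain=1e-6, rho=0.5)
```

Sweeping ρ from 0 to 1 traces the elementary rate-energy trade-off that the paper's rate-region comparison between power splitting and time splitting builds on.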