Showing papers by "University of Luxembourg" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4, +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
17 Nov 2016-Cell
TL;DR: Dietary fiber deprivation, together with a fiber-deprived, mucus-eroding microbiota, promotes greater epithelial access and lethal colitis by the mucosal pathogen, Citrobacter rodentium.

1,689 citations


Journal ArticleDOI
TL;DR: The present paper analyzes in detail the potential of 5G technologies for the IoT, considering both the technological and standardization aspects, and illustrates the massive business shifts that a tight link between IoT and 5G may cause in the operator and vendor ecosystem.
Abstract: The IoT paradigm holds the promise to revolutionize the way we live and work by means of a wealth of new services, based on seamless interactions among a large number of heterogeneous devices. After decades of conceptual inception of the IoT, in recent years a large variety of communication technologies has gradually emerged, reflecting a large diversity of application domains and of communication requirements. Such heterogeneity and fragmentation of the connectivity landscape is currently hampering the full realization of the IoT vision, by posing several complex integration challenges. In this context, the advent of 5G cellular systems, with the availability of a connectivity technology that is at once truly ubiquitous, reliable, scalable, and cost-efficient, is considered a potentially key driver for the yet-to-emerge global IoT. In the present paper, we analyze in detail the potential of 5G technologies for the IoT, by considering both the technological and standardization aspects. We review the present-day IoT connectivity landscape, as well as the main 5G enablers for the IoT. Last but not least, we illustrate the massive business shifts that a tight link between IoT and 5G may cause in the operator and vendor ecosystem.

1,224 citations


Journal ArticleDOI
Kurt Lejaeghere1, Gustav Bihlmayer2, Torbjörn Björkman3, Torbjörn Björkman4, Peter Blaha5, Stefan Blügel2, Volker Blum6, Damien Caliste7, Ivano E. Castelli8, Stewart J. Clark9, Andrea Dal Corso10, Stefano de Gironcoli10, Thierry Deutsch7, J. K. Dewhurst11, Igor Di Marco12, Claudia Draxl13, Claudia Draxl14, Marcin Dulak15, Olle Eriksson12, José A. Flores-Livas11, Kevin F. Garrity16, Luigi Genovese7, Paolo Giannozzi17, Matteo Giantomassi18, Stefan Goedecker19, Xavier Gonze18, Oscar Grånäs20, Oscar Grånäs12, E. K. U. Gross11, Andris Gulans13, Andris Gulans14, Francois Gygi21, D. R. Hamann22, P. J. Hasnip23, Natalie Holzwarth24, Diana Iusan12, Dominik B. Jochym25, F. Jollet, Daniel M. Jones26, Georg Kresse27, Klaus Koepernik28, Klaus Koepernik29, Emine Kucukbenli8, Emine Kucukbenli10, Yaroslav Kvashnin12, Inka L. M. Locht30, Inka L. M. Locht12, Sven Lubeck13, Martijn Marsman27, Nicola Marzari8, Ulrike Nitzsche28, Lars Nordström12, Taisuke Ozaki31, Lorenzo Paulatto32, Chris J. Pickard33, Ward Poelmans1, Matt Probert23, Keith Refson25, Keith Refson34, Manuel Richter28, Manuel Richter29, Gian-Marco Rignanese18, Santanu Saha19, Matthias Scheffler14, Matthias Scheffler35, Martin Schlipf21, Karlheinz Schwarz5, Sangeeta Sharma11, Francesca Tavazza16, Patrik Thunström5, Alexandre Tkatchenko36, Alexandre Tkatchenko14, Marc Torrent, David Vanderbilt22, Michiel van Setten18, Veronique Van Speybroeck1, John M. Wills37, Jonathan R. Yates26, Guo-Xu Zhang38, Stefaan Cottenier1 
25 Mar 2016-Science
TL;DR: A procedure to assess the precision of DFT methods was devised and used to demonstrate reproducibility among many of the most widely used DFT codes, showing that the precision of DFT implementations can be determined even in the absence of one absolute reference code.
Abstract: The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
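
As a reading aid: the precision measure used in this comparison effort, the Δ gauge introduced by Lejaeghere and co-workers, is (up to conventions such as the exact volume window, typically a few percent around the equilibrium volume) the root-mean-square difference between the equations of state predicted by two codes a and b for the same crystal,

\[ \Delta = \sqrt{ \frac{\int_{\Delta V} \left( E_b(V) - E_a(V) \right)^2 \, \mathrm{d}V}{\Delta V} } , \]

averaged over the benchmark set of elemental crystals.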

1,141 citations


Proceedings ArticleDOI
14 May 2016
TL;DR: This work presents AndroZoo, a growing collection of Android applications collected from several sources, including the official Google Play app market; it contains more than three million apps, each analysed by tens of different antivirus products to determine which are detected as malware.
Abstract: We present a growing collection of Android Applications collected from several sources, including the official Google Play app market. Our dataset, AndroZoo, currently contains more than three million apps, each of which has been analysed by tens of different AntiVirus products to know which applications are detected as Malware. We provide this dataset to contribute to ongoing research efforts, as well as to enable new potential research topics on Android Apps. By releasing our dataset to the research community, we also aim at encouraging our fellow researchers to engage in reproducible experiments.
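
A minimal sketch of how one might retrieve a sample from AndroZoo programmatically. The endpoint and parameter names follow the project's public documentation at androzoo.uni.lu at the time of writing, but treat them (and the placeholder key and hash) as assumptions to verify against the current docs.

```python
# Minimal sketch: download one APK from AndroZoo by its SHA-256 hash.
# Endpoint and parameter names follow the project's public documentation
# (https://androzoo.uni.lu) at the time of writing -- verify before use.
import requests

API_KEY = "YOUR_ANDROZOO_API_KEY"  # issued on request by the AndroZoo team
SHA256 = "PUT_SAMPLE_SHA256_HERE"  # hypothetical placeholder, not a real sample

resp = requests.get(
    "https://androzoo.uni.lu/api/download",
    params={"apikey": API_KEY, "sha256": SHA256},
    timeout=120,
)
resp.raise_for_status()  # fail loudly on a bad key, unknown hash, etc.

with open(f"{SHA256}.apk", "wb") as f:
    f.write(resp.content)  # the response body is the raw APK
```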

616 citations


Journal ArticleDOI
Anthony M. Reilly1, Richard I. Cooper2, Claire S. Adjiman3, Saswata Bhattacharya4, A. Daniel Boese5, Jan Gerit Brandenburg6, Peter J. Bygrave7, Rita Bylsma8, J.E. Campbell7, Roberto Car9, David H. Case7, Renu Chadha10, Jason C. Cole1, Katherine Cosburn11, Katherine Cosburn12, Herma M. Cuppen8, Farren Curtis12, Farren Curtis13, Graeme M. Day7, Robert A. DiStasio9, Robert A. DiStasio14, Alexander Dzyabchenko, Bouke P. van Eijck15, Dennis M. Elking16, Joost A. van den Ende8, Julio C. Facelli17, Marta B. Ferraro18, Laszlo Fusti-Molnar16, Christina-Anna Gatsiou3, Thomas S. Gee7, René de Gelder8, Luca M. Ghiringhelli4, Hitoshi Goto19, Stefan Grimme6, Rui Guo20, D. W. M. Hofmann21, Johannes Hoja4, Rebecca K. Hylton20, Luca Iuzzolino20, Wojciech Jankiewicz22, Daniël T. de Jong8, John Kendrick1, Niek J. J. de Klerk8, Hsin-Yu Ko9, L. N. Kuleshova, Xiayue Li12, Xiayue Li23, Sanjaya Lohani12, Frank J. J. Leusen1, Albert M. Lund17, Albert M. Lund16, Jian Lv4, Yanming Ma4, Noa Marom13, Noa Marom12, Artëm E. Masunov, Patrick McCabe1, David P. McMahon7, Hugo Meekes8, Michael P. Metz10, Alston J. Misquitta12, Sharmarke Mohamed11, Bartomeu Monserrat24, Richard J. Needs13, Marcus A. Neumann, Jonas Nyman7, Shigeaki Obata19, Harald Oberhofer15, Artem R. Oganov, Anita M. Orendt17, Gabriel Ignacio Pagola18, Constantinos C. Pantelides3, Chris J. Pickard20, Chris J. Pickard1, Rafał Podeszwa22, Louise S. Price20, Sarah L. Price20, Angeles Pulido7, Murray G. Read1, Karsten Reuter15, Elia Schneider20, Christoph Schober15, Gregory P. Shields1, Pawanpreet Singh10, Isaac J. Sugden3, Krzysztof Szalewicz10, Christopher R. Taylor7, Alexandre Tkatchenko25, Alexandre Tkatchenko26, Mark E. Tuckerman27, Mark E. Tuckerman28, Mark E. Tuckerman29, Francesca Vacarro30, Francesca Vacarro12, Manolis Vasileiadis3, Álvaro Vázquez-Mayagoitia2, Leslie Vogt20, Yanchao Wang4, Rona E. Watson20, Gilles A. de Wijs8, Jack Yang7, Qiang Zhu16, Colin R. Groom1 
TL;DR: The results of the sixth blind test of organic crystal structure prediction methods are presented and discussed, highlighting progress for salts, hydrates and bulky flexible molecules, as well as on-going challenges.
Abstract: The sixth blind test of organic crystal structure prediction (CSP) methods has been held, with five target systems: a small nearly rigid molecule, a polymorphic former drug candidate, a chloride salt hydrate, a co-crystal and a bulky flexible molecule. This blind test has seen substantial growth in the number of participants, with the broad range of prediction methods giving a unique insight into the state of the art in the field. Significant progress has been seen in treating flexible molecules, usage of hierarchical approaches to ranking structures, the application of density-functional approximations, and the establishment of new workflows and 'best practices' for performing CSP calculations. All of the targets, apart from a single potentially disordered Z' = 2 polymorph of the drug candidate, were predicted by at least one submission. Despite many remaining challenges, it is clear that CSP methods are becoming more applicable to a wider range of real systems, including salts, hydrates and larger flexible molecules. The results also highlight the potential for CSP calculations to complement and augment experimental studies of organic solid forms.

435 citations


Journal ArticleDOI
TL;DR: The ability of HuMiX to recapitulate in vivo transcriptional, metabolic and immunological responses in human intestinal epithelial cells following their co-culture with the commensal Lactobacillus rhamnosus GG (LGG) grown under anaerobic conditions is demonstrated.
Abstract: We thank the scientists and technical staff of the Luxembourg Centre for Systems Biomedicine and the Center for Applied Nanobioscience and Medicine, particularly Matthew Barrett and Brett Duane for their excellent technical assistance and engineering support. We are grateful to Francois Bernardin, Nathalie Nicot and Laurent Vallar for the microarray analysis; Aidos Baumuratov for imaging support; Linda Wampach for HuMiX illustrations; and Anna Heintz-Buschart for fruitful discussions. This work was supported by an ATTRACT programme grant (ATTRACT/A09/03), a CORE programme grant (CORE/11/BM/1186762), a European Union Joint Programming in Neurodegenerative Diseases grant (INTER/JPND/12/01) and a Proof-of-Concept grant (PoC-15/11014639) to PW; an Accompany Measures mobility grant (12/AM2c/05) to PW and PS; an INTER mobility grant to PS (INTER/14/7516918); and an Aide a la Formation Recherche (AFR) postdoctoral grant (AFR/PDR 2013-1/BM/5821107) as well as a CORE programme grant (CORE/14/BM/8066232) to JVF, all funded by the Luxembourg National Research Fund (FNR). This work was further supported by a grant attributed to CS-D by the 'Fondation Recherche sur le SIDA du Luxembourg'. Bioinformatics analyses presented in this paper were carried out in part using the HPC facilities of the University of Luxembourg (http://hpc.uni.lu).

428 citations


Journal ArticleDOI
TL;DR: The narrow range of chemical analyses in current use by the medical community today will be replaced in the future by analyses that reveal a far more comprehensive metabolic signature, expected to describe global biochemical aberrations that reflect patterns of variance in states of wellness, more accurately describe specific diseases and their progression, and greatly aid in differential diagnosis.
Abstract: Metabolomics is the comprehensive study of the metabolome, the repertoire of biochemicals (or small molecules) present in cells, tissues, and body fluids. The study of metabolism at the global or "-omics" level is a rapidly growing field that has the potential to have a profound impact upon medical practice. At the center of metabolomics is the concept that a person's metabolic state provides a close representation of that individual's overall health status. This metabolic state reflects what has been encoded by the genome, and modified by diet, environmental factors, and the gut microbiome. The metabolic profile provides a quantifiable readout of biochemical state from normal physiology to diverse pathophysiologies in a manner that is often not obvious from gene expression analyses. Today, clinicians capture only a very small part of the information contained in the metabolome, as they routinely measure only a narrow set of blood chemistry analytes to assess health and disease states. Examples include measuring glucose to monitor diabetes, measuring cholesterol and high density lipoprotein/low density lipoprotein ratio to assess cardiovascular health, BUN and creatinine for renal disorders, and measuring a panel of metabolites to diagnose potential inborn errors of metabolism in neonates. We anticipate that the narrow range of chemical analyses in current use by the medical community today will be replaced in the future by analyses that reveal a far more comprehensive metabolic signature. This signature is expected to describe global biochemical aberrations that reflect patterns of variance in states of wellness, more accurately describe specific diseases and their progression, and greatly aid in differential diagnosis. Such future metabolic signatures will: (1) provide predictive, prognostic, diagnostic, and surrogate markers of diverse disease states; (2) inform on underlying molecular mechanisms of diseases; (3) allow for sub-classification of diseases, and stratification of patients based on metabolic pathways impacted; (4) reveal biomarkers for drug response phenotypes, providing an effective means to predict variation in a subject's response to treatment (pharmacometabolomics); (5) define a metabotype for each specific genotype, offering a functional read-out for genetic variants; (6) provide a means to monitor response and recurrence of diseases, such as cancers; (7) describe the molecular landscape in human performance applications and extreme environments. Importantly, sophisticated metabolomic analytical platforms and informatics tools have recently been developed that make it possible to measure thousands of metabolites in blood, other body fluids, and tissues. Such tools also enable more robust analysis of response to treatment. New insights have been gained about mechanisms of diseases, including neuropsychiatric disorders, cardiovascular disease, cancers, diabetes and a range of pathologies. A series of groundbreaking studies supported by the National Institutes of Health (NIH) through the Pharmacometabolomics Research Network and its partnership with the Pharmacogenomics Research Network illustrate how a patient's metabotype at baseline, prior to treatment, during treatment, and post-treatment, can inform about treatment outcomes and variations in responsiveness to drugs (e.g., statins, antidepressants, antihypertensives and antiplatelet therapies).
These studies, along with several others, also exemplify how metabolomics data can complement and inform genetic data in defining ethnic, sex, and gender bases for variation in responses to treatment, which illustrates how pharmacometabolomics and pharmacogenomics are complementary and powerful tools for precision medicine. Our metabolomics community believes that inclusion of metabolomics data in precision medicine initiatives is timely and will provide an extremely valuable layer of data that complements and informs other data obtained by these important initiatives. Our Metabolomics Society, through its "Precision Medicine and Pharmacometabolomics Task Group", with input from our metabolomics community at large, has developed this White Paper, where we discuss the value and approaches for including metabolomics data in large precision medicine initiatives. This White Paper offers recommendations for the selection of state-of-the-art metabolomics platforms and approaches that offer the widest biochemical coverage, and considers critical sample collection and preservation, as well as standardization of measurements, among other important topics. We anticipate that our metabolomics community will have representation in large precision medicine initiatives to provide input with regard to sample acquisition/preservation, selection of optimal omics technologies, and key issues regarding data collection, interpretation, and dissemination. We strongly recommend the collection and biobanking of samples for precision medicine initiatives that will take into consideration needs for large-scale metabolic phenotyping studies.

403 citations


Proceedings ArticleDOI
01 Feb 2016
TL;DR: This paper presents a novel website fingerprinting attack that outperforms all state-of-the-art methods in terms of classification accuracy while being computationally dramatically more efficient and shows that no existing method scales when applied in realistic settings.
Abstract: The website fingerprinting attack aims to identify the content (i.e., a webpage accessed by a client) of encrypted and anonymized connections by observing patterns of data flows such as packet size and direction. This attack can be performed by a local passive eavesdropper – one of the weakest adversaries in the attacker model of anonymization networks such as Tor. In this paper, we present a novel website fingerprinting attack. Based on a simple and comprehensible idea, our approach outperforms all state-of-the-art methods in terms of classification accuracy while being computationally dramatically more efficient. In order to evaluate the severity of the website fingerprinting attack in reality, we collected the most representative dataset that has ever been built, where we avoid simplified assumptions made in the related work regarding selection and type of webpages and the size of the universe. Using this data, we explore the practical limits of website fingerprinting at Internet scale. Although our novel approach is by orders of magnitude computationally more efficient and superior in terms of detection accuracy, for the first time we show that no existing method – including our own – scales when applied in realistic settings. With our analysis, we explore neglected aspects of the attack and investigate the realistic probability of success for different strategies a real-world adversary may follow.
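
To make the notion of "patterns of data flows such as packet size and direction" concrete, here is a small sketch of cumulative-trace features in the spirit of the paper's approach: signed packet sizes are accumulated and resampled to a fixed-length vector for a standard classifier. This is an illustration under our own simplifications, not the authors' exact pipeline.

```python
# Sketch: fixed-length cumulative features from a packet trace.
# Signed sizes: positive = outgoing, negative = incoming.
import numpy as np

def cumulative_features(packet_sizes, n_points=100):
    """Resample the cumulative sum of signed packet sizes to n_points values."""
    cum = np.cumsum(packet_sizes, dtype=float)
    x_old = np.linspace(0.0, 1.0, num=len(cum))
    x_new = np.linspace(0.0, 1.0, num=n_points)
    # Linear interpolation makes traces of different lengths comparable.
    return np.interp(x_new, x_old, cum)

# Hypothetical toy trace: small outgoing requests, large incoming bursts.
trace = [565, -1448, -1448, 353, -1448, -1448, -1448, 302, -1448]
features = cumulative_features(trace)
# `features` can then be fed to an off-the-shelf classifier,
# e.g. sklearn.svm.SVC(kernel="rbf"), to label the visited website.
```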

388 citations


Journal ArticleDOI
TL;DR: This review summarizes recent progress in the study of electromagnetic fluctuation-induced interactions, with emphasis on theoretical and computational developments and their applications to materials including molecular structures, Dirac-like systems, optical metamaterials, composites with nontrivial boundary conditions, and biological matter.
Abstract: Electromagnetic fluctuation-induced interactions known as van der Waals, Casimir, and Casimir-Polder forces are an active and exciting area of research. This review summarizes recent progress in this field with emphasis on theoretical and computational developments and their applications to materials including molecular structures, Dirac-like systems, optical metamaterials, composites with nontrivial boundary conditions, and biological matter.

311 citations


Journal ArticleDOI
TL;DR: It is demonstrated that itaconate acts as an endogenous succinate dehydrogenase inhibitor to cause succinate accumulation, a finding that links the innate immune response and tricarboxylic acid metabolism to the function of the electron transport chain.

Journal ArticleDOI
TL;DR: An up-to-date review on the coupling of polarization with optical properties in ferroelectrics, highlighting several important issues and parameters, such as the role of domain walls, ways to tune the bandgap, consequences arising from the polarization switchability, and the roles of defects and contact electrodes, as well as the downscaling effects.
Abstract: Ferroelectrics carry a switchable spontaneous electric polarization. This polarization is usually coupled to strain, making ferroelectrics good piezoelectrics. When coupled to magnetism, they become so-called multiferroic systems, a field that has been widely investigated since 2003. While ferroelectrics are birefringent and non-linear optically transparent materials, the coupling of polarization with optical properties has received, since 2009, renewed attention, triggered notably by low-bandgap ferroelectrics suitable for sunlight spectrum absorption and original photovoltaic effects. Consequently, power conversion efficiencies up to 8.1% were recently achieved and values of 19.5% were predicted, making photoferroelectrics promising photovoltaic alternatives. This article aims at providing an up-to-date review on this emerging and rapidly progressing field by highlighting several important issues and parameters, such as the role of domain walls, ways to tune the bandgap, consequences arising from the polarization switchability, and the role of defects and contact electrodes, as well as the downscaling effects. Beyond photovoltaicity, other polarization-related processes are also described, like light-induced deformation (photostriction) or light-assisted chemical reaction (photochemistry). It is hoped that this overview will encourage further avenues to be explored and challenged and, as a byproduct, will inspire other research communities in material science, e.g., so-called hybrid halide perovskites.

Journal ArticleDOI
TL;DR: A literature review of 190 application papers published between 2004 and 2016, classified on the basis of application area, identified theme, year of publication, and so forth, shows that FAHP is used primarily in the Manufacturing, Industry and Government sectors.
Abstract: A state-of-the-art survey of FAHP applications is carried out: 190 papers are reviewed. Papers are classified based on their application area, theme, year, country, etc. The review is summarized in tabular formats/charts to help readers extract quick information. Results and findings are made available through an online (free) testbed. The testbed makes fuzzy pairwise comparison matrices (from all papers) available. As a practical and popular methodology for dealing with fuzziness and uncertainty in Multiple Criteria Decision-Making (MCDM), Fuzzy AHP (FAHP) has been applied to a wide range of applications. Since, at the time of writing, there is no state-of-the-art survey of FAHP, we carry out a literature review of 190 application papers (i.e., applied research papers), published between 2004 and 2016, by classifying them on the basis of the area of application, the identified theme, the year of publication, and so forth. The identified themes and application areas have been chosen based upon the latest state-of-the-art survey of AHP conducted by Vaidya, O., & Kumar, S. (2006). Analytic hierarchy process: An overview of applications. European Journal of Operational Research, 169(1), 1-29. To help readers extract quick and meaningful information, the reviewed papers are summarized in various tabular formats and charts. Unlike previous literature surveys, results and findings are made available through an online (and free) testbed, which can serve as a ready reference for those who wish to apply, modify or extend FAHP in various application areas. This online testbed also makes available one or more fuzzy pairwise comparison matrices (FPCMs) from all the reviewed papers (255 matrices in total). In terms of results and findings, this survey shows that: (i) FAHP is used primarily in the Manufacturing, Industry and Government sectors; (ii) Asia is the torchbearer in this field, where FAHP is mostly applied in the theme areas of Selection and Evaluation; (iii) a significant amount of research papers (43% of the reviewed literature) combine FAHP with other tools, particularly with TOPSIS, QFD and ANP (AHP's variant); (iv) Chang's extent analysis method, which is used for FPCMs' weight derivation in FAHP, is still the most popular method in spite of a number of criticisms in recent years (considered in 57% of the reviewed literature).
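
Since the abstract assumes familiarity with how weights are derived from a fuzzy pairwise comparison matrix, here is a compact sketch using a Buckley-style fuzzy geometric mean, one common alternative to Chang's extent analysis method discussed above. The 3x3 matrix of triangular fuzzy numbers is a toy example, not taken from any reviewed paper.

```python
# Sketch: crisp weights from a fuzzy pairwise comparison matrix of
# triangular fuzzy numbers (l, m, u), via a Buckley-style fuzzy
# geometric mean. Toy values for three criteria; illustrative only.
import numpy as np

A = np.array([
    [(1, 1, 1),       (2, 3, 4),      (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),      (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1),  (1, 1, 1)],
])  # shape (criteria, criteria, 3)

# Fuzzy geometric mean of each row, componentwise over (l, m, u).
r = A.prod(axis=1) ** (1.0 / A.shape[0])

# Fuzzy weights: r_i (x) (sum_j r_j)^(-1); inverting a TFN swaps l and u.
s = r.sum(axis=0)
w_fuzzy = r * np.array([1 / s[2], 1 / s[1], 1 / s[0]])

# Centroid defuzzification, then normalization to a crisp priority vector.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(w)
```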

Journal ArticleDOI
TL;DR: The proposed architecture incorporates deep belief networks for traffic and weather prediction and a decision-level data fusion scheme to enhance prediction accuracy using weather conditions; the experimental results corroborate its effectiveness compared with the state of the art.
Abstract: Transportation systems might be heavily affected by factors such as accidents and weather. Specifically, inclement weather conditions may have a drastic impact on travel time and traffic flow. This study has two objectives: first, to investigate a correlation between weather parameters and traffic flow and, second, to improve traffic flow prediction by proposing a novel holistic architecture. It incorporates deep belief networks for traffic and weather prediction and a decision-level data fusion scheme to enhance prediction accuracy using weather conditions. The experimental results, using traffic and weather data originated from the San Francisco Bay Area of California, corroborate the effectiveness of the proposed approach compared with the state of the art.
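
To illustrate the decision-level fusion idea in the abstract (the paper's own predictors are deep belief networks; the simple least-squares combiner below is our assumption for illustration):

```python
# Sketch: decision-level fusion of a traffic-only and a weather-aware
# predictor, combined with weights fitted on held-out data.
import numpy as np

def fuse_weights(pred_a, pred_b, target):
    """Least-squares combination weights for two prediction vectors."""
    X = np.column_stack([pred_a, pred_b])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w

# Hypothetical validation-set forecasts (vehicles/hour) and ground truth.
traffic_only  = np.array([410.0, 523.0, 388.0, 610.0])
weather_aware = np.array([395.0, 540.0, 402.0, 598.0])
observed      = np.array([400.0, 535.0, 398.0, 601.0])

w = fuse_weights(traffic_only, weather_aware, observed)
fused = w[0] * traffic_only + w[1] * weather_aware  # fused forecast
```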

Journal ArticleDOI
TL;DR: In this article, the roles of anionic (S/Se) distribution and cationic (Cu/Zn) disorder in the open-circuit voltage (Voc) and the ultimate photovoltaic performance of kesterite devices are clarified.
Abstract: Photovoltaic thin film solar cells based on kesterite Cu2ZnSn(Sx,Se1–x)4 compounds (CZTSSe) have reached >12% sunlight-to-electricity conversion efficiency. This is still far from the >20% record devices known in Cu(In1–y,Gay)Se2 and CdTe parent technologies. A selection of >9% CZTSSe devices reported in the literature is examined to review the progress achieved over the past few years. These devices suffer from a low open-circuit voltage (Voc), never better than 60% of the maximum Voc expected from the Shockley-Queisser radiative limit (S-Q limit). The possible roles of anionic (S/Se) distribution and of cationic (Cu/Zn) disorder in the Voc deficit and in the ultimate photovoltaic performance of kesterite devices are clarified here. While the S/Se anionic distribution is expected to be homogeneous for any ratio x, some grain-to-grain and other non-uniformity over larger areas can be found, as quantified on our CZTSSe films. Nevertheless, these anionic distributions can be considered to have a negligible impact on the Voc deficit. On the Cu/Zn order side, even though significant bandgap changes (>10%) can be observed, a similar conclusion is drawn from experimental devices and from calculations, still within the radiative S-Q limit. The implications and future ways for improvement are discussed.
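
As a reading aid: the voltage deficit discussed above compares the measured open-circuit voltage with the value expected at the absorber band gap in the Shockley-Queisser radiative limit. A frequently used proxy (our notation, not the paper's own definition) is

\[ V_{\mathrm{oc}}^{\mathrm{def}} = \frac{E_g}{q} - V_{\mathrm{oc}}, \]

so that the devices surveyed here satisfy \( V_{\mathrm{oc}} \lesssim 0.6\, V_{\mathrm{oc}}^{\mathrm{SQ}} \), where \( V_{\mathrm{oc}}^{\mathrm{SQ}} \) denotes the radiative-limit value.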

Journal ArticleDOI
TL;DR: The use of bilateral data for the analysis of international migration is at the same time a blessing and a curse as mentioned in this paper, since the dyadic dimension of the data enables researchers to analyze many previously unaddressed questions in the literature.
Abstract: The use of bilateral data for the analysis of international migration is at the same time a blessing and a curse. It is a blessing since the dyadic dimension of the data allows researchers to analyze many previously unaddressed questions in the literature. This paper reviews some of the recent studies using this type of data in a gravity framework in order to identify important factors affecting international migration flows. Our review demonstrates that considerable efforts have been conducted by many scholars and that overall we have a much better knowledge of the relevant determinants. Still, the use of bilateral data is also a curse. The methodological challenges that are implied by the use of this type of data are numerous, and our paper covers some of the most significant ones. These include sound theoretical foundations, accounting for multilateral resistance to migration, as well as the choice of appropriate estimation techniques dealing with the nature of the migration data and with endogeneity concerns.
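
For readers unfamiliar with the gravity framework mentioned above, a canonical log-linear specification (notation ours, written only as a reading aid) is

\[ \ln m_{ij} = \beta_0 + \beta_1 \ln y_i + \beta_2 \ln y_j + \beta_3 \ln d_{ij} + \boldsymbol{\gamma}'\mathbf{z}_{ij} + \alpha_i + \alpha_j + \varepsilon_{ij}, \]

where \( m_{ij} \) is the migration flow from origin \( i \) to destination \( j \), \( y_i, y_j \) are income levels, \( d_{ij} \) is bilateral distance, \( \mathbf{z}_{ij} \) collects bilateral covariates such as common language or migrant networks, and the origin and destination fixed effects \( \alpha_i, \alpha_j \) are one standard device for absorbing the multilateral resistance terms discussed in the paper.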

Journal ArticleDOI
TL;DR: This paper proposes to create a new set of features based on analyzing the periodic behavior of the time of a transaction using the von Mises distribution, and examines how the different sets of features have an impact on the results.
Abstract: Credit card fraud detection evaluation measure. Each example is assumed to have a different financial cost. Transaction aggregation strategy for predicting fraud. Periodic features using the von Mises distribution. Code is open source and available at albahnsen.com/CostSensitiveClassification. Every year, billions of euros are lost worldwide due to credit card fraud, forcing financial institutions to continuously improve their fraud detection systems. In recent years, several studies have proposed the use of machine learning and data mining techniques to address this problem. However, most studies used some sort of misclassification measure to evaluate the different solutions, and do not take into account the actual financial costs associated with the fraud detection process. Moreover, when constructing a credit card fraud detection model, it is very important to extract the right features from the transactional data. This is usually done by aggregating the transactions in order to observe the spending behavioral patterns of the customers. In this paper we expand the transaction aggregation strategy and propose to create a new set of features based on analyzing the periodic behavior of the time of a transaction using the von Mises distribution. Then, using a real credit card fraud dataset provided by a large European card processing company, we compare state-of-the-art credit card fraud detection models and evaluate how the different sets of features impact the results. By including the proposed periodic features in the methods, the results show an average increase in savings of 13%.
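
Because the periodic-feature idea is the technical core here, a small sketch may help: a customer's transaction times are mapped onto the 24-hour circle, a von Mises distribution is fitted, and a new transaction is scored by the fitted density. The fitting uses a standard closed-form approximation for the concentration parameter from circular statistics; the overall pipeline is our illustration, not the authors' exact code.

```python
# Sketch: periodic "time of transaction" feature via the von Mises
# distribution. Hours are mapped to angles on the 24 h circle.
import numpy as np
from scipy.stats import vonmises

def to_angle(hours):
    return 2.0 * np.pi * np.asarray(hours, dtype=float) / 24.0

def fit_von_mises(hours):
    """Circular mean + Fisher's approximation for the concentration kappa."""
    theta = to_angle(hours)
    c, s = np.cos(theta).mean(), np.sin(theta).mean()
    mu = np.arctan2(s, c)   # mean direction (customer's typical time)
    R = np.hypot(c, s)      # mean resultant length in [0, 1]
    if R < 0.53:
        kappa = 2 * R + R**3 + 5 * R**5 / 6
    elif R < 0.85:
        kappa = -0.4 + 1.39 * R + 0.43 / (1 - R)
    else:
        kappa = 1 / (R**3 - 4 * R**2 + 3 * R)
    return mu, kappa

# Hypothetical history: this customer usually pays around 19:00-21:00.
history_hours = [19.5, 20.0, 20.5, 19.0, 21.0, 20.25, 18.75]
mu, kappa = fit_von_mises(history_hours)

# Density of a new transaction time as a periodic feature; a purchase at
# 03:00 gets a very low value, flagging it as unusual for this customer.
feature = vonmises.pdf(to_angle(3.0), kappa, loc=mu)
```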

Journal ArticleDOI
TL;DR: This work presents an integrative approach to resolve the taxonomic and functional attributes of gastrointestinal microbiota at the metagenomic, metatranscriptomic and metaproteomic levels and applies it to samples from four families with multiple cases of type 1 diabetes mellitus.
Abstract: Metagenomic, metatranscriptomic and metaproteomic approaches are used to resolve the taxonomic and functional characteristics of gastrointestinal microbiota from four families with multiple cases of type 1 diabetes mellitus.

Book
23 Aug 2016
TL;DR: Fractional Brownian motion (fBm), as mentioned in this paper, is a stochastic process which deviates significantly from Brownian motion, semimartingales, and other processes classically used in probability theory.
Abstract: Fractional Brownian motion (fBm) is a stochastic process which deviates significantly from Brownian motion, semimartingales, and other processes classically used in probability theory. As a centered Gaussian process, it is characterized by the stationarity of its increments and a medium- or long-memory property which is in sharp contrast with martingales and Markov processes. FBm has become a popular choice for applications where classical processes cannot model these non-trivial properties; for instance long memory, which is also known as persistence, is of fundamental importance for financial data and in internet traffic. The mathematical theory of fBm is currently being developed vigorously by a number of stochastic analysts, in various directions, using complementary and sometimes competing tools. This book is concerned with several aspects of fBm, including the stochastic integration with respect to it, the study of its supremum and its appearance as a limit of partial sums involving stationary sequences, to name but a few. The book is addressed to researchers and graduate students in probability and mathematical statistics. With very few exceptions (where precise references are given), every stated result is proved.
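
To make the "centered Gaussian process" characterization concrete: fractional Brownian motion with Hurst parameter \( H \in (0,1) \) is the centered Gaussian process \( (B_H(t))_{t \ge 0} \) with covariance

\[ \mathbb{E}[B_H(t) B_H(s)] = \tfrac{1}{2}\left( t^{2H} + s^{2H} - |t - s|^{2H} \right). \]

The case \( H = 1/2 \) recovers standard Brownian motion, while \( H > 1/2 \) yields positively correlated increments and the long-memory (persistence) regime emphasized above.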

Journal ArticleDOI
11 Mar 2016-Science
TL;DR: It is demonstrated that a qualitatively correct description of the vdW interactions between polarizable nanostructures over a wide range of finite distances can only be attained by accounting for the wavelike nature of charge density fluctuations.
Abstract: Recent experiments on noncovalent interactions at the nanoscale have challenged the basic assumptions of commonly used particle- or fragment-based models for describing van der Waals (vdW) or dispersion forces. We demonstrate that a qualitatively correct description of the vdW interactions between polarizable nanostructures over a wide range of finite distances can only be attained by accounting for the wavelike nature of charge density fluctuations. By considering a diverse set of materials and biological systems with markedly different dimensionalities, topologies, and polarizabilities, we find a visible enhancement in the nonlocality of the charge density response in the range of 10 to 20 nanometers. These collective wavelike fluctuations are responsible for the emergence of nontrivial modifications of the power laws that govern noncovalent interactions at the nanoscale.
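
For context, the conventional pairwise-additive baseline against which these results are set is the London dispersion interaction between two small polarizable fragments,

\[ E_{\mathrm{vdW}}(R) \simeq -\frac{C_6}{R^6}, \]

with \( C_6 \) the dispersion coefficient; the paper's central finding is that collective, wavelike charge-density fluctuations make the effective exponents of such power laws deviate from the pairwise prediction at nanoscale separations.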

Journal ArticleDOI
TL;DR: This review examines whether psychological preparation has an impact on the outcomes of postoperative pain, behavioural recovery, length of stay and negative affect in adults undergoing elective surgery under general anaesthetic.
Abstract: Background In a review and meta-analysis conducted in 1993, psychological preparation was found to be beneficial for a range of outcome variables including pain, behavioural recovery, length of stay and negative affect. Since this review, more detailed bibliographic searching has become possible, additional studies testing psychological preparation for surgery have been completed and hospital procedures have changed. The present review examines whether psychological preparation (procedural information, sensory information, cognitive intervention, relaxation, hypnosis and emotion-focused intervention) has an impact on the outcomes of postoperative pain, behavioural recovery, length of stay and negative affect. Objectives To review the effects of psychological preparation on postoperative outcomes in adults undergoing elective surgery under general anaesthetic. Search methods We searched the Cochrane Register of Controlled Trials (CENTRAL 2014, Issue 5), MEDLINE (OVID SP) (1950 to May 2014), EMBASE (OVID SP) (1982 to May 2014), PsycINFO (OVID SP) (1982 to May 2014), CINAHL (EBSCOhost) (1980 to May 2014), Dissertation Abstracts (to May 2014) and Web of Science (1946 to May 2014). We searched reference lists of relevant studies and contacted authors to identify unpublished studies. We reran the searches in July 2015 and placed the 38 studies of interest in the ‘awaiting classification’ section of this review. Selection criteria We included randomized controlled trials of adult participants (aged 16 or older) undergoing elective surgery under general anaesthesia. We excluded studies focusing on patient groups with clinically diagnosed psychological morbidity. We did not limit the search by language or publication status. We included studies testing a preoperative psychological intervention that included at least one of these seven techniques: procedural information; sensory information; behavioural instruction; cognitive intervention; relaxation techniques; hypnosis; emotion-focused intervention. We included studies that examined any one of our postoperative outcome measures (pain, behavioural recovery, length of stay, negative affect) within one month post-surgery. Data collection and analysis One author checked titles and abstracts to exclude obviously irrelevant studies. We obtained full reports of apparently relevant studies; two authors fully screened these. Two authors independently extracted data and resolved discrepancies by discussion. Where possible we used random-effects meta-analyses to combine the results from individual studies. For length of stay we pooled mean differences. For pain and negative affect we used a standardized effect size (the standardized mean difference (SMD), or Hedges' g) to combine data from different outcome measures. If data were not available in a form suitable for meta-analysis we performed a narrative review. Main results Searches identified 5116 unique papers; we retrieved 827 for full screening. In this review, we included 105 studies from 115 papers, in which 10,302 participants were randomized. Mainly as a result of updating the search in July 2015, 38 papers are awaiting classification. Sixty-one of the 105 studies measured the outcome pain, 14 behavioural recovery, 58 length of stay and 49 negative affect. Participants underwent a wide range of surgical procedures, and a range of psychological components were used in interventions, frequently in combination.
In the 105 studies, appropriate data were provided for the meta-analysis of 38 studies measuring the outcome postoperative pain (2713 participants), 36 for length of stay (3313 participants) and 31 for negative affect (2496 participants). We narratively reviewed the remaining studies (including the 14 studies with 1441 participants addressing behavioural recovery). When pooling the results for all types of intervention there was low quality evidence that psychological preparation techniques were associated with lower postoperative pain (SMD -0.20, 95% confidence interval (CI) -0.35 to -0.06), length of stay (mean difference -0.52 days, 95% CI -0.82 to -0.22) and negative affect (SMD -0.35, 95% CI -0.54 to -0.16) compared with controls. Results tended to be similar for all categories of intervention, although there was no evidence that behavioural instruction reduced the outcome pain. However, caution must be exercised when interpreting the results because of heterogeneity in the types of surgery, interventions and outcomes. Narratively reviewed evidence for the outcome behavioural recovery provided very low quality evidence that psychological preparation, in particular behavioural instruction, may have potential to improve behavioural recovery outcomes, but no clear conclusions could be reached. Generally, the evidence suffered from poor reporting, meaning that few studies could be classified as having low risk of bias. Overall, we rated the quality of evidence for each outcome as ‘low’ because of the high level of heterogeneity in meta-analysed studies and the unclear risk of bias. In addition, for the outcome behavioural recovery, too few studies used robust measures and reported suitable data for meta-analysis, so we rated the quality of evidence as ‘very low’. Authors’ conclusions The evidence suggested that psychological preparation may be beneficial for the outcomes postoperative pain, behavioural recovery, negative affect and length of stay, and is unlikely to be harmful. However, at present, the strength of evidence is insufficient to reach firm conclusions on the role of psychological preparation for surgery. Further analyses are needed to explore the heterogeneity in the data, to identify more specifically when intervention techniques are of benefit. As the current evidence quality is low or very low, there is a need for well-conducted and clearly reported research.
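
For reference, the standardized effect size used in the meta-analyses above is, in its usual form,

\[ \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}}, \qquad g = J \cdot \mathrm{SMD}, \quad J \approx 1 - \frac{3}{4(n_1 + n_2 - 2) - 1}, \]

where \( s_{\mathrm{pooled}} \) is the pooled standard deviation of the two groups and \( J \) is the small-sample correction that turns the raw SMD into Hedges' g.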

Journal ArticleDOI
TL;DR: Clinicians, researchers, and citizens need improved methods, tools, and training to generate, analyze, and query data effectively and contribute to creating the European Single Market for health, which will improve health and healthcare for all Europeans.
Abstract: Medicine and healthcare are undergoing profound changes. Whole-genome sequencing and high-resolution imaging technologies are key drivers of this rapid and crucial transformation. Technological innovation combined with automation and miniaturization has triggered an explosion in data production that will soon reach exabyte proportions. How are we going to deal with this exponential increase in data production? The potential of “big data” for improving health is enormous but, at the same time, we face a wide range of challenges to overcome urgently. Europe is very proud of its cultural diversity; however, exploitation of the data made available through advances in genomic medicine, imaging, and a wide range of mobile health applications or connected devices is hampered by numerous historical, technical, legal, and political barriers. European health systems and databases are diverse and fragmented. There is a lack of harmonization of data formats, processing, analysis, and data transfer, which leads to incompatibilities and lost opportunities. Legal frameworks for data sharing are evolving. Clinicians, researchers, and citizens need improved methods, tools, and training to generate, analyze, and query data effectively. Addressing these barriers will contribute to creating the European Single Market for health, which will improve health and healthcare for all Europeans.

Journal ArticleDOI
TL;DR: Various implementation schemes of parallel reaction monitoring (PRM) are described and their benefits and limitations in terms of quantification performance and confidence in analyte identification are discussed.
Abstract: Targeted mass spectrometry-based approaches are nowadays widely used for quantitative proteomics studies and more recently have been implemented on high resolution/accurate mass (HRAM) instruments resulting in a considerable performance improvement. More specifically, the parallel reaction monitoring technique (PRM) performed on quadrupole-Orbitrap mass spectrometers, leveraging the high resolution and trapping capabilities of the instrument, offers a clear advantage over the conventional selected reaction monitoring (SRM) measurements executed on triple quadrupole instruments. Analyses performed in HRAM mode allow for an improved discrimination between signals derived from analytes and those resulting from matrix interferences, translating into the reliable quantification of low abundance components. The present study defines various implementation schemes of PRM, namely: (i) exploratory experiments assessing the detectability of very large sets of peptides (100-1000), (ii) wide-screen analyses using (crude) internal standards to obtain statistically meaningful (relative) quantitative analyses, and (iii) precise/accurate quantification of a limited number of analytes using calibrated internal standards. Each of the three implementation schemes requires specific acquisition methods with defined parameters to appropriately control the acquisition during the actual peptide elution. This tutorial describes the different PRM approaches and discusses their benefits and limitations in terms of quantification performance and confidence in analyte identification.

Journal ArticleDOI
TL;DR: This survey paper provides a detailed review of the state of the art related to the application of CS in CR communications and provides a classification of the main usage areas based on the radio parameter to be acquired by a wideband CR.
Abstract: Compressive sensing (CS) has received much attention in several fields such as digital image processing, wireless channel estimation, radar imaging, and cognitive radio (CR) communications. Out of these areas, this survey paper focuses on the application of CS in CR communications. Due to the under-utilization of the allocated radio spectrum, spectrum occupancy is usually sparse in different domains such as time, frequency, and space. Such a sparse nature of the spectrum occupancy has inspired the application of CS in CR communications. In this regard, several researchers have already applied the CS theory in various settings considering the sparsity in different domains. In this direction, this survey paper provides a detailed review of the state of the art related to the application of CS in CR communications. Starting with the basic principles and the main features of CS, it provides a classification of the main usage areas based on the radio parameter to be acquired by a wideband CR. Subsequently, we review the existing CS-related works applied to different categories such as wideband sensing, signal parameter estimation and radio environment map (REM) construction, highlighting the main benefits and the related issues. Furthermore, we present a generalized framework for constructing the REM in compressive settings. Finally, we conclude this survey paper with some suggested open research challenges and future directions.
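
As a reading aid, the basic compressive sensing model underlying the surveyed works acquires a length-\( N \) sparse signal \( \mathbf{x} \) through \( M \ll N \) linear measurements and recovers it by convex relaxation:

\[ \mathbf{y} = \boldsymbol{\Phi}\mathbf{x} + \mathbf{n}, \qquad \hat{\mathbf{x}} = \arg\min_{\mathbf{x}} \|\mathbf{x}\|_1 \ \ \text{subject to} \ \ \|\mathbf{y} - \boldsymbol{\Phi}\mathbf{x}\|_2 \le \epsilon, \]

which is what makes the sparse spectrum occupancy described above exploitable by a wideband cognitive radio.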

Proceedings ArticleDOI
21 Mar 2016
TL;DR: This paper presents a new hash function, Argon2, oriented at the protection of low-entropy secrets without secret keys; it can provide ASIC- and botnet-resistance by filling memory at 0.6 cycles per byte in a non-compressible way.
Abstract: We present a new hash function Argon2, which is oriented at the protection of low-entropy secrets without secret keys. It requires a certain (but tunable) amount of memory, imposes prohibitive time-memory and computation-memory tradeoffs on memory-saving users, and is exceptionally fast on regular PCs. Overall, it can provide ASIC- and botnet-resistance by filling the memory at 0.6 cycles per byte in a non-compressible way.
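
For readers who want to try Argon2, here is a minimal usage sketch with the third-party argon2-cffi Python binding (pip install argon2-cffi); parameter names follow that library, and the cost values are illustrative rather than the paper's recommendations.

```python
# Sketch: password hashing and verification with argon2-cffi.
from argon2 import PasswordHasher

ph = PasswordHasher(
    time_cost=3,        # passes over memory
    memory_cost=65536,  # memory in KiB (64 MiB) -- the memory-hardness knob
    parallelism=4,      # lanes/threads
)

digest = ph.hash("correct horse battery staple")
ph.verify(digest, "correct horse battery staple")  # raises on mismatch
```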

Journal ArticleDOI
TL;DR: This paper investigates how the communication framework underlying construction management systems can be further improved so as to fully or partially automate various communication functions across the construction project lifecycle, and how the Internet of Things (IoT) and related standards can contribute to such an improvement.

Journal ArticleDOI
TL;DR: This article provides a vision on advanced precoding techniques and user clustering methods for multibeam broadband fixed satellite communications, and discusses practical challenges to deploying precoding schemes as well as the support introduced in the recently published DVB-S2X standard.
Abstract: Whenever multibeam satellite systems target very aggressive frequency reuse in their coverage area, inter-beam interference becomes the major obstacle for increasing the overall system throughput. As a matter of fact, users located at the beam edges suffer from very large interference for even a moderately aggressive planning of reuse-2. Although solutions for inter-beam interference management have been investigated at the satellite terminal, it turns out that the performance improvement does not justify the increased terminal complexity and cost. In this article, we pay attention to interference mitigation techniques that take place at the transmitter (i.e., the gateway). Based on this understanding, we provide our vision on advanced precoding techniques and user clustering methods for multibeam broadband fixed satellite communications. We also discuss practical challenges to deploy precoding schemes and the support introduced in the recently published DVB-S2X standard. Future challenges for novel configurations employing precoding are also provided.
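
As a reading aid, the standard linear precoding model behind this discussion (generic notation, not the article's own) writes the signal received by user \( k \) as

\[ y_k = \mathbf{h}_k^{H}\mathbf{w}_k s_k + \sum_{j \neq k} \mathbf{h}_k^{H}\mathbf{w}_j s_j + n_k, \]

where \( \mathbf{h}_k \) is the channel from the gateway's beams to user \( k \), \( s_k \) the data symbol, and \( n_k \) noise; the precoding vectors \( \mathbf{w}_j \) are designed at the gateway so that the inter-beam interference sum is suppressed.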

Journal ArticleDOI
TL;DR: Stable isotope-assisted metabolomics techniques are applied and it is demonstrated that pyruvate oxidation is maintained in mature pro-inflammatory macrophages and that the PDH flux is an important node for M(LPS) macrophage activation.

Journal ArticleDOI
TL;DR: In this article, it is shown from both theoretical and experimental standpoints that disorder of Cu and Zn atoms is in all probability the primary cause of band gap fluctuations in CZTS.
Abstract: Cu2ZnSn(S,Se)4 (CZTS(e)) solar cells suffer from low open-circuit voltages that have been blamed on the existence of band gap fluctuations, with different possible origins. In this paper, we show from both theoretical and experimental standpoints that disorder of Cu and Zn atoms is in all probability the primary cause of these fluctuations. First, quantification of Cu–Zn disorder in CZTS thin films is presented. The results indicate that disorder is prevalent in the majority of practical samples used for solar cells. Then, ab initio calculations for different arrangements and densities of disorder-induced [CuZn + ZnCu] defect pairs are presented, and it is shown that spatial variations in band gap of the order of 200 meV can easily be caused by Cu–Zn disorder, which would cause large voltage losses in solar cells. Experiments using Raman spectroscopy and room temperature photoluminescence combined with in situ heat-treatments show that a shift in the energy of the dominant band-to-band recombination pathway correlates perfectly to the order-disorder transition, which clearly implicates Cu–Zn disorder as the cause of band gap fluctuations in CZTS. Our results suggest that elimination or passivation of Cu–Zn disorder could be very important for future improvements in the efficiency of CZTS(e)-based solar cells.

Journal ArticleDOI
TL;DR: A definition of interoception based on subjective experience is proposed, and a plea is made for the use of specific vocabulary in addressing the many aspects that contribute to it.
Abstract: Over the course of a century, the meaning of interoception has changed from the restrictive to the inclusive. In its inclusive sense, it bears relevance to every individual via its link to emotion, decision making, time perception, health, pain, and various other areas of life. While the label for the perception of the body state changes over time, the need for an overarching concept remains. Many aspects can make any particular interoceptive sensation unique and distinct from any other interoceptive sensation. These can range from the sense of agency, to the physical cause of a sensation, the ontogenetic origin, the efferent innervation, and the afferent pathways of the tissue involved, amongst others. In its overarching meaning, interoception is primarily a product of the central nervous system, a construct based on an integration of various sources, not per se including afferent information. This paper proposes a definition of interoception as based on subjective experience, and pleads for the use of specific vocabulary in addressing the many aspects that contribute to it.