Showing papers published by the University of Luxembourg in 2012


Book ChapterDOI
01 Jan 2012
TL;DR: In this article, the authors argue that success lies in being able to communicate, share, and use information to solve complex problems, in adapting and innovating in response to new demands and changing circumstances, in marshaling and expanding the power of technology to create new knowledge, and in expanding human capacity and productivity.
Abstract: As the previous chapter indicates, there has been a significant shift in advanced economies from manufacturing to information and knowledge services. Knowledge itself is growing ever more specialized and expanding exponentially. Information and communication technology is transforming the nature of how work is conducted and the meaning of social relationships. Decentralized decision making, information sharing, teamwork, and innovation are key in today’s enterprises. No longer can students look forward to middle class success in the conduct of manual labor or use of routine skills – work that can be accomplished by machines. Rather, whether a technician or a professional person, success lies in being able to communicate, share, and use information to solve complex problems, in being able to adapt and innovate in response to new demands and changing circumstances, in being able to marshal and expand the power of technology to create new knowledge, and in expanding human capacity and productivity.

1,056 citations


01 Jan 2012
TL;DR: In this paper, the phonon dispersion relations of the single-layer and bulk dichalcogenides MoS2 and WS2 were investigated and the behavior of the Raman-active modes A1g and E¹₂g as a function of the number of layers was explored.
Abstract: We report ab initio calculations of the phonon dispersion relations of the single-layer and bulk dichalcogenides MoS2 and WS2. We explore in detail the behavior of the Raman-active modes A1g and E¹₂g as a function of the number of layers. In agreement with recent Raman spectroscopy measurements [C. Lee et al., ACS Nano 4, 2695 (2010)], we find that the A1g mode increases in frequency with an increasing number of layers while the E¹₂g mode decreases. We explain this decrease by an enhancement of the dielectric screening of the long-range Coulomb interaction between the effective charges with a growing number of layers. This decrease in the long-range part overcompensates for the increase of the short-range interaction due to the weak interlayer interaction.

829 citations


Book
01 May 2012
TL;DR: In this article, the authors provide an ideal introduction both to Stein's method and Malliavin calculus, from the standpoint of normal approximations on a Gaussian space, and explain the connections between Stein's method and the Malliavin calculus of variations.
Abstract: Stein's method is a collection of probabilistic techniques that allow one to assess the distance between two probability distributions by means of differential operators. In 2007, the authors discovered that one can combine Stein's method with the powerful Malliavin calculus of variations, in order to deduce quantitative central limit theorems involving functionals of general Gaussian fields. This book provides an ideal introduction both to Stein's method and Malliavin calculus, from the standpoint of normal approximations on a Gaussian space. Many recent developments and applications are studied in detail, for instance: fourth moment theorems on the Wiener chaos, density estimates, Breuer–Major theorems for fractional processes, recursive cumulant computations, optimal rates and universality results for homogeneous sums. Largely self-contained, the book is perfect for self-study. It will appeal to researchers and graduate students in probability and statistics, especially those who wish to understand the connections between Stein's method and Malliavin calculus.
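One hallmark of the Stein–Malliavin combination mentioned above is the quantitative fourth moment theorem on Wiener chaos. The LaTeX sketch below records a commonly cited form of the bound; the precise constant varies slightly between references, so this is offered for orientation rather than quoted from the book:

```latex
% Quantitative fourth moment theorem (one common formulation): if F lives in
% the q-th Wiener chaos of a Gaussian field and E[F^2] = 1, the total-variation
% distance to a standard normal is controlled by the fourth cumulant alone.
\[
d_{\mathrm{TV}}\bigl(F, N(0,1)\bigr)
  \;\le\; 2\sqrt{\frac{q-1}{3q}}\,\sqrt{\mathbb{E}\!\left[F^{4}\right] - 3}.
\]
```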

712 citations


Journal ArticleDOI
TL;DR: A simulation environment for energy-aware cloud computing data centers is presented and the effectiveness of the simulator in utilizing power management schemes, such as voltage scaling, frequency scaling, and dynamic shutdown applied to the computing and networking components, is demonstrated.
Abstract: Cloud computing data centers are becoming increasingly popular for the provisioning of computing resources. The cost and operating expenses of data centers have skyrocketed with the increase in computing capacity. Several governmental, industrial, and academic surveys indicate that the energy utilized by computing and communication units within a data center contributes to a considerable slice of the data center operational costs. In this paper, we present a simulation environment for energy-aware cloud computing data centers. Along with the workload distribution, the simulator is designed to capture details of the energy consumed by data center components (servers, switches, and links) as well as packet-level communication patterns in realistic setups. The simulation results obtained for two-tier, three-tier, and three-tier high-speed data center architectures demonstrate the effectiveness of the simulator in utilizing power management schemes, such as voltage scaling, frequency scaling, and dynamic shutdown that are applied to the computing and networking components.
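As a rough illustration of the kind of power-management accounting such a simulator performs, the Python sketch below models per-server power under frequency/voltage scaling and dynamic shutdown. The cubic frequency dependence and all numeric constants are illustrative assumptions, not the paper's model:

```python
# Illustrative server energy model with DVFS and dynamic shutdown.
# Assumes dynamic power scales roughly as f * V^2 with V ~ f, i.e. ~ f^3;
# all constants are made-up placeholders, not values from the paper.

def server_power(load, f_scale, p_idle=100.0, p_peak=250.0, shutdown=False):
    """Power draw in watts for one server at a given utilisation.

    load     -- CPU utilisation in [0, 1]
    f_scale  -- normalised DVFS frequency in (0, 1]
    shutdown -- True if the server is dynamically powered off
    """
    if shutdown:
        return 0.0
    dynamic = (p_peak - p_idle) * load * f_scale ** 3  # DVFS: ~ f * V^2, V ~ f
    return p_idle + dynamic

def datacenter_energy(loads, f_scale, hours):
    """Total energy (kWh): idle servers are shut down, the rest run scaled."""
    watts = sum(server_power(l, f_scale, shutdown=(l == 0.0)) for l in loads)
    return watts * hours / 1000.0

if __name__ == "__main__":
    loads = [0.0, 0.2, 0.8, 1.0]          # per-server utilisation
    print(datacenter_energy(loads, f_scale=0.8, hours=24))
```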

599 citations


Journal ArticleDOI
TL;DR: In this paper, the structural and electronic data relevant for the solar cells were summarised and the authors concluded that the equilibrium structure of both Cu2ZnSnS4 and Cu2ZnSnSe4 is the kesterite structure.
Abstract: Kesterite materials (Cu2ZnSn(S,Se)4) are made from non-toxic, earth-abundant and low-cost raw materials. We summarise here the structural and electronic material data relevant for the solar cells. The equilibrium structure of both Cu2ZnSnS4 and Cu2ZnSnSe4 is the kesterite structure. However, the stannite structure has only a slightly lower binding energy. Because the band gap of the stannite is predicted to be about 100 meV lower than the kesterite band gap, any admixture of stannite will hurt the solar cells. The band gaps of Cu2ZnSnS4 and Cu2ZnSnSe4 are 1.5 and 1.0 eV, respectively. Hardly any experiments on defects are available. Theoretically, the CuZn antisite acceptor is predicted as the most probable defect. The existence region of the kesterite phase is smaller compared with that of chalcopyrites. This makes secondary phases a serious challenge in the development of solar cells. Copyright © 2012 John Wiley & Sons, Ltd.

549 citations


Journal ArticleDOI
TL;DR: What can be considered dysfunctional use of the mobile phone is described and its multifactorial nature is emphasized, and a pathways model that integrates the existing literature is proposed.
Abstract: Despite its unambiguous advantages, cellular phone use has been associated with harmful or potentially disturbing behaviors. Problematic use of the mobile phone is considered as an inability to regulate one's use of the mobile phone, which eventually involves negative consequences in daily life (e.g., financial problems). The current article describes what can be considered dysfunctional use of the mobile phone and emphasizes its multifactorial nature. Validated assessment instruments to measure problematic use of the mobile phone are described. The available literature on risk factors for dysfunctional mobile phone use is then reviewed, and a pathways model that integrates the existing literature is proposed. Finally, the assumption is made that dysfunctional use of the mobile phone is part of a spectrum of cyber addictions that encompasses a variety of dysfunctional behaviors and implies involvement in specific online activities (e.g., video games, gambling, social networks, sex-related websites).

492 citations


Journal ArticleDOI
TL;DR: This tutorial considers alternative confirmatory factor analytic models in depth, addressing their psychometric properties, interpretation of general and specific constructs, and implications for model-based score reliabilities.
Abstract: Many psychological constructs are conceived to be hierarchically structured and thus to operate at various levels of generality. Alternative confirmatory factor analytic (CFA) models can be used to study various aspects of this proposition: (a) The one-factor model focuses on the top of the hierarchy and contains only a general construct, (b) the first-order factor model focuses on the intermediate level of the hierarchy and contains only specific constructs, and both (c) the higher order factor model and (d) the nested-factor model consider the hierarchy in its entirety and contain both general and specific constructs (e.g., bifactor model). This tutorial considers these CFA models in depth, addressing their psychometric properties, interpretation of general and specific constructs, and implications for model-based score reliabilities. The authors illustrate their arguments with normative data obtained for the Wechsler Adult Intelligence Scale and conclude with recommendations on which CFA model is most appropriate for which research and diagnostic purposes.
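As a concrete pointer to what "model-based score reliabilities" means here, the LaTeX sketch below gives the nested-factor (bifactor) measurement equation and the omega-hierarchical coefficient commonly computed from it. This is the standard textbook formula, stated by us for orientation rather than quoted from the article:

```latex
% Nested-factor (bifactor) model: item x_i loads on the general factor G and on
% one specific factor S_k; theta_i denotes its unique variance. Omega-hierarchical
% is the share of total-score variance attributable to G alone.
\[
x_i = \lambda_{Gi}\,G + \lambda_{S_k i}\,S_k + \varepsilon_i,
\qquad
\omega_h =
\frac{\left(\sum_i \lambda_{Gi}\right)^{2}}
     {\left(\sum_i \lambda_{Gi}\right)^{2}
      + \sum_k \left(\sum_{i \in k} \lambda_{S_k i}\right)^{2}
      + \sum_i \theta_i}.
\]
```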

361 citations


Journal ArticleDOI
TL;DR: An integrative analysis approach and web-application that combines a novel graph-based statistic with an interactive sub-network visualization to accomplish two complementary goals: improving the prioritization of putative functional gene/protein set associations by exploiting information from molecular interaction networks and tissue-specific gene expression data and enabling a direct biological interpretation of the results.
Abstract: Motivation: Assessing functional associations between an experimentally derived gene or protein set of interest and a database of known gene/protein sets is a common task in the analysis of large-scale functional genomics data. For this purpose, a frequently used approach is to apply an over-representation-based enrichment analysis. However, this approach has four drawbacks: (i) it can only score functional associations of overlapping gene/protein sets; (ii) it disregards genes with missing annotations; (iii) it does not take into account the network structure of physical interactions between the gene/protein sets of interest; and (iv) tissue-specific gene/protein set associations cannot be recognized. Results: To address these limitations, we introduce an integrative analysis approach and web-application called EnrichNet. It combines a novel graph-based statistic with an interactive sub-network visualization to accomplish two complementary goals: improving the prioritization of putative functional gene/protein set associations by exploiting information from molecular interaction networks and tissue-specific gene expression data and enabling a direct biological interpretation of the results. By using the approach to analyse sets of genes with known involvement in human diseases, new pathway associations are identified, reflecting a dense sub-network of interactions between their corresponding proteins. Availability: EnrichNet is freely available at http://www.enrichnet.org. Contact: Natalio.Krasnogor@nottingham.ac.uk, reinhard.schneider@uni.lu or avalencia@cnio.es. Supplementary Information: Supplementary data are available at Bioinformatics online.
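For context, the over-representation-based enrichment analysis whose limitations are listed in the abstract is usually a hypergeometric test on set overlap. The Python sketch below shows that baseline (the gene names and background size are hypothetical; this is not EnrichNet's graph-based statistic):

```python
# Baseline over-representation analysis (ORA) via the hypergeometric test:
# the approach whose limitations (overlap-only scoring, ignored network
# structure) EnrichNet is designed to overcome.
from scipy.stats import hypergeom

def ora_pvalue(study_set, annotated_set, background_size):
    """P(overlap >= observed) between a study gene set and an annotated set."""
    overlap = len(study_set & annotated_set)
    # sf(k-1) gives P(X >= k) for the hypergeometric distribution
    return hypergeom.sf(overlap - 1, background_size,
                        len(annotated_set), len(study_set))

# Toy example with hypothetical gene identifiers and background size.
study = {"TP53", "BRCA1", "ATM", "CHEK2"}
pathway = {"TP53", "ATM", "MDM2", "CDKN1A", "CHEK2"}
print(ora_pvalue(study, pathway, background_size=20000))
```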

340 citations


Journal ArticleDOI
TL;DR: A general formulation of stochastic thermodynamics is presented for open systems exchanging energy and particles with multiple reservoirs by introducing a partition in terms of "mesostates" (e.g., sets of "microstates"), the consequence on the thermodynamic description of the system is studied in detail.
Abstract: A general formulation of stochastic thermodynamics is presented for open systems exchanging energy and particles with multiple reservoirs. By introducing a partition in terms of "mesostates" (e.g., sets of "microstates"), the consequence on the thermodynamic description of the system is studied in detail. When microstates within mesostates rapidly thermalize, the entire structure of the microscopic theory is recovered at the mesostate level. This is not the case when these microstates remain out of equilibrium, leading to additional contributions to the entropy balance. Some of our results are illustrated for a model of two coupled quantum dots.
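The coarse-graining step can be made concrete with the chain rule for the Shannon-type system entropy. The identity below is generic bookkeeping in our own notation, not a result specific to the paper; it shows how the entropy splits into a mesostate-level part plus the average entropy of the microstates inside each mesostate:

```latex
% Mesostate probability P_M and conditional microstate probability p_{m|M}
% (units with k_B = 1). The system entropy decomposes exactly into a
% mesostate-level term plus the average intra-mesostate entropy.
\[
P_M = \sum_{m \in M} p_m, \qquad p_{m\mid M} = \frac{p_m}{P_M},
\]
\[
S = -\sum_{m} p_m \ln p_m
  = -\sum_{M} P_M \ln P_M \;-\; \sum_{M} P_M \sum_{m \in M} p_{m\mid M}\ln p_{m\mid M}.
\]
```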

335 citations


Journal ArticleDOI
TL;DR: A generic optimization framework for linear precoding design is provided to handle any objective function of data rate with general linear and nonlinear power constraints; an iterative algorithm that alternately optimizes the precoding vectors and the power allocation is proposed and, most importantly, is proved to always converge.
Abstract: Multibeam satellite systems have been employed to provide interactive broadband services to geographical areas under-served by terrestrial infrastructure. In this context, this paper studies joint multiuser linear precoding design in the forward link of fixed multibeam satellite systems. We provide a generic optimization framework for linear precoding design to handle any objective functions of data rate with general linear and nonlinear power constraints. To achieve this, an iterative algorithm which alternately optimizes the precoding vectors and the power allocation is proposed and, most importantly, the proposed algorithm is proved to always converge. The proposed optimization algorithm is also applicable to nonlinear dirty paper coding. As a special case, a more efficient algorithm is devised to find the optimal solution to the problem of maximizing the proportional fairness among served users. In addition, the aforementioned problems and algorithms are extended to the case that each terminal has multiple co-polarization or dual-polarization antennas. Simulation results demonstrate substantial performance improvement of the proposed schemes over conventional multibeam satellite systems, zero-forcing and regularized zero-forcing precoding schemes in terms of meeting the traffic demand; e.g., using real beam patterns, more than twice the throughput of the conventional scheme can be achieved. The performance of the proposed linear precoding scheme is also shown to be very close to that of dirty paper coding.
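In standard multiuser-MISO notation, the generic design problem described above can be written as follows (our own compact statement for orientation, not an excerpt from the paper):

```latex
% Generic linear precoding design: choose precoding vectors w_k to maximise a
% utility f of the per-user rates r_k under J generic power constraints g_j;
% h_k is user k's channel vector and sigma^2 the noise power.
\[
\max_{\{\mathbf{w}_k\}} \; f\bigl(r_1,\dots,r_K\bigr)
\quad \text{s.t.} \quad g_j(\mathbf{w}_1,\dots,\mathbf{w}_K) \le P_j,
\quad j = 1,\dots,J,
\]
\[
r_k = \log_2\!\left(1 + \frac{\lvert\mathbf{h}_k^{H}\mathbf{w}_k\rvert^{2}}
{\sum_{i \ne k}\lvert\mathbf{h}_k^{H}\mathbf{w}_i\rvert^{2} + \sigma^{2}}\right).
\]
```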

296 citations


Journal ArticleDOI
TL;DR: The purpose of this review is to discuss the role of AGEs in cardiovascular disease and in particular in heart failure, focussing on both cellular mechanisms of action as well as highlighting how targeting AGEs may represent a novel therapeutic strategy in the treatment of HF.
Abstract: Advanced glycation end products (AGEs) are produced through the non enzymatic glycation and oxidation of proteins, lipids and nucleic acids. Enhanced formation of AGEs occurs particularly in conditions associated with hyperglycaemia such as diabetes mellitus (DM). AGEs are believed to have a key role in the development and progression of cardiovascular disease in patients with DM through the modification of the structure, function and mechanical properties of tissues through crosslinking intracellular as well as extracellular matrix proteins and through modulating cellular processes through binding to cell surface receptors [receptor for AGEs (RAGE)]. A number of studies have shown a correlation between serum AGE levels and the development and severity of heart failure (HF). Moreover, some studies have suggested that therapies targeted against AGEs may have therapeutic potential in patients with HF. The purpose of this review is to discuss the role of AGEs in cardiovascular disease and in particular in heart failure, focussing on both cellular mechanisms of action as well as highlighting how targeting AGEs may represent a novel therapeutic strategy in the treatment of HF.

Journal ArticleDOI
TL;DR: The results demonstrate, first, that the bilingual advantage is neither confounded with nor limited by socioeconomic and cultural factors and, second, that separable aspects of executive functioning are differentially affected by bilingualism.
Abstract: This study explores whether the cognitive advantage associated with bilingualism in executive functioning extends to young immigrant children challenged by poverty and, if it does, which specific processes are most affected. In the study reported here, 40 Portuguese-Luxembourgish bilingual children from low-income immigrant families in Luxembourg and 40 matched monolingual children from Portugal completed visuospatial tests of working memory, abstract reasoning, selective attention, and interference suppression. Two broad cognitive factors of executive functioning, representation (abstract reasoning and working memory) and control (selective attention and interference suppression), emerged from principal component analysis. Whereas there were no group differences in representation, the bilinguals performed significantly better than did the monolinguals in control. These results demonstrate, first, that the bilingual advantage is neither confounded with nor limited by socioeconomic and cultural factors and, second, that separable aspects of executive functioning are differentially affected by bilingualism. The bilingual advantage lies in control but not in visuospatial representational processes.

Book ChapterDOI
15 Mar 2012
TL;DR: This chapter summarises current work on intercultural citizenship and competences for democratic citizenship, and offers thoughts about future directions in which the internationalist educational purposes of FLE might be further realised.
Abstract: Traditionally, the notion of citizenship has been linked to the concept of nation, the ‘imagined community’ dependent on a common language. State and nation are assumed to be one: the nation state. The contemporary world has created many other identifications and opportunities for political activities in communities which are transnational. This generates the question of language of communication. The ‘nation state’ has been successful through its education system in imposing a common language within its boundaries. Members of transnational groups therefore bring their own language to interaction, and need to be able to communicate despite the differences. Citizenship across linguistic boundaries thus requires competences of communication as well as those of critical understanding of the conditions under which political activity can take place. Citizenship education prepares young people for political activity up to the level of the state. Foreign language education (FLE) prepares them for interaction with people with another language. The combination of the aims and purposes of citizenship education with those of foreign language education prepares learners for ‘intercultural citizenship’. This chapter summarises current work on intercultural citizenship and competences for democratic citizenship, and thoughts about future directions in which the internationalist educational purposes of FLE might be further realised.

Journal ArticleDOI
TL;DR: In this paper, micro-resolved Raman investigations revealed local variations in the spectra that are attributed to a secondary phase (possibly Cu2Sn3S7), which exemplifies the abilities of micro-resolved Raman measurements in the detection of secondary phases.
Abstract: Secondary phases like Cu2SnS3 are major obstacles for kesterite thin film solar cell applications. We prepare Cu2SnS3 using identical annealing conditions as used for the kesterite films. By x-ray diffraction, the crystal structure of Cu2SnS3 was identified as monoclinic. Polarization-dependent Raman investigations allowed the identification of the dominant peaks at 290 cm−1 and 352 cm−1 with the main A′ symmetry vibrational modes from the monoclinic Cu2SnS3 phase. Furthermore, micro-resolved Raman investigations revealed local variations in the spectra that are attributed to a secondary phase (possibly Cu2Sn3S7). This exemplifies the abilities of micro-resolved Raman measurements in the detection of secondary phases.

Book ChapterDOI
15 Apr 2012
TL;DR: A compression technique is described that reduces the public key size of van Dijk, Gentry, Halevi and Vaikuntanathan's (DGHV) fully homomorphic scheme over the integers from O(λ^7) to O(λ^5); the compressed variant remains semantically secure, but in the random oracle model.
Abstract: We describe a compression technique that reduces the public key size of van Dijk, Gentry, Halevi and Vaikuntanathan's (DGHV) fully homomorphic scheme over the integers from O(λ^7) to O(λ^5). Our variant remains semantically secure, but in the random oracle model. We obtain an implementation of the full scheme with a 10.1 MB public key instead of 802 MB using similar parameters as in [7]. Additionally we show how to extend the quadratic encryption technique of [7] to higher degrees, to obtain a shorter public key for the basic scheme. This paper also describes a new modulus switching technique for the DGHV scheme that makes it possible to use the new FHE framework without bootstrapping from Brakerski, Gentry and Vaikuntanathan with the DGHV scheme. Finally we describe an improved attack against the Approximate GCD Problem on which the DGHV scheme is based, with complexity O(2^ρ) instead of O(2^(3ρ/2)).
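To fix ideas about the underlying scheme being compressed, here is a toy Python sketch of DGHV-style encryption over the integers. The parameter sizes are deliberately tiny and insecure, and none of the paper's contributions (key compression, modulus switching, the improved approximate-GCD attack) are reproduced; the sketch only shows why ciphertexts add and multiply homomorphically:

```python
# Toy DGHV-style symmetric encryption over the integers:
# ciphertext c = q*p + 2*r + m for a bit m, secret odd p, and small noise r.
# Decryption: (c mod p) mod 2. Parameters below are toy-sized and insecure.
import secrets

P_BITS, Q_BITS, NOISE_BITS = 64, 96, 8

def keygen():
    # random odd p with the top bit set
    return secrets.randbits(P_BITS) | 1 | (1 << (P_BITS - 1))

def encrypt(p, m):
    q = secrets.randbits(Q_BITS)
    r = secrets.randbits(NOISE_BITS)
    return q * p + 2 * r + (m & 1)

def decrypt(p, c):
    return (c % p) % 2

if __name__ == "__main__":
    p = keygen()
    c0, c1 = encrypt(p, 0), encrypt(p, 1)
    assert decrypt(p, c0 + c1) == 1      # homomorphic XOR
    assert decrypt(p, c0 * c1) == 0      # homomorphic AND
    print("toy DGHV roundtrip OK")
```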

Journal ArticleDOI
TL;DR: In this paper, a ternary compound has been produced via the annealing of an electrodeposited precursor in a sulfur and tin sulfide environment; the obtained absorber layer has been structurally investigated by X-ray diffraction, and the results indicate that the crystal structure is monoclinic.

Journal ArticleDOI
TL;DR: It is predicted that systems approaches will empower the transition from conventional reactive medical practice to a more proactive P4 medicine focused on wellness, will reverse the escalating costs of drug development, and will have enormous social and economic benefits.
Abstract: Personalized medicine is a term for a revolution in medicine that envisions the individual patient as the central focus of healthcare in the future. The term "personalized medicine", however, fails to reflect the enormous dimensionality of this new medicine that will be predictive, preventive, personalized, and participatory, a vision of medicine we have termed P4 medicine. This reflects a paradigm change in how medicine will be practiced that is revolutionary rather than evolutionary. P4 medicine arises from the confluence of a systems approach to medicine and from the digitalization of medicine that creates the large data sets necessary to deal with the complexities of disease. We predict that systems approaches will empower the transition from conventional reactive medical practice to a more proactive P4 medicine focused on wellness, will reverse the escalating costs of drug development and will have enormous social and economic benefits. Our vision for P4 medicine in 10 years is that each patient will be associated with a virtual data cloud of billions of data points and that we will have the information technology for healthcare to reduce this enormous data dimensionality to simple hypotheses about health and/or disease for each individual. These data will be multi-scale across all levels of biological organization and extremely heterogeneous in type; this enormous amount of data represents a striking signal-to-noise (S/N) challenge. The key to dealing with this S/N challenge is to take a "holistic systems approach" to disease, as we will discuss in this article.

Journal ArticleDOI
TL;DR: In this article, a strain smoothing procedure for the extended finite element method (XFEM) is presented; it is tailored to linear elastic fracture mechanics and, in this context, is shown to outperform the standard XFEM.

Journal ArticleDOI
TL;DR: In this paper, the authors survey and assess the intrinsic links between production, factor substitution, and normalization, defined by the fixing of baseline values for relevant variables, and discuss the benefits normalization brings for empirical estimation and empirical growth research.
Abstract: The elasticity of substitution between capital and labor and, in turn, the direction of technical change are critical parameters in many fields of economics. Until recently, though, the application of production functions with specifically non-unitary substitution elasticities (i.e., non-Cobb–Douglas) was hampered by empirical and theoretical uncertainties. As recently revealed, ‘normalization’ of production-technology systems holds out the promise of resolving many of those uncertainties. We survey and assess the intrinsic links between production (as conceptualized in a production function), factor substitution (as made most explicit in Constant Elasticity of Substitution functions) and normalization (defined by the fixing of baseline values for relevant variables). First, we recall how the normalized Constant Elasticity of Substitution function came into existence and what normalization implies for its formal properties. Then we deal with the key role of normalization in recent advances in the theory of business cycles and of economic growth. Next, we discuss the benefits normalization brings for empirical estimation and empirical growth research. Finally, we identify promising areas of future research.
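For reference, the normalized CES production function at the centre of this survey takes the standard form below, with all variables expressed relative to a baseline point (Y_0, K_0, L_0) and a baseline capital income share π_0 (standard notation, stated here for orientation):

```latex
% Normalized CES production function: inputs and output are expressed relative
% to a baseline point (Y_0, K_0, L_0) with baseline capital income share pi_0;
% sigma is the elasticity of substitution between capital and labor.
\[
Y_t = Y_0 \left[ \pi_0 \left(\frac{K_t}{K_0}\right)^{\frac{\sigma-1}{\sigma}}
      + (1-\pi_0)\left(\frac{L_t}{L_0}\right)^{\frac{\sigma-1}{\sigma}}
      \right]^{\frac{\sigma}{\sigma-1}}.
\]
```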

Proceedings ArticleDOI
29 Nov 2012
TL;DR: A novel Traffic Aware Scheduling Algorithm (TASA) is conceived by extending the theoretically well-established graph theory methods of matching and coloring by means of an innovative approach based on network topology and traffic load to support emerging industrial applications requiring low latency at low duty cycle and power consumption.
Abstract: The Time Synchronized Channel Hopping (TSCH) protocol is part of the newly defined IEEE 802.15.4e standard and represents the latest generation of highly reliable low-power MAC protocols. With implementation details left open, we conceive here a novel Traffic Aware Scheduling Algorithm (TASA) by extending the theoretically well-established graph theory methods of matching and coloring by means of an innovative approach based on network topology and traffic load. TASA is able to support emerging industrial applications requiring low latency at low duty cycle and power consumption. Preliminary simulation results have also been reported to highlight the effectiveness of the proposed algorithm.
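A toy Python sketch of the matching-and-coloring idea the abstract alludes to is given below: links that share a node are assigned to different time slots, so each slot forms a matching. The greedy rule, node names and conflict definition are our illustrative assumptions, not the TASA algorithm itself:

```python
# Greedy slot assignment for a TSCH-like schedule: links that share a node
# conflict and must go into different time slots ("coloring"); the links placed
# in one slot form a matching. Purely illustrative, not TASA itself.

def conflicts(link_a, link_b):
    """Two links conflict if they share an endpoint (one radio cannot
    transmit and receive twice in the same slot)."""
    return bool(set(link_a) & set(link_b))

def greedy_schedule(links):
    """Assign each (child, parent) link to the first slot with no conflict."""
    slots = []  # each slot is a list of mutually non-conflicting links
    for link in links:
        for slot in slots:
            if not any(conflicts(link, other) for other in slot):
                slot.append(link)
                break
        else:
            slots.append([link])
    return slots

if __name__ == "__main__":
    # hypothetical convergecast tree: nodes A..E report towards sink S
    links = [("A", "S"), ("B", "S"), ("C", "A"), ("D", "A"), ("E", "B")]
    for t, slot in enumerate(greedy_schedule(links)):
        print(f"slot {t}: {slot}")
```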

Posted Content
01 Jan 2012
TL;DR: In this article, the authors describe a compression technique for van Dijk, Gentry, Halevi and Vaikuntanathan's (DGHV) fully homomorphic scheme over the integers and obtain an implementation of the full scheme with a 10.1 MB public key instead of 802 MB, using similar parameters as in [8].
Abstract: We describe a compression technique that reduces the public key size of van Dijk, Gentry, Halevi and Vaikuntanathan's (DGHV) fully homomorphic scheme over the integers from O(λ^7) to O(λ^5). Our variant remains semantically secure, but in the random oracle model. We obtain an implementation of the full scheme with a 10.1 MB public key instead of 802 MB using similar parameters as in [8]. Additionally we show how to extend the quadratic encryption technique of [8] to higher degrees, to obtain a shorter public key for the basic scheme. This paper also describes a new modulus switching technique for the DGHV scheme that makes it possible to use the new FHE framework without bootstrapping from Brakerski, Gentry and Vaikuntanathan with the DGHV scheme. Finally we describe an improved attack against the Approximate GCD Problem on which the DGHV scheme is based, with complexity O(2^ρ) instead of O(2^(3ρ/2)).

Journal ArticleDOI
TL;DR: It is demonstrated that SVA elements are mobilized in HeLa cells only in the presence of both L1-encoded proteins, ORF1p and ORF2p, and SVA trans-mobilization rates exceeded pseudogene formation frequencies by 12- to 300-fold in HeLa-HA cells, indicating that SVA elements represent a preferred substrate for L1 proteins.
Abstract: SINE-VNTR-Alu (SVA) elements are non-autonomous, hominid-specific non-LTR retrotransposons and distinguished by their organization as composite mobile elements. They represent the evolutionarily youngest, currently active family of human non-LTR retrotransposons, and sporadically generate disease-causing insertions. Since preexisting, genomic SVA sequences are characterized by structural hallmarks of Long Interspersed Elements 1 (LINE-1, L1)-mediated retrotransposition, it has been hypothesized for several years that SVA elements are mobilized by the L1 protein machinery in trans. To test this hypothesis, we developed an SVA retrotransposition reporter assay in cell culture using three different human-specific SVA reporter elements. We demonstrate that SVA elements are mobilized in HeLa cells only in the presence of both L1-encoded proteins, ORF1p and ORF2p. SVA trans-mobilization rates exceeded pseudogene formation frequencies by 12- to 300-fold in HeLa-HA cells, indicating that SVA elements represent a preferred substrate for L1 proteins. Acquisition of an AluSp element increased the trans-mobilization frequency of the SVA reporter element by ~25-fold. Deletion of (CCCTCT)n repeats and the Alu-like region of a canonical SVA reporter element caused significant attenuation of the SVA trans-mobilization rate. SVA de novo insertions were predominantly full-length, occurred preferentially in G+C-rich regions, and displayed all features of L1-mediated retrotransposition which are also observed in preexisting genomic SVA insertions.

Book ChapterDOI
10 Sep 2012
TL;DR: This paper proposes a new secure outsourcing algorithm for (variable-exponent, variable-base) exponentiation modulo a prime in the two untrusted program model and proposes the first efficient outsource-secure algorithm for simultaneous modular exponentiations.
Abstract: Modular exponentiations have been considered the most expensive operation in discrete-logarithm based cryptographic protocols. In this paper, we propose a new secure outsourcing algorithm for exponentiation modulo a prime in the one-malicious model. Compared with the state-of-the-art algorithm [33], the proposed algorithm is superior in both efficiency and checkability. We then utilize this algorithm as a subroutine to achieve outsource-secure Cramer-Shoup encryptions and Schnorr signatures. In addition, we propose the first outsource-secure and efficient algorithm for simultaneous modular exponentiations. Moreover, we prove that both algorithms can achieve the desired security notions.

Proceedings ArticleDOI
14 Jun 2012
TL;DR: This paper introduces Dexpler, a software package that converts Dalvik bytecode to Jimple; it is built on top of Dedexer and Soot.
Abstract: This paper introduces Dexpler, a software package which converts Dalvik bytecode to Jimple. Dexpler is built on top of Dedexer and Soot. As Jimple is Soot's main internal representation of code, the Dalvik bytecode can be manipulated with any Jimple based tool, for instance for performing point-to or flow analysis.

Journal ArticleDOI
TL;DR: A simple and useful thermodynamic approach to the prediction of reactions taking place during thermal treatment of layers of multinary semiconductor compounds on different substrates has been developed.
Abstract: A simple and useful thermodynamic approach to the prediction of reactions taking place during thermal treatment of layers of multinary semiconductor compounds on different substrates has been developed. The method, which uses the extensive information for the possible binary compounds to assess the stability of multinary phases, is illustrated with the examples of Cu(In,Ga)Se2 and Cu2ZnSnSe4 as well as other less-studied ternary and quaternary semiconductors that have the potential for use as absorbers in photovoltaic devices.
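The bookkeeping behind such an approach can be sketched in a few lines of Python: compare the formation enthalpy of the multinary phase with the summed formation enthalpies of a candidate set of binaries. The compound names below come from the abstract, but every numeric value is a hypothetical placeholder, not data from the paper:

```python
# Illustrative stability bookkeeping for a quaternary absorber: the phase is
# (enthalpically) stable against a decomposition route if its formation
# enthalpy lies below the summed formation enthalpies of the binary products.
# All values in H_F are HYPOTHETICAL placeholders for illustration only.

H_F = {                       # formation enthalpies per formula unit (eV)
    "Cu2ZnSnSe4": -3.5,       # hypothetical
    "Cu2Se": -0.6,            # hypothetical
    "ZnSe": -1.6,             # hypothetical
    "SnSe2": -1.2,            # hypothetical
}

def reaction_enthalpy(reactant, products):
    """dH = sum(products) - reactant; a negative value favours decomposition."""
    return sum(H_F[p] for p in products) - H_F[reactant]

if __name__ == "__main__":
    dH = reaction_enthalpy("Cu2ZnSnSe4", ["Cu2Se", "ZnSe", "SnSe2"])
    verdict = "stable" if dH > 0 else "unstable"
    print(f"dH = {dH:+.2f} eV -> {verdict} against this binary route")
```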

Journal ArticleDOI
19 Sep 2012-PLOS ONE
TL;DR: This study demonstrates the involvement of autophagic machinery in the extracellular delivery of TF in NETs and the subsequent activation of coagulation cascade, providing evidence for the implication of this process in coagulopathy and inflammatory response in sepsis.
Abstract: Background: Sepsis is associated with systemic inflammatory responses and induction of coagulation system. Neutrophil extracellular traps (NETs) constitute an antimicrobial mechanism, recently implicated in thrombosis via platelet entrapment and aggregation. Methodology/Principal Findings: In this study, we demonstrate for the first time the localization of thrombogenic tissue factor (TF) in NETs released by neutrophils derived from patients with gram-negative sepsis and normal neutrophils treated with either serum from septic patients or inflammatory mediators involved in the pathogenesis of sepsis. Localization of TF in acidified autophagosomes was observed during this process, as indicated by positive LC3B and LysoTracker staining. Moreover, phosphatidylinositol 3-kinase inhibition with 3-MA or inhibition of endosomal acidification with bafilomycin A1 hindered the release of TF-bearing NETs. TF present in NETs induced thrombin generation in culture supernatants, which further resulted in protease activated receptor-1 signaling. Conclusions/Significance: This study demonstrates the involvement of autophagic machinery in the extracellular delivery of TF in NETs and the subsequent activation of coagulation cascade, providing evidence for the implication of this process in coagulopathy and inflammatory response in sepsis.

Journal ArticleDOI
10 Dec 2012-PLOS ONE
TL;DR: This study raises the possibility that plasma RNAs of exogenous origin may serve as signaling molecules mediating, for example, the human-microbiome interaction and may affect and/or indicate the state of human health.
Abstract: Human plasma has long been a rich source for biomarker discovery. It has recently become clear that plasma RNA molecules, such as microRNA, in addition to proteins are common and can serve as biomarkers. Surveying human plasma for microRNA biomarkers using next generation sequencing technology, we observed that a significant fraction of the circulating RNA appear to originate from exogenous species. With careful analysis of sequence error statistics and other controls, we demonstrated that there is a wide range of RNA from many different organisms, including bacteria and fungi as well as from other species. These RNAs may be associated with protein, lipid or other molecules protecting them from RNase activity in plasma. Some of these RNAs are detected in intracellular complexes and may be able to influence cellular activities under in vitro conditions. These findings raise the possibility that plasma RNAs of exogenous origin may serve as signaling molecules mediating for example the human-microbiome interaction and may affect and/or indicate the state of human health.

Journal ArticleDOI
01 Feb 2012-Appetite
TL;DR: This study provides first evidence that distinct dimensions of food cravings are differentially related to success and failure in dieting, and shows that the German version of the FCQs has good psychometric properties.

Posted Content
TL;DR: A formal definition and a process theory of CPS applicable to the interdisciplinary field are presented, portraying CPS as knowledge acquisition and knowledge application concerning the goal-oriented control of systems that contain many highly interrelated elements.
Abstract: This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), and a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application concerning the goal-oriented control of systems that contain many highly interrelated elements (i.e., complex systems). The impact of implicit and explicit knowledge as well as systematic strategy selection on the solution process are discussed, emphasizing the importance of (1) information generation (due to the initial intransparency of the situation), (2) information reduction (due to the overcharging complexity of the problem’s structure), (3) model building (due to the interconnectedness of the variables), (4) dynamic decision making (due to the eigendynamics of the system), and (5) evaluation (due to many, interfering and/or ill-defined goals).

Journal ArticleDOI
TL;DR: In this review, the most commonly used model was social cognitive theory (SCT)/social learning theory (SLT), either as a single model or in combination with other behavioural models; interventions that combined high levels of parental involvement and interactive school-based learning, targeted physical activity and dietary change, and included long-term follow-up appeared most effective.
Abstract: The aim of this comprehensive systematic review was to identify the most effective behavioural models and behaviour change strategies underpinning preschool- and school-based interventions aimed at preventing obesity in 4–6-year-olds. Searching was conducted from April 1995 to April 2010 using MEDLINE, EMBASE, CINAHL, PsycINFO and The Cochrane Library. Epidemiological studies relevant to the research question with controlled assignment of participants were included in the review, if they had follow-up periods of 6 months or longer. Outcomes included markers of weight gain; markers of body composition; physical activity behaviour changes and dietary behaviour changes. Twelve studies were included in the review. The most commonly used model was social cognitive theory (SCT)/social learning theory (SLT) either as a single model or in combination with other behavioural models. Studies that used SCT/SLT in the development of the intervention had significant favourable changes in one, or more, outcome measures. In addition, interventions that (i) combined high levels of parental involvement and interactive school-based learning; (ii) targeted physical activity and dietary change; and (iii) included long-term follow-up, appeared most effective. It is suggested that interventions should also be focused on developing children's (and parents') perceived competence at making dietary and physical changes.