
Showing papers by "University of Cyprus" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4 +2519 more authors; 695 institutions
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
26 Jul 2016-eLife
TL;DR: The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
Abstract: Being taller is associated with enhanced longevity, and higher education and earnings. We reanalysed 1472 population-based studies, with measurement of height on more than 18.6 million participants to estimate mean height for people born between 1896 and 1996 in 200 countries. The largest gain in adult height over the past century has occurred in South Korean women and Iranian men, who became 20.2 cm (95% credible interval 17.5–22.7) and 16.5 cm (13.3–19.7) taller, respectively. In contrast, there was little change in adult height in some sub-Saharan African countries and in South Asia over the century of analysis. The tallest people over these 100 years are men born in the Netherlands in the last quarter of 20th century, whose average heights surpassed 182.5 cm, and the shortest were women born in Guatemala in 1896 (140.3 cm; 135.8–144.8). The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.

1,348 citations


Journal ArticleDOI
S. Adrián-Martínez1, M. Ageron2, Felix Aharonian3, Sebastiano Aiello +243 more authors; 24 institutions
TL;DR: In this article, the main objectives of the KM3NeT Collaboration are (i) the discovery and subsequent observation of high-energy neutrino sources in the Universe and (ii) the determination of the mass hierarchy of neutrinos.
Abstract: The main objectives of the KM3NeT Collaboration are (i) the discovery and subsequent observation of high-energy neutrino sources in the Universe and (ii) the determination of the mass hierarchy of neutrinos. These objectives are strongly motivated by two recent important discoveries, namely: (1) the high-energy astrophysical neutrino signal reported by IceCube and (2) the sizable contribution of electron neutrinos to the third neutrino mass eigenstate as reported by Daya Bay, Reno and others. To meet these objectives, the KM3NeT Collaboration plans to build a new Research Infrastructure consisting of a network of deep-sea neutrino telescopes in the Mediterranean Sea. A phased and distributed implementation is pursued which maximises the access to regional funds, the availability of human resources and the synergistic opportunities for the Earth and sea sciences community. Three suitable deep-sea sites are selected, namely off-shore Toulon (France), Capo Passero (Sicily, Italy) and Pylos (Peloponnese, Greece). The infrastructure will consist of three so-called building blocks. A building block comprises 115 strings, each string comprises 18 optical modules and each optical module comprises 31 photo-multiplier tubes. Each building block thus constitutes a three-dimensional array of photo sensors that can be used to detect the Cherenkov light produced by relativistic particles emerging from neutrino interactions. Two building blocks will be sparsely configured to fully explore the IceCube signal with similar instrumented volume, different methodology, improved resolution and complementary field of view, including the galactic plane. One building block will be densely configured to precisely measure atmospheric neutrino oscillations.
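The nested counts above lend themselves to a quick sanity check. Below is a minimal arithmetic sketch (Python), using only the figures quoted in the abstract, of how many photomultiplier tubes one building block and the full three-block infrastructure would contain.

```python
# Photosensor count implied by the KM3NeT building-block layout quoted above:
# 115 strings per building block, 18 optical modules per string,
# 31 photomultiplier tubes (PMTs) per optical module, 3 building blocks in total.
STRINGS_PER_BLOCK = 115
MODULES_PER_STRING = 18
PMTS_PER_MODULE = 31
BUILDING_BLOCKS = 3

pmts_per_block = STRINGS_PER_BLOCK * MODULES_PER_STRING * PMTS_PER_MODULE
print(f"PMTs per building block: {pmts_per_block}")                                 # 64170
print(f"PMTs in all {BUILDING_BLOCKS} blocks: {BUILDING_BLOCKS * pmts_per_block}")  # 192510
```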

729 citations


Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam +2283 more authors; 141 institutions
TL;DR: Combined fits to CMS UE proton–proton data at 7 TeV and to UE proton–antiproton data from the CDF experiment at lower sqrt(s) are used to study the UE models and constrain their parameters, thereby providing improved predictions for proton–proton collisions at 13 TeV.
Abstract: New sets of parameters ("tunes") for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at sqrt(s) = 7 TeV and to UE data from the CDF experiment at lower sqrt(s), are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to "minimum bias" (MB) events, multijet, and Drell-Yan (q q-bar to Z / gamma* to lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.

686 citations


Journal ArticleDOI
TL;DR: The authors developed a theory-based performance evaluation framework and examined the assessment of such performance outcomes in 998 empirical studies published in the top 15 marketing journals from 1981 through 2014, revealing a large number of different performance outcome measures used in prior empirical research that may be only weakly related to one another.
Abstract: Research in marketing has increasingly focused on building knowledge about how firms’ marketing contributes to performance outcomes. A key precursor to accurately diagnosing the value firms’ marketing creates is conceptualizing and operationalizing appropriate ways to assess performance outcomes. Yet, to date, there has been little conceptual development and no systematic examination of how researchers in marketing should conceptualize and measure the performance outcomes associated with firms’ marketing. The authors develop a theory-based performance evaluation framework and examine the assessment of such performance outcomes in 998 empirical studies published in the top 15 marketing journals from 1981 through 2014. The results reveal a large number of different performance outcome measures used in prior empirical research that may be only weakly related to one another, making it difficult to synthesize findings across studies. In addition, the authors identify significant problems in how perform...

393 citations


Journal ArticleDOI
TL;DR: It is found that the high optical absorption of a diketopyrrolopyrrole-thienothiophene copolymer can be explained by the high persistence length of the polymer, and high absorption in other polymers with high theoretical persistence length is demonstrated.
Abstract: The specific optical absorption of an organic semiconductor is critical to the performance of organic optoelectronic devices. For example, higher light-harvesting efficiency can lead to higher photocurrent in solar cells that are limited by sub-optimal electrical transport. Here, we compare over 40 conjugated polymers, and find that many different chemical structures share an apparent maximum in their extinction coefficients. However, a diketopyrrolopyrrole-thienothiophene copolymer shows remarkably high optical absorption at relatively low photon energies. By investigating its backbone structure and conformation with measurements and quantum chemical calculations, we find that the high optical absorption can be explained by the high persistence length of the polymer. Accordingly, we demonstrate high absorption in other polymers with high theoretical persistence length. Visible light harvesting may be enhanced in other conjugated polymers through judicious design of the structure.

283 citations


Journal Article
TL;DR: There is a lack of adequate empirical evidence in terms of the effectiveness of the frameworks proposed herein, but it is expected that the knowledge and research base will dramatically increase over the next several years, as more countries around the world add computer science as a separate school subject to their K-6 curriculum.
Abstract: Adding computer science as a separate school subject to the core K-6 curriculum is a complex issue with educational challenges. The authors herein address two of these challenges: (1) the design of the curriculum based on a generic computational thinking framework, and (2) the knowledge teachers need to teach the curriculum. The first issue is discussed within a perspective of designing an authentic computational thinking curriculum with a focus on real-world problems. The second issue is addressed within the framework of technological pedagogical content knowledge explicating in detail the body of knowledge that teachers need to have to be able to teach computational thinking in a K-6 environment. An example of how these ideas can be applied in practice is also given. While it is recognized there is a lack of adequate empirical evidence in terms of the effectiveness of the frameworks proposed herein, it is expected that our knowledge and research base will dramatically increase over the next several years, as more countries around the world add computer science as a separate school subject to their K-6 curriculum.

257 citations


Journal ArticleDOI
TL;DR: The significant role of UV at high pH and of CO3(•−) in OTC removal from contaminated water was demonstrated both kinetically and mechanistically.

216 citations


Posted Content
TL;DR: In this paper, the authors combine image-based Fully Convolutional Networks (FCNs) and surface-based CRFs to yield coherent segmentations of 3D shapes, which significantly outperforms the state-of-the-art methods in the currently largest segmentation benchmark (ShapeNet).
Abstract: This paper introduces a deep architecture for segmenting 3D objects into their labeled semantic parts. Our architecture combines image-based Fully Convolutional Networks (FCNs) and surface-based Conditional Random Fields (CRFs) to yield coherent segmentations of 3D shapes. The image-based FCNs are used for efficient view-based reasoning about 3D object parts. Through a special projection layer, FCN outputs are effectively aggregated across multiple views and scales, then are projected onto the 3D object surfaces. Finally, a surface-based CRF combines the projected outputs with geometric consistency cues to yield coherent segmentations. The whole architecture (multi-view FCNs and CRF) is trained end-to-end. Our approach significantly outperforms the existing state-of-the-art methods in the currently largest segmentation benchmark (ShapeNet). Finally, we demonstrate promising segmentation results on noisy 3D shapes acquired from consumer-grade depth cameras.
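To make the projection-and-aggregation idea concrete, the following toy sketch (not the authors' code, and far simpler than their trained projection layer) assumes hypothetical inputs: per-view class-probability maps from an FCN and a precomputed pixel-to-surface-point index for each view. It shows only the view-pooling step, taking a max over all views and pixels that see a given surface point, before any CRF refinement.

```python
import numpy as np

def aggregate_view_predictions(view_probs, pixel_to_point, num_points):
    """Toy version of projecting per-view FCN outputs onto a 3D surface.

    view_probs:     (V, H, W, K) per-view, per-pixel class probabilities.
    pixel_to_point: (V, H, W) index of the surface point seen by each pixel,
                    or -1 where the pixel shows background (hypothetical format).
    Returns (num_points, K) per-point scores, max-pooled across all views/pixels.
    """
    V, H, W, K = view_probs.shape
    point_scores = np.zeros((num_points, K))
    for v in range(V):
        idx = pixel_to_point[v].ravel()            # (H*W,)
        probs = view_probs[v].reshape(-1, K)       # (H*W, K)
        valid = idx >= 0
        # Max-pool each visible point's score across contributing pixels/views.
        np.maximum.at(point_scores, idx[valid], probs[valid])
    return point_scores

# Tiny synthetic example: 2 views, 4x4 images, 3 classes, 5 surface points.
rng = np.random.default_rng(0)
probs = rng.random((2, 4, 4, 3))
mapping = rng.integers(-1, 5, size=(2, 4, 4))
scores = aggregate_view_predictions(probs, mapping, num_points=5)
labels = scores.argmax(axis=1)                     # per-point labels before CRF smoothing
print(labels)
```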

200 citations


Journal ArticleDOI
TL;DR: The research field of giant clusters is under continuous evolution, and their intriguing structural characteristics and magnetic properties, which attract the interest of synthetic inorganic chemists, promise a brilliant future for this class of compounds.
Abstract: In this review, aspects of the syntheses, structures and magnetic properties of giant 3d and 3d/4f paramagnetic metal clusters in moderate oxidation states are discussed. The term "giant clusters" is used herein to denote metal clusters with nuclearity of 30 or greater. Many synthetic strategies towards such species have been developed and are discussed in this paper. Attempts are made to categorize some of the most successful methods to giant clusters, but it will be pointed out that the characteristics of the crystal structures of such compounds including nuclearity, shape, architecture, etc. are unpredictable depending on the specific structural features of the included organic ligands, reaction conditions and other factors. The majority of the described compounds in this review are of special interest not only for their fascinating nanosized structures but also because they sometimes display interesting magnetic phenomena, such as ferromagnetic exchange interactions, large ground state spin values, single-molecule magnetism behaviour or impressively large magnetocaloric effects. In addition, they often possess the properties of both the quantum and the classical world, and thus their systematic study offers the potential for the discovery of new physical phenomena, as well as a better understanding of the existing ones. The research field of giant clusters is under continuous evolution, and their intriguing structural characteristics and magnetic properties, which attract the interest of synthetic inorganic chemists, promise a brilliant future for this class of compounds.

192 citations


Journal ArticleDOI
TL;DR: This paper serves as a survey of work performed in the area of mobile phone computing combined with the IoT/WoT, presenting a selection of over 100 papers that constitute the most significant work in the field to date.
Abstract: As the Internet of Things (IoT) and the Web of Things (WoT) are becoming a reality, their interconnection with mobile phone computing is increasing. Mobile phone integrated sensors offer advanced services, which when combined with Web-enabled real-world devices located near the mobile user (e.g., body area networks, radio-frequency identification tags, energy monitors, environmental sensors, etc.), have the potential of enhancing the overall user knowledge, perception and experience, encouraging more informed choices and better decisions. This paper serves as a survey of the most significant work performed in the area of mobile phone computing combined with the IoT/WoT. A selection of over 100 papers is presented, which constitute the most significant work in the field up to date, categorizing these papers into ten different domains, according to the area of application (i.e., health, sports, gaming, transportation, and agriculture), the nature of interaction (i.e., participatory sensing, eco-feedback, actuation, and control), or the communicating actors involved (i.e., things and people). Open issues and research challenges are identified, analyzed and discussed.

Journal ArticleDOI
TL;DR: In this article, the relationship between gender diversity and environmental sustainability is investigated, and the authors find that both demographic and structural gender diversity are significant predictors of a firm's environmental sustainability initiatives.
Abstract: In this paper, we investigate the relationship between gender and environmental sustainability. Based on a sample of 296 firms, drawn from the population of US publicly traded firms over a five-year period, we empirically test whether firms that have (1) more gender diverse boards of directors and (2) more policies and practices that enable or reinforce gender diversity throughout the organization, adopted more environmentally responsible policies and practices. We find that both ‘demographic’ and ‘structural’ gender diversity are significant predictors of a firm's environmental sustainability initiatives. Our findings show gender diversity is a sustainability issue as well. Copyright © 2016 John Wiley & Sons, Ltd and ERP Environment

Journal ArticleDOI
TL;DR: The Synergy Expert Group comprised world-leading researchers in health and social psychology and behavioural medicine who convened to discuss priority issues in planning interventions in health contexts and develop a set of recommendations for future research and practice.
Abstract: The current article details a position statement and recommendations for future research and practice on planning and implementation intentions in health contexts endorsed by the Synergy Expert Group. The group comprised world-leading researchers in health and social psychology and behavioural medicine who convened to discuss priority issues in planning interventions in health contexts and develop a set of recommendations for future research and practice. The expert group adopted a nominal groups approach and voting system to elicit and structure priority issues in planning interventions and implementation intentions research. Forty-two priority issues identified in initial discussions were further condensed to 18 key issues, including definitions of planning and implementation intentions and 17 priority research areas. Each issue was subjected to voting for consensus among group members and formed the basis of the position statement and recommendations. Specifically, the expert group endorsed statements and recommendations in the following areas: generic definition of planning and specific definition of implementation intentions, recommendations for better testing of mechanisms, guidance on testing the effects of moderators of planning interventions, recommendations on the social aspects of planning interventions, identification of the preconditions that moderate effectiveness of planning interventions and recommendations for research on how people use plans.

Journal ArticleDOI
TL;DR: The relevance of different transepts of the urban water cycle on the potential enrichment and spread of antibiotic resistance is reviewed and some gaps of knowledge, research needs, and control measures are suggested.
Abstract: Over the last decade, numerous evidences have contributed to establish a link between the natural and human-impacted environments and the growing public health threat that is the antimicrobial resistance. In the environment, in particular in areas subjected to strong anthropogenic pressures, water plays a major role on the transformation and transport of contaminants including antibiotic residues, antibiotic-resistant bacteria, and antibiotic resistance genes. Therefore, the urban water cycle, comprising water abstraction, disinfection, and distribution for human consumption, and the collection, treatment, and delivery of wastewater to the environment, is a particularly interesting loop to track the fate of antibiotic resistance in the environment and to assess the risks of its transmission back to humans. In this article, the relevance of different transepts of the urban water cycle on the potential enrichment and spread of antibiotic resistance is reviewed. According to this analysis, some gaps of knowledge, research needs, and control measures are suggested. The critical rationale behind the measures suggested and the desirable involvement of some key action players is also discussed.

Journal ArticleDOI
TL;DR: In the cosmological scenario of $f(T)$ gravity, exact solutions are derived for an isotropic and homogeneous universe containing a dust fluid and radiation, and for an empty anisotropic Bianchi I universe.
Abstract: In the cosmological scenario in $f(T)$ gravity, we find analytical solutions for an isotropic and homogeneous universe containing a dust fluid and radiation and for an empty anisotropic Bianchi I universe. The method that we apply is that of movable singularities of differential equations. For the isotropic universe, the solutions are expressed in terms of a Laurent expansion, while for the anisotropic universe we find a family of exact Kasner-like solutions in vacuum. Finally, we discuss when a nonlinear $f(T)$-gravity theory provides solutions for the teleparallel equivalence of general relativity and derive conditions for exact solutions of general relativity to solve the field equations of an $f(T)$ theory.
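For context, the kind of field equation such cosmological analyses start from can be written schematically as below; this is a generic reminder of the spatially flat FRW case in one common convention (torsion scalar $T=-6H^2$), not necessarily the exact equations or sign conventions used in the paper.

```latex
% Schematic f(T) field equation for a flat FRW universe with Hubble rate H,
% assuming the convention T = -6H^2 (conventions vary between papers):
\[
  12 H^{2} f_{T}(T) + f(T) = 16\pi G \rho ,
  \qquad T = -6H^{2},
\]
% which reduces to the usual Friedmann equation 3H^2 = 8\pi G \rho when f(T) = T.
```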

Journal ArticleDOI
TL;DR: In this article, the authors consider a full-duplex (FD) decode-and-forward system where the time-switching protocol is employed by the multiantenna relay to receive energy from the source and transmit information to the destination.
Abstract: We consider a full-duplex (FD) decode-and-forward system in which the time-switching protocol is employed by the multiantenna relay to receive energy from the source and transmit information to the destination. The instantaneous throughput is maximized by optimizing receive and transmit beamformers at the relay and the time-split parameter. We study both optimum and suboptimum schemes. The reformulated problem in the optimum scheme achieves closed-form solutions in terms of transmit beamformer for some scenarios. In other scenarios, the optimization problem is formulated as a semidefinite relaxation problem and a rank-one optimum solution is always guaranteed. In the suboptimum schemes, the beamformers are obtained using maximum ratio combining, zero-forcing, and maximum ratio transmission. When beamformers have closed-form solutions, the achievable instantaneous and delay-constrained throughput are analytically characterized. Our results reveal that beamforming increases both the energy harvesting and loop interference suppression capabilities at the FD relay. Moreover, simulation results demonstrate that the choice of the linear processing scheme as well as the time-split plays a critical role in determining the FD gains.
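A deliberately simplified, generic time-switching model (single antenna, no loop interference, illustrative parameter values for η, P_s and the channel gains, so not the paper's actual formulation) can illustrate why the time-split parameter matters: a fraction α of each block is used to harvest energy at the relay and the remainder to forward data, and the instantaneous throughput is maximized over α by a simple grid search.

```python
import numpy as np

def ts_relay_throughput(alpha, P_s=1.0, eta=0.6, g_sr=0.8, g_rd=0.5, noise=1e-2):
    """Simplified time-switching relay throughput (single-antenna toy model).

    alpha : fraction of the block used for energy harvesting at the relay.
    The relay harvests E = eta * P_s * g_sr * alpha, then transmits for the
    remaining (1 - alpha) of the block with power E / (1 - alpha).
    Decode-and-forward rate = (1 - alpha) * log2(1 + min(SNR_sr, SNR_rd)).
    """
    if alpha <= 0.0 or alpha >= 1.0:
        return 0.0
    harvested = eta * P_s * g_sr * alpha
    p_relay = harvested / (1.0 - alpha)
    snr_sr = P_s * g_sr / noise          # source -> relay link
    snr_rd = p_relay * g_rd / noise      # relay -> destination link
    return (1.0 - alpha) * np.log2(1.0 + min(snr_sr, snr_rd))

# Grid search over the time-split parameter.
alphas = np.linspace(0.01, 0.99, 99)
rates = np.array([ts_relay_throughput(a) for a in alphas])
best = alphas[rates.argmax()]
print(f"best time split alpha ~ {best:.2f}, throughput ~ {rates.max():.2f} bits/s/Hz")
```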

Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam2 +2249 more authors; 180 institutions
TL;DR: In this article, a search for narrow resonances in proton-proton collisions at sqrt(s) = 13 TeV is presented, and the invariant mass distribution of the two leading jets is measured with the CMS detector using a data set corresponding to an integrated luminosity of 2.4 inverse femtobarns.
Abstract: A search for narrow resonances in proton-proton collisions at sqrt(s) = 13 TeV is presented. The invariant mass distribution of the two leading jets is measured with the CMS detector using a data set corresponding to an integrated luminosity of 2.4 inverse femtobarns. The highest observed dijet mass is 6.1 TeV. The distribution is smooth and no evidence for resonant particles is observed. Upper limits at 95% confidence level are set on the production cross section for narrow resonances with masses above 1.5 TeV. When interpreted in the context of specific models, the limits exclude string resonances with masses below 7.0 TeV, scalar diquarks below 6.0 TeV, axigluons and colorons below 5.1 TeV, excited quarks below 5.0 TeV, color-octet scalars below 3.1 TeV, and W' bosons below 2.6 TeV. These results significantly extend previously published limits.

Journal ArticleDOI
Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam2 +2255 more authors; 183 institutions
TL;DR: In this paper, the results on two-particle angular correlations for charged particles produced in pp collisions at a center-of-mass energy of 13 TeV were presented, and the data were taken with the CMS detector at the LHC.
Abstract: Results on two-particle angular correlations for charged particles produced in pp collisions at a center-of-mass energy of 13 TeV are presented. The data were taken with the CMS detector at the LHC and correspond to an integrated luminosity of about 270 nb^(−1). The correlations are studied over a broad range of pseudorapidity (|η| < 2.4) and full azimuth (ϕ) as a function of charged particle multiplicity and transverse momentum (pT). In high-multiplicity events, a long-range (|Δη| > 2.0), near-side (Δϕ ≈ 0) structure emerges in the two-particle Δη–Δϕ correlation functions. The magnitude of the correlation exhibits a pronounced maximum in the range 1.0 < pT < 2.0 GeV/c.

Journal ArticleDOI
TL;DR: Although transmitted drug resistance was highest for nucleoside reverse transcriptase inhibitors, the impact of baseline mutation patterns on susceptibility to antiretroviral drugs is largest for nonnucleoside reverse transcriptase inhibitors.
Abstract: Background. Numerous studies have shown that baseline drug resistance patterns may influence the outcome of antiretroviral therapy. Therefore, guidelines recommend drug resistance testing to guide the choice of initial regimen. In addition to optimizing individual patient management, these baseline resistance data enable transmitted drug resistance (TDR) to be surveyed for public health purposes. The SPREAD program systematically collects data to gain insight into TDR occurring in Europe since 2001. Methods. Demographic, clinical, and virological data from 4140 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from 26 countries who were newly diagnosed between 2008 and 2010 were analyzed. Evidence of TDR was defined using the WHO list for surveillance of drug resistance mutations. Prevalence of TDR was assessed over time by comparing the results to SPREAD data from 2002 to 2007. Baseline susceptibility to antiretroviral drugs was predicted using the Stanford HIVdb program version 7.0. Results. The overall prevalence of TDR did not change significantly over time and was 8.3% (95% confidence interval, 7.2%-9.5%) in 2008-2010. The most frequent indicators of TDR were nucleoside reverse transcriptase inhibitor (NRTI) mutations (4.5%), followed by nonnucleoside reverse transcriptase inhibitor (NNRTI) mutations (2.9%) and protease inhibitor mutations (2.0%). Baseline mutations were most predictive of reduced susceptibility to initial NNRTI-based regimens: 4.5% and 6.5% of patient isolates were predicted to have resistance to regimens containing efavirenz or rilpivirine, respectively, independent of current NRTI backbones. Conclusions. Although TDR was highest for NRTIs, the impact of baseline drug resistance patterns on susceptibility was largest for NNRTIs. The prevalence of TDR assessed by epidemiological surveys does not clearly indicate to what degree susceptibility to different drug classes is affected.
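As a rough numerical cross-check of the headline figure, the sketch below recomputes a 95% confidence interval for a prevalence of about 8.3% among 4140 individuals using the Clopper-Pearson exact binomial method; the paper's own interval method is not stated here, so this is only an illustrative approximation.

```python
from scipy import stats

n = 4140                       # antiretroviral-naive individuals analyzed (from the abstract)
k = round(0.083 * n)           # approximate number with transmitted drug resistance (~344)

# Clopper-Pearson (exact binomial) 95% confidence interval for the prevalence.
alpha = 0.05
lower = stats.beta.ppf(alpha / 2, k, n - k + 1)
upper = stats.beta.ppf(1 - alpha / 2, k + 1, n - k)
print(f"prevalence ~ {k / n:.1%}, 95% CI ~ ({lower:.1%}, {upper:.1%})")
# Prints roughly 8.3% with an interval in the same ballpark as the reported 7.2%-9.5%.
```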

Journal ArticleDOI
TL;DR: In this article, the IEEE PES working group on cascading failure reviews and synthesizes how benchmarking and validation can be done for cascading analysis, summarizes and reviews the cascading test cases that are available to the international community, and makes recommendations for improving the state of the art.
Abstract: Cascading failure in electric power systems is a complicated problem for which a variety of models, software tools, and analytical tools have been proposed but are difficult to verify. Benchmarking and validation are necessary to understand how closely a particular modeling method corresponds to reality, what engineering conclusions may be drawn from a particular tool, and what improvements need to be made to the tool in order to reach valid conclusions. The community needs to develop the test cases tailored to cascading that are central to practical benchmarking and validation. In this paper, the IEEE PES working group on cascading failure reviews and synthesizes how benchmarking and validation can be done for cascading failure analysis, summarizes and reviews the cascading test cases that are available to the international community, and makes recommendations for improving the state of the art.

Journal ArticleDOI
TL;DR: This work investigates the satisfactory location of the sources outside the closure of the domain of the problem under consideration by means of two algorithms, one based on the satisfaction of the boundary conditions and one based on the leave-one-out cross validation algorithm, and obtains source locations that lead to highly accurate results at a relatively low cost.
Abstract: The satisfactory location for the sources outside the closure of the domain of the problem under consideration remains one of the major issues in the application of the method of fundamental solutions (MFS). In this work we investigate this issue by means of two algorithms, one based on the satisfaction of the boundary conditions and one based on the leave-one-out cross validation algorithm. By applying these algorithms to several numerical examples for the Laplace and biharmonic equations in a variety of geometries in two and three dimensions, we obtain locations of the sources which lead to highly accurate results, at a relatively low cost.
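To make the setting concrete, here is a minimal self-contained sketch (not the authors' code) of the MFS for the 2D Laplace equation on the unit disk: sources are placed on a circle of radius R outside the domain, the coefficients are fitted to Dirichlet boundary data by least squares, and the boundary misfit is compared for a few candidate values of R, loosely in the spirit of the first algorithm mentioned in the abstract.

```python
import numpy as np

def mfs_laplace_disk(R, n_src=40, n_col=80):
    """MFS for Laplace's equation on the unit disk with exact solution u = x^2 - y^2.

    Sources sit on a circle of radius R > 1; collocation points on the unit circle.
    Returns (boundary residual, max interior error) for this source radius.
    """
    t_src = 2 * np.pi * np.arange(n_src) / n_src
    t_col = 2 * np.pi * np.arange(n_col) / n_col
    src = R * np.column_stack([np.cos(t_src), np.sin(t_src)])        # source points
    col = np.column_stack([np.cos(t_col), np.sin(t_col)])            # boundary collocation

    exact = lambda p: p[:, 0] ** 2 - p[:, 1] ** 2                    # harmonic test solution
    # 2D fundamental solution of the Laplacian: G(x, y) = -ln|x - y| / (2*pi).
    G = lambda x, y: -np.log(np.linalg.norm(x[:, None] - y[None], axis=2)) / (2 * np.pi)

    A = G(col, src)
    coeffs, *_ = np.linalg.lstsq(A, exact(col), rcond=None)
    residual = np.linalg.norm(A @ coeffs - exact(col))               # boundary misfit

    # Check accuracy at a ring of interior points.
    interior = 0.5 * np.column_stack([np.cos(t_col), np.sin(t_col)])
    err = np.max(np.abs(G(interior, src) @ coeffs - exact(interior)))
    return residual, err

for R in (1.5, 3.0, 6.0):
    res, err = mfs_laplace_disk(R)
    print(f"R = {R:3.1f}  boundary residual = {res:.2e}  max interior error = {err:.2e}")
```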

Journal ArticleDOI
Khachatryan1, Albert M. Sirunyan2, Armen Tumasyan2, Wolfgang Adam +2332 more authors; 183 institutions
TL;DR: In this article, a search for single top quark production in the s channel in proton-proton collisions with the CMS detector at the CERN LHC in decay modes of the top quark containing a muon or an electron in the final state is presented.
Abstract: A search is presented for single top quark production in the s channel in proton-proton collisions with the CMS detector at the CERN LHC in decay modes of the top quark containing a muon or an electron in the final state. The signal is extracted through a maximum-likelihood fit to the distribution of a multivariate discriminant defined using boosted decision trees to separate the expected signal contribution from background processes. Data collected at centre-of-mass energies of 7 and 8 TeV yield cross sections of 7.1 +/- 8.1 pb and 13.4 +/- 7.3 pb, respectively, and a best fit value of 2.0 +/- 0.9 for the combined ratio of the measured and expected values. The signal significance is 2.5 standard deviations, and the upper limit on the rate relative to the standard model expectation is 4.7 at 95% confidence level.

Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam2 +2287 more authors; 178 institutions
TL;DR: In this article, the angular distribution and the differential branching fraction of the decay B0 to K*0(892) mu mu are studied using data corresponding to an integrated luminosity of 20.5 inverse femtobarns collected with the CMS detector at the LHC in pp collisions at sqrt(s) = 8 TeV.

Journal ArticleDOI
Jean Bousquet, Peter Hellings1, Ioana Agache2, A. Bedbrook +315 more authors; 141 institutions
TL;DR: The aim of the novel ARIA approach is to provide an active and healthy life to rhinitis sufferers, whatever their age, sex or socio-economic status, in order to reduce health and social inequalities incurred by the disease.
Abstract: The Allergic Rhinitis and its Impact on Asthma (ARIA) initiative commenced during a World Health Organization workshop in 1999. The initial goals were (1) to propose a new allergic rhinitis classification, (2) to promote the concept of multi-morbidity in asthma and rhinitis and (3) to develop guidelines with all stakeholders that could be used globally for all countries and populations. ARIA-disseminated and implemented in over 70 countries globally-is now focusing on the implementation of emerging technologies for individualized and predictive medicine. MASK [MACVIA (Contre les Maladies Chroniques pour un Vieillissement Actif)-ARIA Sentinel NetworK] uses mobile technology to develop care pathways for the management of rhinitis and asthma by a multi-disciplinary group and by patients themselves. An app (Android and iOS) is available in 20 countries and 15 languages. It uses a visual analogue scale to assess symptom control and work productivity as well as a clinical decision support system. It is associated with an inter-operable tablet for physicians and other health care professionals. The scaling up strategy uses the recommendations of the European Innovation Partnership on Active and Healthy Ageing. The aim of the novel ARIA approach is to provide an active and healthy life to rhinitis sufferers, whatever their age, sex or socio-economic status, in order to reduce health and social inequalities incurred by the disease.

Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam2 +2239 more authors; 171 institutions
TL;DR: In this paper, a search for the resonant production of high-mass photon pairs is presented based on samples of proton-proton collision data collected by the CMS experiment at center-of-mass energies of 8 and 13 TeV, corresponding to integrated luminosities of 19.7 and 3.3 fb(-1).
Abstract: A search for the resonant production of high-mass photon pairs is presented. The analysis is based on samples of proton-proton collision data collected by the CMS experiment at center-of-mass energies of 8 and 13 TeV, corresponding to integrated luminosities of 19.7 and 3.3 fb(-1), respectively. The interpretation of the search results focuses on spin-0 and spin-2 resonances with masses between 0.5 and 4 TeV and with widths, relative to the mass, between 1.4 x 10(-4) and 5.6 x 10(-2). Limits are set on scalar resonances produced through gluon-gluon fusion, and on Randall-Sundrum gravitons. A modest excess of events compatible with a narrow resonance with a mass of about 750 GeV is observed. The local significance of the excess is approximately 3.4 standard deviations. The significance is reduced to 1.6 standard deviations once the effect of searching under multiple signal hypotheses is considered. More data are required to determine the origin of this excess.

Journal ArticleDOI
TL;DR: The iLIR database is developed, listing all the putative canonical LIRCPs identified in silico in the proteomes of 8 model organisms using the iLIR server; a curated text-mining analysis of the literature also permitted the identification of novel putative LIRCPs in mammals that have not been associated with autophagy.
Abstract: Atg8-family proteins are the best-studied proteins of the core autophagic machinery. They are essential for the elongation and closure of the phagophore into a proper autophagosome. Moreover, Atg8-family proteins are associated with the phagophore from the initiation of the autophagic process to, or just prior to, the fusion of autophagosomes with lysosomes. In addition to their implication in autophagosome biogenesis, they are crucial for selective autophagy through their ability to interact with selective autophagy receptor proteins necessary for the specific targeting of substrates for autophagic degradation. In the past few years it has been revealed that Atg8-interacting proteins include not only receptors but also components of the core autophagic machinery, proteins associated with vesicles and their transport, and specific proteins that are selectively degraded by autophagy. Atg8-interacting proteins contain a short linear LC3-interacting region/LC3 recognition sequence/Atg8-interacting motif (LIR/LRS/AIM) motif which is responsible for their interaction with Atg8-family proteins. These proteins are referred to as LIR-containing proteins (LIRCPs). So far, many experimental efforts have been carried out to identify new LIRCPs, leading to the characterization of some of them in the past 10 years. Given the need for the identification of LIRCPs in various organisms, we developed the iLIR database ( https://ilir.warwick.ac.uk ) as a freely available web resource, listing all the putative canonical LIRCPs identified in silico in the proteomes of 8 model organisms using the iLIR server, combined with a Gene Ontology (GO) term analysis. Additionally, a curated text-mining analysis of the literature permitted us to identify novel putative LIRCPs in mammals that have not previously been associated with autophagy.
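For illustration, spotting candidate canonical LIR motifs in a protein sequence can start from the well-known core consensus [W/F/Y]-x-x-[L/I/V]; the sketch below performs only that naive scan and does not reproduce the extended motif and position-specific scoring that the iLIR server itself uses.

```python
import re

# Canonical LIR/AIM core consensus: an aromatic residue (W/F/Y), two arbitrary
# residues, then a hydrophobic residue (L/I/V). This is only the naive core
# pattern; iLIR itself scores an extended motif with a PSSM.
LIR_CORE = re.compile(r"(?=([WFY]..[LIV]))")

def find_lir_candidates(sequence):
    """Return (1-based position, motif) pairs for every canonical LIR core match."""
    return [(m.start() + 1, m.group(1)) for m in LIR_CORE.finditer(sequence.upper())]

# A fragment around the p62/SQSTM1 LIR (core motif WTHL); flanking residues are illustrative.
example = "SGGDDDWTHLSSKEVDPSTGELQSL"
print(find_lir_candidates(example))   # [(7, 'WTHL')]
```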

Journal ArticleDOI
TL;DR: In this article, a search for fermionic top quark partners T of charge 2/3 was carried out in proton-proton collisions corresponding to an integrated luminosity of 19.7 inverse femtobarns collected at a center-of-mass energy of sqrt(s) = 8 TeV with the CMS detector at the LHC.
Abstract: A search for fermionic top quark partners T of charge 2/3 is presented. The search is carried out in proton-proton collisions corresponding to an integrated luminosity of 19.7 inverse femtobarns collected at a center-of-mass energy of sqrt(s) = 8 TeV with the CMS detector at the LHC. The T quarks are assumed to be produced strongly in pairs and can decay into tH, tZ, and bW. The search is performed in five exclusive channels: a single-lepton channel, a multilepton channel, two all-hadronic channels optimized either for the bW or the tH decay, and one channel in which the Higgs boson decays into two photons. The results are found to be compatible with the standard model expectations in all the investigated final states. A statistical combination of these results is performed and lower limits on the T quark mass are set. Depending on the branching fractions, lower mass limits between 720 and 920 GeV at 95% confidence level are found. These are among the strongest limits on vector-like T quarks obtained to date.

Journal ArticleDOI
16 Sep 2016-PLOS ONE
TL;DR: The microattribution approach is implemented to assess the pharmacogenomic biomarker allelic spectrum in 18 European populations, mostly from developing European countries, by analyzing 1,931 pharmacogenomic biomarkers in 231 genes, showing significant inter-population pharmacogenomic biomarker allele frequency differences.
Abstract: Pharmacogenomics aims to correlate inter-individual differences of drug efficacy and/or toxicity with the underlying genetic composition, particularly in genes encoding for protein factors and enzymes involved in drug metabolism and transport. In several European populations, particularly in countries with lower income, information related to the prevalence of pharmacogenomic biomarkers is incomplete or lacking. Here, we have implemented the microattribution approach to assess the pharmacogenomic biomarkers allelic spectrum in 18 European populations, mostly from developing European countries, by analyzing 1,931 pharmacogenomics biomarkers in 231 genes. Our data show significant inter-population pharmacogenomic biomarker allele frequency differences, particularly in 7 clinically actionable pharmacogenomic biomarkers in 7 European populations, affecting drug efficacy and/or toxicity of 51 medication treatment modalities. These data also reflect on the differences observed in the prevalence of high-risk genotypes in these populations, as far as common markers in the CYP2C9, CYP2C19, CYP3A5, VKORC1, SLCO1B1 and TPMT pharmacogenes are concerned. Also, our data demonstrate notable differences in predicted genotype-based warfarin dosing among these populations. Our findings can be exploited not only to develop guidelines for medical prioritization, but most importantly to facilitate integration of pharmacogenomics and to support pre-emptive pharmacogenomic testing. This may subsequently contribute towards significant cost-savings in the overall healthcare expenditure in the participating countries, where pharmacogenomics implementation proves to be cost-effective.

Journal ArticleDOI
TL;DR: Improved techniques are used to evaluate the disconnected quark loops to sufficient accuracy to determine the strange and charm nucleon σ terms in addition to the light quark content σ_{πN}.
Abstract: We evaluate the light, strange, and charm scalar content of the nucleon using one lattice QCD ensemble generated with two degenerate light quarks with mass fixed to their physical value. We use improved techniques to evaluate the disconnected quark loops to sufficient accuracy to determine the strange and charm nucleon σ terms in addition to the light quark content σ_{πN}. We find σ_{πN}=37.2(2.6)(4.7/2.9) MeV, σ_{s}=41.1(8.2)(7.8/5.8) MeV, and σ_{c}=79(21)(12/8) MeV, where the first error is statistical and the second is the systematic error due to the determination of the lattice spacing, the assessment of finite volume, and residual excited state effects.
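When results are quoted with separate statistical and asymmetric systematic uncertainties, one common (though not universal) convention is to combine them in quadrature; the sketch below applies that assumption to the three σ values above purely as an illustration, and is not necessarily how the authors combine their uncertainties.

```python
import math

# Sigma terms from the abstract: (central value, statistical, +systematic, -systematic), in MeV.
results = {
    "sigma_piN": (37.2, 2.6, 4.7, 2.9),
    "sigma_s":   (41.1, 8.2, 7.8, 5.8),
    "sigma_c":   (79.0, 21.0, 12.0, 8.0),
}

for name, (val, stat, sys_up, sys_dn) in results.items():
    # Naive quadrature combination of statistical and systematic uncertainties (an assumption).
    up = math.hypot(stat, sys_up)
    dn = math.hypot(stat, sys_dn)
    print(f"{name}: {val} +{up:.1f} / -{dn:.1f} MeV")
```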

Journal ArticleDOI
TL;DR: The LEAP Group obtained scientific qualification advice from the European Medicines Agency on the population selection criteria, clinical end points and biomarker methodologies to be used, and whether data generated in this study would be accepted in regulatory decisions for future clinical trials.
Abstract: Autism spectrum disorder (ASD) is one of the most common neurodevelopmental disorders, but effective medical treatments for the core symptoms of the disorder are still lacking. According to the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (DSM-5), the core symptoms of ASD comprise deficits in social communication and interaction, and repetitive and restricted behaviours, which include sensory abnormalities. Novel genetic and preclinical approaches now provide unprecedented opportunities to identify the underpinning pathophysiological mechanisms and aetiology-based treatment targets, as discussed in a Review article by Ghosh et al. (Drug discovery for autism spectrum disorder: challenges and opportunities. Nat. Rev. Drug Discov. 12, 777–790 (2013))1. This has led to more interest from the pharmaceutical industry in an area in which the overall risk of failure is seen as very high because key parameters of drug efficiency are not yet established and the regulatory environment is uncertain. For example, industry has recently invested in several pre-competitive projects, such as the European Union (EU) Innovative Medicines Initiative (IMI)-brokered public–private partnership EU-AIMS (European Autism Interventions — A Multicentre Study for Developing New Medications)2. However, even when new compounds that show preclinical promise for ASD are found, there are still considerable challenges in testing them in clinical trials. For instance, the current practice of testing treatments in clinically and biologically heterogeneous patient groups hampers the ability of investigators to detect potentially significant efficacy signals in specific subgroups who 'respond'. Therefore, we need biomarkers that stratify patient populations according to distinct biological subtypes. So far, the identification and validation of biomarkers has been limited by studies with small sample sizes that have insufficient power and/or because studies use different (and often not standardized) measures. We also need quantifiable, reproducible outcome measures — including surrogate end points — that are sensitive to change, in order to assess treatment efficacy. Currently, the EU-AIMS Longitudinal European Autism Project (LEAP) is the worldwide largest multicentre, multidisciplinary study to identify stratification biomarkers for ASD and biomarkers that may serve as surrogate end points. In total, the study will include approximately 450 individuals with ASD between the ages of 6 and 30 years, and 350 control participants with typical development or mild intellectual disabilities. All participants are comprehensively characterized in terms of their clinical symptom profile, comorbidities, quality of life, level of adaptive function, neurocognitive profile, brain structure and function (assessed using structural magnetic resonance imaging (sMRI), functional MRI (fMRI) and electroencephalogram (EEG)), biochemical biomarkers, prenatal environmental risk factors and genomics (see Supplementary information S1 (table)). To understand whether data generated in this study would be accepted in regulatory decisions for future clinical trials, the LEAP Group obtained scientific qualification advice from the European Medicines Agency (EMA) on the population selection criteria, clinical end points and biomarker methodologies to be used.
The EMA's Committee for Medicinal Products for Human Use (CHMP) offers tailored advice to support the qualification of innovative methods that have been developed for a specific intended use in the context of research into and development of pharmaceuticals. The goal of using qualified methods is to enable a more robust assessment of risks versus benefits in clinical trials. Another advantage of the procedure of qualifying these methods is that, once qualified, these clinical study instruments may be applied by any investigator in subsequent clinical research, thus ensuring greater scientific rigour.