
Showing papers by Brunel University London, published in 2014


Journal ArticleDOI
TL;DR: In this paper, the authors discuss several seminal theories of creativity and innovation and then apply a comprehensive levels-of-analysis framework to review extant research into individual, team, organizational, and multilevel innovation.

1,882 citations


Journal ArticleDOI
S. Chatrchyan, Khachatryan, Albert M. Sirunyan, Armen Tumasyan +2384 more · Institutions (207)
26 May 2014
TL;DR: In this paper, a description of the software algorithms developed for the CMS tracker both for reconstructing charged-particle trajectories in proton-proton interactions and for using the resulting tracks to estimate the positions of the LHC luminous region and individual primary-interaction vertices is provided.
Abstract: A description is provided of the software algorithms developed for the CMS tracker both for reconstructing charged-particle trajectories in proton-proton interactions and for using the resulting tracks to estimate the positions of the LHC luminous region and individual primary-interaction vertices. Despite the very hostile environment at the LHC, the performance obtained with these algorithms is found to be excellent. For tt̄ events under typical 2011 pileup conditions, the average track-reconstruction efficiency for promptly-produced charged particles with transverse momenta of p_T > 0.9 GeV is 94% for pseudorapidities of |η| < 0.9 and 85% for 0.9 < |η| < 2.5. The inefficiency is caused mainly by hadrons that undergo nuclear interactions in the tracker material. For isolated muons, the corresponding efficiencies are essentially 100%. For isolated muons of p_T = 100 GeV emitted at |η| < 1.4, the resolutions are approximately 2.8% in p_T, and, respectively, 10 μm and 30 μm in the transverse and longitudinal impact parameters. The position resolution achieved for reconstructed primary vertices that correspond to interesting pp collisions is 10–12 μm in each of the three spatial dimensions. The tracking and vertexing software is fast and flexible, and easily adaptable to other functions, such as fast tracking for the trigger, or dedicated tracking for electrons that takes into account bremsstrahlung.

559 citations


Journal ArticleDOI
TL;DR: The summary available for this paper consists of funding acknowledgements, listing the national agencies, foundations, and European Union programmes (including the European Research Council and EPLANET) that supported the work.
Abstract: Austrian Federal Ministry of Science and Research and the Austrian Science Fund; the Belgian Fonds de la Recherche Scientifique and Fonds voor Wetenschappelijk Onderzoek; the Brazilian Funding Agencies (CNPq, CAPES, FAPERJ, and FAPESP); the Bulgarian Ministry of Education and Science; CERN; the Chinese Academy of Sciences, Ministry of Science and Technology, and National Natural Science Foundation of China; the Colombian Funding Agency (COLCIENCIAS); the Croatian Ministry of Science, Education and Sport, and the Croatian Science Foundation; the Research Promotion Foundation, Cyprus; the Ministry of Education and Research, Recurrent Financing Contract No. SF0690030s09 and European Regional Development Fund, Estonia; the Academy of Finland, Finnish Ministry of Education and Culture, and Helsinki Institute of Physics; the Institut National de Physique Nucleaire et de Physique des Particules/CNRS and Commissariat a l’Energie Atomique et aux Energies Alternatives/CEA, France; the Bundesministerium fur Bildung und Forschung, Deutsche Forschungsgemeinschaft, and Helmholtz-Gemeinschaft Deutscher Forschungszentren, Germany; the General Secretariat for Research and Technology, Greece; the National Scientific Research Foundation and National Innovation Office, Hungary; the Department of Atomic Energy and the Department of Science and Technology, India; the Institute for Studies in Theoretical Physics and Mathematics, Iran; the Science Foundation, Ireland; the Istituto Nazionale di Fisica Nucleare, Italy; the Korean Ministry of Education, Science and Technology and the World Class University program of NRF, Republic of Korea; the Lithuanian Academy of Sciences; the Mexican Funding Agencies (CINVESTAV, CONACYT, SEP, and UASLP-FAI); the Ministry of Business, Innovation and Employment, New Zealand; the Pakistan Atomic Energy Commission; the Ministry of Science and Higher Education and the National Science Centre, Poland; the Fundacao para a Ciencia e a Tecnologia, 
Portugal; JINR, Dubna, the Ministry of Education and Science of the Russian Federation, the Federal Agency of Atomic Energy of the Russian Federation, Russian Academy of Sciences, and the Russian Foundation for Basic Research; the Ministry of Education, Science and Technological Development of Serbia; the Secretaria de Estado de Investigacion, Desarrollo e Innovacion and Programa Consolider-Ingenio 2010, Spain; the Swiss Funding Agencies (ETH Board, ETH Zurich, PSI, SNF, UniZH, Canton Zurich, and SER); the National Science Council, Taipei; the Thailand Center of Excellence in Physics, the Institute for the Promotion of Teaching Science and Technology of Thailand, Special Task Force for Activating Research and the National Science and Technology Development Agency of Thailand; the Scientific and Technical Research Council of Turkey and the Turkish Atomic Energy Authority; the Science and Technology Facilities Council, United Kingdom; the U.S. Department of Energy and the U.S. National Science Foundation.Individuals have received support from the Marie-Curie program and the European Research Council and EPLANET (European Union); the Leventis Foundation; the A. P. Sloan Foundation; the Alexander von Humboldt Foundation; the Belgian Federal Science Policy Office; the Fonds pour la Formation a la Recherche dans l’Industrie et dans l’Agriculture (FRIA-Belgium); the Agentschap voor Innovatie door Wetenschap en Technologie (IWT-Belgium); the Ministry of Education, Youth and Sports (MEYS) of the Czech Republic; the Council of Science and Industrial Research, India; the Compagnia di San Paolo (Torino); the HOMING PLUS programme of Foundation for Polish Science, cofinanced by EU, Regional Development Fund; and the Thalis and Aristeia programmes cofinanced by EU-ESF and the Greek NSRF.

512 citations


Journal ArticleDOI
TL;DR: The aim of this Commission is to provide the strongest evidence base through involvement of experts from a wide cross-section of disciplines, making firm recommendations to reduce the unacceptable premature mortality and disease burden from avoidable causes and to improve the standard of care for patients with liver disease in hospital.

491 citations


Journal ArticleDOI
TL;DR: In this paper, observation of the diphoton decay mode of the recently discovered Higgs boson and measurement of some of its properties are reported, using the entire dataset collected by the CMS experiment in proton-proton collisions during the 2011 and 2012 LHC running periods.
Abstract: Observation of the diphoton decay mode of the recently discovered Higgs boson and measurement of some of its properties are reported. The analysis uses the entire dataset collected by the CMS experiment in proton-proton collisions during the 2011 and 2012 LHC running periods. The data samples correspond to integrated luminosities of 5.1 inverse femtobarns at sqrt(s) = 7 TeV and 19.7 inverse femtobarns at 8 TeV. A clear signal is observed in the diphoton channel at a mass close to 125 GeV with a local significance of 5.7 sigma, where a significance of 5.2 sigma is expected for the standard model Higgs boson. The mass is measured to be 124.70 +/- 0.34 GeV = 124.70 +/- 0.31 (stat) +/- 0.15 (syst) GeV, and the best-fit signal strength relative to the standard model prediction is 1.14 +0.26/-0.23 = 1.14 +/- 0.21 (stat) +0.09/-0.05 (syst) +0.13/-0.09 (theo). Additional measurements include the signal strength modifiers associated with different production mechanisms, and hypothesis tests between spin-0 and spin-2 models.
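The quoted "local significance of 5.7 sigma" translates into a one-sided Gaussian tail probability via the standard relation p = erfc(Z/√2)/2. A minimal sketch of that conversion (the function name is illustrative, not from the paper):

```python
import math

def sigma_to_pvalue(z):
    # One-sided Gaussian tail probability for a Z-sigma excess:
    # p = Phi(-Z) = erfc(Z / sqrt(2)) / 2.
    return 0.5 * math.erfc(z / math.sqrt(2))

# A 5.7 sigma local significance corresponds to p of roughly 6e-9,
# i.e. about a six-in-a-billion chance of a background-only fluctuation
# at least this large at this mass point.
```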

486 citations


Journal ArticleDOI
TL;DR: The application of SDE in three popular Pareto-based algorithms demonstrates its usefulness in handling many-objective problems, and an extensive comparison with five state-of-the-art EMO algorithms reveals its competitiveness in balancing convergence and diversity of solutions.
Abstract: It is commonly accepted that Pareto-based evolutionary multiobjective optimization (EMO) algorithms encounter difficulties in dealing with many-objective problems. In these algorithms, the ineffectiveness of the Pareto dominance relation for a high-dimensional space leads diversity maintenance mechanisms to play the leading role during the evolutionary process, while the preference of diversity maintenance mechanisms for individuals in sparse regions results in the final solutions distributed widely over the objective space but distant from the desired Pareto front. Intuitively, there are two ways to address this problem: 1) modifying the Pareto dominance relation and 2) modifying the diversity maintenance mechanism in the algorithm. In this paper, we focus on the latter and propose a shift-based density estimation (SDE) strategy. The aim of our study is to develop a general modification of density estimation in order to make Pareto-based algorithms suitable for many-objective optimization. In contrast to traditional density estimation that only involves the distribution of individuals in the population, SDE covers both the distribution and convergence information of individuals. The application of SDE in three popular Pareto-based algorithms demonstrates its usefulness in handling many-objective problems. Moreover, an extensive comparison with five state-of-the-art EMO algorithms reveals its competitiveness in balancing convergence and diversity of solutions. These findings not only show that SDE is a good alternative to tackle many-objective problems, but also present a general extension of Pareto-based algorithms in many-objective optimization.

466 citations


Journal ArticleDOI
J. P. Lees, V. Poireau, V. Tisserand, E. Grauges +308 more · Institutions (73)
TL;DR: In this article, the authors presented a search for a dark photon in the reaction e^{+}e^{-}→γA^{'}, A^{'}→e^{+}e^{-}, μ^{+}μ^{-}.
Abstract: Dark sectors charged under a new Abelian interaction have recently received much attention in the context of dark matter models. These models introduce a light new mediator, the so-called dark photon (A^{'}), connecting the dark sector to the standard model. We present a search for a dark photon in the reaction e^{+}e^{-}→γA^{'}, A^{'}→e^{+}e^{-}, μ^{+}μ^{-} using 514 fb^{-1} of data collected with the BABAR detector. We observe no statistically significant deviations from the standard model predictions, and we set 90% confidence level upper limits on the mixing strength between the photon and dark photon at the level of 10^{-4}-10^{-3} for dark photon masses in the range 0.02-10.2 GeV. We further constrain the range of the parameter space favored by interpretations of the discrepancy between the calculated and measured anomalous magnetic moment of the muon.

436 citations


Journal ArticleDOI
Adrian John Bevan, B. Golob, Th. Mannel, S. Prell +2061 more · Institutions (171)
TL;DR: The physics of the SLAC and KEK B Factories is described in this paper, with a brief description of the detectors, BaBar and Belle, and data-taking related issues.
Abstract: This work is on the Physics of the B Factories. Part A of this book contains a brief description of the SLAC and KEK B Factories as well as their detectors, BaBar and Belle, and data taking related issues. Part B discusses tools and methods used by the experiments in order to obtain results. The results themselves can be found in Part C.

413 citations


Proceedings Article
S. Chatrchyan, Khachatryan, Albert M. Sirunyan, Armen Tumasyan +2179 more · Institutions (201)
30 Jul 2014

409 citations


Journal ArticleDOI
TL;DR: In this paper, the Hermiticity condition in quantum mechanics required for the characterization of physical observables and generators of unitary motions can be relaxed into a wider class of operators whose eigenvalues are real and whose eigenstates are complete.
Abstract: The Hermiticity condition in quantum mechanics required for the characterization of (a) physical observables and (b) generators of unitary motions can be relaxed into a wider class of operators whose eigenvalues are real and whose eigenstates are complete. In this case, the orthogonality of eigenstates is replaced by the notion of biorthogonality that defines the relation between the Hilbert space of states and its dual space. The resulting quantum theory, which might appropriately be called 'biorthogonal quantum mechanics', is developed here in some detail in the case for which the Hilbert-space dimensionality is finite. Specifically, characterizations of probability assignment rules, observable properties, pure and mixed states, spin particles, measurements, combined systems and entanglements, perturbations, and dynamical aspects of the theory are developed. The paper concludes with a brief discussion on infinite-dimensional systems.
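The structure described above can be written compactly. The following is a sketch in standard notation (not taken verbatim from the paper): eigenstates of H and of its adjoint form a biorthogonal pair, orthogonality is replaced by a pairing between the two families, and expectation values use the associated dual state.

```latex
% Defining relations of a biorthogonal system (finite dimension, E_n real):
H\lvert\psi_n\rangle = E_n\lvert\psi_n\rangle, \qquad
H^{\dagger}\lvert\phi_n\rangle = E_n\lvert\phi_n\rangle,
\qquad
\langle\phi_m\vert\psi_n\rangle = \delta_{mn},
\qquad
\sum_n \lvert\psi_n\rangle\langle\phi_n\rvert = \mathbb{1}.
% For a state |psi> = sum_n c_n |psi_n>, the associated dual state is
% <psi-tilde| = sum_n conj(c_n) <phi_n|, and expectation values become
\langle A\rangle
  = \frac{\langle\tilde{\psi}\vert A\vert\psi\rangle}
         {\langle\tilde{\psi}\vert\psi\rangle}.
```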

398 citations


Posted Content
TL;DR: The authors revisited the relationship between financial development and economic growth in a panel of 52 middle-income countries over the 1980-2008 period, using a pooled mean group estimator in a dynamic heterogeneous panel setting.
Abstract: We revisit the relationship between financial development and economic growth in a panel of 52 middle income countries over the 1980-2008 period, using pooled mean group estimator in a dynamic heterogeneous panel setting. We show that financial development does not have a linear positive long-run impact on economic growth in this sample. When we consider a non-linear relationship between financial development and growth, we find an inverted U-shaped relationship between finance and growth in the long run. In the short-run, the relationship is insignificant. This finding suggests that middle income countries face a threshold point after which financial development no longer contributes to economic growth.
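The inverted U-shape amounts to fitting a quadratic in financial development and reading off its turning point, -b1/(2*b2), as the threshold. A minimal sketch of that arithmetic (illustrative only: the paper uses a pooled mean group estimator on heterogeneous panel data, not this simple single-sample OLS; all names are ours):

```python
def ols_quadratic(fd, growth):
    """OLS fit of growth = b0 + b1*fd + b2*fd**2 via normal equations."""
    # Power sums give X'X and X'y for the design matrix X = [1, fd, fd^2].
    s = [sum(x ** k for x in fd) for k in range(5)]
    XtX = [[s[i + j] for j in range(3)] for i in range(3)]
    Xty = [sum(y * x ** i for x, y in zip(fd, growth)) for i in range(3)]
    # Gauss-Jordan elimination with partial pivoting on the 3x3 system.
    A = [row[:] + [v] for row, v in zip(XtX, Xty)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(3):
            if r != i:
                f = A[r][i] / A[i][i]
                A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    return [A[i][3] / A[i][i] for i in range(3)]

def turning_point(b):
    """Level of financial development where the fitted curve peaks."""
    return -b[1] / (2 * b[2])
```

With b2 < 0 the fit is an inverted U, and the turning point is the threshold beyond which further financial deepening is associated with lower growth in this stylised setup.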

Journal ArticleDOI
Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam +2121 more · Institutions (139)
TL;DR: In this paper, searches for the direct electroweak production of supersymmetric charginos, neutralinos, and sleptons in a variety of signatures with leptons and W, Z, and Higgs bosons are presented.
Abstract: Searches for the direct electroweak production of supersymmetric charginos, neutralinos, and sleptons in a variety of signatures with leptons and W, Z, and Higgs bosons are presented. Results are based on a sample of proton-proton collision data collected at center-of-mass energy sqrt(s) = 8 TeV with the CMS detector in 2012, corresponding to an integrated luminosity of 19.5 inverse femtobarns. The observed event rates are in agreement with expectations from the standard model. These results probe charginos and neutralinos with masses up to 720 GeV, and sleptons up to 260 GeV, depending on the model details.

Journal ArticleDOI
S. Chatrchyan, Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan +2280 more · Institutions (177)
TL;DR: In this paper, a search for a standard model Higgs boson decaying into a pair of tau leptons is performed using events recorded by the CMS experiment at the LHC in 2011 and 2012.
Abstract: A search for a standard model Higgs boson decaying into a pair of tau leptons is performed using events recorded by the CMS experiment at the LHC in 2011 and 2012. The dataset corresponds to an integrated luminosity of 4.9 inverse femtobarns at a centre-of-mass energy of 7 TeV and 19.7 inverse femtobarns at 8 TeV. Each tau lepton decays hadronically or leptonically to an electron or a muon, leading to six different final states for the tau-lepton pair, all considered in this analysis. An excess of events is observed over the expected background contributions, with a local significance larger than 3 standard deviations for m[H] values between 115 and 130 GeV. The best fit of the observed H to tau tau signal cross section for m[H] = 125 GeV is 0.78 +- 0.27 times the standard model expectation. These observations constitute evidence for the 125 GeV Higgs boson decaying to a pair of tau leptons.

Journal ArticleDOI
TL;DR: EQ-5D was valid and responsive for skin conditions and most cancers; in vision, its performance varied according to aetiology; and its performance was poor for hearing impairments, for which the HUI3 performed well.
Abstract: Background: The National Institute for Health and Care Excellence recommends the use of generic preference-based measures (GPBMs) of health for its Health Technology Assessments (HTAs). However, these data may not be available or appropriate for all health conditions. Objectives: To determine whether GPBMs are appropriate for some key conditions and to explore alternative methods of utility estimation when data from GPBMs are unavailable or inappropriate. Design: The project was conducted in three stages: (1) A systematic review of the psychometric properties of three commonly used GPBMs [EQ-5D, SF-6D and Health Utilities Index Mark 3 (HUI3)] in four broadly defined conditions: visual impairment, hearing impairment, cancer and skin conditions. (2) Potential modelling approaches to ‘map’ EQ-5D values from condition-specific and clinical measures of health [European Organisation for Research and Treatment of Cancer Quality-of-life Questionnaire Core 30 (EORTC QLQ-C30) and Functional Assessment of Cancer Therapy – General Scale (FACT-G)] are compared for predictive ability and goodness of fit using two separate data sets. (3) Three potential extensions to the EQ-5D are developed as ‘bolt-on’ items relating to hearing, tiredness and vision. They are valued using the time trade-off method. A second valuation study is conducted to fully value the EQ-5D with and without the vision bolt-on item in an additional sample of 300 people. Setting: The valuation surveys were conducted using face-to-face interviews in the respondents’ homes. Participants: Two representative samples of the UK general population from Yorkshire (n = 600). Interventions: None. Main outcome measures: Comparisons of EQ-5D, SF-6D and HUI3 in four conditions with various generic and condition-specific measures. Mapping functions were estimated between EORTC QLQ-C30 and FACT-G with EQ-5D. Three bolt-ons to the EQ-5D were developed: EQ + hearing/vision/tiredness. 
A full valuation study was conducted for the EQ + vision. Results: (1) EQ-5D was valid and responsive for skin conditions and most cancers; in vision, its performance varied according to aetiology; and performance was poor for hearing impairments. The HUI3 performed well for hearing and vision disorders. It also performed well in cancers although evidence was limited and there was no evidence in skin conditions. There were limited data for SF-6D in all four conditions and limited evidence on reliability of all instruments. (2) Mapping algorithms were estimated to predict EQ-5D values from alternative cancer-specific measures of health. Response mapping using all the domain scores was the best performing model for the EORTC QLQ-C30. In an exploratory analysis, a limited dependent variable mixture model performed better than an equivalent linear model. In the full analysis for the FACT-G, linear regression using ordinary least squares gave the best predictions followed by the tobit model. (3) The exploratory valuation study found that bolt-on items for vision, hearing and tiredness had a significant impact on values of the health states, but the direction and magnitude of differences depended on the severity of the health state. The vision bolt-on item had a statistically significant impact on EQ-5D health state values and a full valuation model was estimated. Conclusions: EQ-5D performs well in studies of cancer and skin conditions. Mapping techniques provide a solution to predict EQ-5D values where EQ-5D has not been administered. For conditions where EQ-5D was found to be inappropriate, including some vision disorders and for hearing, bolt-ons provide a promising solution. More primary research into the psychometric properties of the generic preference-based measures is required, particularly in cancer and for the assessment of reliability. Further research is needed for the development and valuation of bolt-ons to EQ-5D. 
Funding: This project was funded by the UK Medical Research Council (MRC) as part of the MRC-NIHR methodology research programme (reference G0901486) and will be published in full in Health Technology Assessment; Vol. 18, No. 9. See the NIHR Journals Library website for further project information.

Journal ArticleDOI
TL;DR: A holistic, global view of environmental exposure to pharmaceuticals encompassing terrestrial, freshwater and marine ecosystems in high- and low-income countries is taken, and studies on uptake, trophic transfer and indirect effects of pharmaceuticals acting via food webs are presented.
Abstract: Global pharmaceutical consumption is rising with the growing and ageing human population and more intensive food production. Recent studies have revealed pharmaceutical residues in a wide range of ecosystems and organisms. Environmental concentrations are often low, but pharmaceuticals typically are designed to have biological effects at low doses, acting on physiological systems that can be evolutionarily conserved across taxa. This Theme Issue introduces the latest research investigating the risks of environmentally relevant concentrations of pharmaceuticals to vertebrate wildlife. We take a holistic, global view of environmental exposure to pharmaceuticals encompassing terrestrial, freshwater and marine ecosystems in high- and low-income countries. Based on both field and laboratory data, the evidence for and relevance of changes to physiology and behaviour, in addition to mortality and reproductive effects, are examined in terms of the population- and community-level consequences of pharmaceutical exposure on wildlife. Studies on uptake, trophic transfer and indirect effects of pharmaceuticals acting via food webs are presented. Given the logistical and ethical complexities of research in this area, several papers focus on techniques for prioritizing which compounds are most likely to harm wildlife and how modelling approaches can make predictions about the bioavailability, metabolism and toxicity of pharmaceuticals in non-target species. This Theme Issue aims to help clarify the uncertainties, highlight opportunities and inform ongoing scientific and policy debates on the impacts of pharmaceuticals in the environment.

Journal ArticleDOI
TL;DR: A meta-analysis of all relevant, high quality primary studies of defect prediction to determine what factors influence predictive performance finds that the choice of classifier has little impact upon performance and the major explanatory factor is the researcher group.
Abstract: Background. The ability to predict defect-prone software components would be valuable. Consequently, there have been many empirical studies to evaluate the performance of different techniques endeavouring to accomplish this effectively. However, no one technique dominates and so designing a reliable defect prediction model remains problematic. Objective. We seek to make sense of the many conflicting experimental results and understand which factors have the largest effect on predictive performance. Method. We conduct a meta-analysis of all relevant, high quality primary studies of defect prediction to determine what factors influence predictive performance. This is based on 42 primary studies that satisfy our inclusion criteria, collectively reporting 600 sets of empirical prediction results. By reverse engineering a common response variable we build a random effects ANOVA model to examine the relative contribution of four model building factors (classifier, data set, input metrics and researcher group) to model prediction performance. Results. Surprisingly, we find that the choice of classifier has little impact upon performance (1.3 percent) and in contrast the major (31 percent) explanatory factor is the researcher group. It matters more who does the work than what is done. Conclusion. To overcome this high level of researcher bias, defect prediction researchers should (i) conduct blind analysis, (ii) improve reporting protocols and (iii) conduct more intergroup studies in order to alleviate expertise issues. Lastly, research is required to determine whether this bias is prevalent in other application domains.
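The headline figures (1.3 percent for classifier, 31 percent for researcher group) are variance shares attributed to each factor. As a hedged illustration of the kind of quantity involved, here is a simple one-way eta-squared: the paper fits a multi-factor random-effects ANOVA, which this single-factor decomposition only approximates, and the names below are ours:

```python
def eta_squared(groups):
    """Share of total variance explained by one grouping factor.

    groups: dict mapping a factor level (e.g. a researcher group) to a
    list of observed prediction-performance scores for that level.
    Returns SS_between / SS_total, in [0, 1].
    """
    all_vals = [v for vals in groups.values() for v in vals]
    grand = sum(all_vals) / len(all_vals)
    ss_total = sum((v - grand) ** 2 for v in all_vals)
    ss_between = sum(
        len(vals) * ((sum(vals) / len(vals)) - grand) ** 2
        for vals in groups.values()
    )
    return ss_between / ss_total
```

A value near 1 means the factor's levels differ far more than results vary within a level, the pattern the study reports for researcher group.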

Journal ArticleDOI
TL;DR: In this paper, a relatively complete definition of the paradigm of human centred design is formulated, reflections upon the meaning of the word ‘design’ are made, and a basic structural model of the design questions addressed is presented.
Abstract: Reflections upon the meaning of the word ‘design’ are made and a relatively complete definition of the paradigm of human centred design is formulated. Aspects of both the background and the current practice of the paradigm are presented, as is a basic structural model of the design questions addressed. Examples are provided of the economic benefit of human centred design in business settings as an approach for designing products, systems and services which are physically, perceptually, cognitively and emotionally intuitive. Examples are further provided of the coherence of the paradigm with the logic and structure of several currently popular marketing and branding frameworks. Finally, some strategic implications of adopting human centred design as a business strategy are suggested.

Journal ArticleDOI
TL;DR: Single-dose studies of repetitive transcranial magnetic stimulation found a short-term effect on pain from active high-frequency stimulation of the motor cortex; this equates to a 12% reduction in pain, which does not exceed the pre-established criteria for a minimal clinically important difference.
Abstract: Copyright © 2011 The Cochrane Collaboration. This review is published as a Cochrane Review in the Cochrane Database of Systematic Reviews 2010, Issue 9. Cochrane Reviews are regularly updated as new evidence emerges and in response to comments and criticisms, and the Cochrane Database of Systematic Reviews should be consulted for the most recent version of the Review.

Journal ArticleDOI
TL;DR: The results of a series of qualitative interviews with scientists who participated in the ‘OPAL’ portfolio of citizen science projects that has been running in England since 2007 are presented: What were their experiences of participating in citizen science?
Abstract: Citizen science as a way of communicating science and doing public engagement has over the past decade become the focus of considerable hopes and expectations. It can be seen as a win–win situation, where scientists get help from the public and the participants get a public engagement experience that involves them in real and meaningful scientific research. In this paper we present the results of a series of qualitative interviews with scientists who participated in the ‘OPAL’ portfolio of citizen science projects that has been running in England since 2007: What were their experiences of participating in citizen science? We highlight two particular sets of issues that our participants have voiced, methodological/epistemological and ethical issues. While we share the general enthusiasm over citizen science, we hope that the research in this paper opens up more debate over the potential pitfalls of citizen science as seen by the scientists themselves.

Journal ArticleDOI
TL;DR: In this paper, a search for new physics is performed in multijet events with large missing transverse momentum produced in proton-proton collisions at 8 TeV, using a data sample corresponding to an integrated luminosity of 19.5 inverse femtobarns collected with the CMS detector at the LHC.
Abstract: A search for new physics is performed in multijet events with large missing transverse momentum produced in proton-proton collisions at sqrt(s)=8 TeV using a data sample corresponding to an integrated luminosity of 19.5 inverse femtobarns collected with the CMS detector at the LHC. The data sample is divided into three jet multiplicity categories (3-5, 6-7, and 8 or more jets), and studied further in bins of two variables: the scalar sum of jet transverse momenta and the missing transverse momentum. The observed numbers of events in various categories are consistent with backgrounds expected from standard model processes. Exclusion limits are presented for several simplified supersymmetric models of squark or gluino pair production.

Journal ArticleDOI
TL;DR: In this paper, a search for the standard model Higgs boson decaying to bb¯ when produced in association with a weak vector boson (V) is reported for the following channels: W(μν)H, W(eν)H, W(τν)H, Z(μμ)H, Z(ee)H, and Z(νν)H; the search is performed in data samples corresponding to integrated luminosities of up to 5.1 inverse femtobarns at √s = 7 TeV and up to 18.9 inverse femtobarns at √s = 8 TeV.
Abstract: A search for the standard model Higgs boson (H) decaying to bb¯ when produced in association with a weak vector boson (V) is reported for the following channels: W(μν)H, W(eν)H, W(τν)H, Z(μμ)H, Z(ee)H, and Z(νν)H. The search is performed in data samples corresponding to integrated luminosities of up to 5.1 inverse femtobarns at √s = 7 TeV and up to 18.9 fb−1 at √s = 8 TeV, recorded by the CMS experiment at the LHC. An excess of events is observed above the expected background with a local significance of 2.1 standard deviations for a Higgs boson mass of 125 GeV, consistent with the expectation from the production of the standard model Higgs boson. The signal strength corresponding to this excess, relative to that of the standard model Higgs boson, is 1.0±0.5.

Journal ArticleDOI
17 Jan 2014
TL;DR: In this article, a search for the standard model Higgs boson decaying to a W-boson pair at the LHC is reported, and an excess of events above background is observed.
Abstract: A search for the standard model Higgs boson decaying to a W-boson pair at the LHC is reported. The event sample corresponds to an integrated luminosity of 4.9 fb−1 and 19.4 fb−1 collected with the CMS detector in pp collisions at √s = 7 and 8 TeV, respectively. The Higgs boson candidates are selected in events with two or three charged leptons. An excess of events above background is observed, consistent with the expectation from the standard model Higgs boson with a mass of around 125 GeV. The probability to observe an excess equal or larger than the one seen, under the background-only hypothesis, corresponds to a significance of 4.3 standard deviations for m_H = 125.6 GeV. The observed signal cross section times the branching fraction to WW for m_H = 125.6 GeV is 0.72 +0.20/−0.18 times the standard model expectation. The spin-parity J^P = 0^+ hypothesis is favored against a narrow resonance with J^P = 2^+ or J^P = 0^− that decays to a W-boson pair. This result provides strong evidence for a Higgs-like boson decaying to a W-boson pair.

Journal ArticleDOI
TL;DR: The social component of the biopsychosocial model is important to patients but not well represented in current core-sets of outcome measures, and researchers should consider social factors to help develop a portfolio of more relevant outcome measures.
Abstract: Copyright @ 2014 Froud et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.

Journal ArticleDOI
TL;DR: This paper advocates the use of a simple and effective stable matching (STM) model to coordinate the selection process in MOEA/D and demonstrates that user-preference information can be readily used in the proposed algorithm to find a region that decision makers are interested in.
Abstract: Multiobjective evolutionary algorithm based on decomposition (MOEA/D) decomposes a multiobjective optimization problem into a set of scalar optimization subproblems and optimizes them in a collaborative manner. Subproblems and solutions are two sets of agents that naturally exist in MOEA/D. The selection of promising solutions for subproblems can be regarded as a matching between subproblems and solutions. Stable matching, proposed in economics, can effectively resolve conflicts of interest among selfish agents in a market. In this paper, we advocate the use of a simple and effective stable matching (STM) model to coordinate the selection process in MOEA/D. In this model, subproblem agents can express their preferences over the solution agents, and vice versa. The stable outcome produced by the STM model matches each subproblem with a single solution, and it trades off convergence and diversity in the evolutionary search. Comprehensive experiments have shown the effectiveness and competitiveness of our MOEA/D algorithm with the STM model. We have also demonstrated that user-preference information can be readily used in the proposed algorithm to find a region that decision makers are interested in.
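The stable matching idea the abstract describes is classically computed by the Gale–Shapley deferred-acceptance procedure. The sketch below illustrates that mechanism with toy preference lists; in the paper the preferences are derived from convergence and diversity measures, which are not reproduced here:

```python
def stable_match(sub_prefs, sol_prefs):
    """Gale-Shapley deferred acceptance: subproblems propose to solutions.
    sub_prefs[s] = solutions ordered by subproblem s's preference;
    sol_prefs[x] = subproblems ordered by solution x's preference.
    Returns a dict mapping each subproblem to its matched solution."""
    rank = {x: {s: r for r, s in enumerate(p)} for x, p in sol_prefs.items()}
    next_choice = {s: 0 for s in sub_prefs}  # next solution s will propose to
    engaged = {}                             # solution -> current subproblem
    free = list(sub_prefs)
    while free:
        s = free.pop()
        x = sub_prefs[s][next_choice[s]]
        next_choice[s] += 1
        if x not in engaged:
            engaged[x] = s
        elif rank[x][s] < rank[x][engaged[x]]:  # x prefers the new proposer
            free.append(engaged[x])
            engaged[x] = s
        else:
            free.append(s)                      # rejected; propose again later
    return {s: x for x, s in engaged.items()}

# Toy instance: three subproblems, three candidate solutions.
sub_prefs = {"p0": ["a", "b", "c"], "p1": ["a", "c", "b"], "p2": ["b", "a", "c"]}
sol_prefs = {"a": ["p1", "p0", "p2"], "b": ["p0", "p2", "p1"], "c": ["p2", "p1", "p0"]}
match = stable_match(sub_prefs, sol_prefs)
```

The resulting matching is stable in the paper's sense: no subproblem–solution pair prefer each other over their assigned partners.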


Journal ArticleDOI
TL;DR: Examining population genetic structures and effective population sizes of wild roach living in English rivers contaminated with estrogenic effluents demonstrates that roach populations living in some effluent-contaminated river stretches, where feminization is widespread, are self-sustaining.
Abstract: Background: Treated effluents from wastewater treatment works can comprise a large proportion of the flow of rivers in the developed world. Exposure to these effluents, or the steroidal estrogens they contain, feminizes wild male fish and can reduce their reproductive fitness. Long-term experimental exposures have resulted in skewed sex ratios, reproductive failures in breeding colonies, and population collapse. This suggests that environmental estrogens could threaten the sustainability of wild fish populations. Results: Here we tested this hypothesis by examining population genetic structures and effective population sizes (Ne) of wild roach (Rutilus rutilus L.) living in English rivers contaminated with estrogenic effluents. Ne was estimated from DNA microsatellite genotypes using approximate Bayesian computation and sibling assignment methods. We found no significant negative correlation between Ne and the predicted estrogen exposure at 28 sample sites. Furthermore, examination of the population genetic structure of roach in the region showed that some populations have been confined to stretches of river with a high proportion of estrogenic effluent for multiple generations and have survived, apparently without reliance on immigration of fish from less polluted sites. Conclusions: These results demonstrate that roach populations living in some effluent-contaminated river stretches, where feminization is widespread, are self-sustaining. Although we found no evidence to suggest that exposure to estrogenic effluents is a significant driving factor in determining the size of roach breeding populations, a reduction in Ne of up to 65% is still possible for the most contaminated sites because of the wide confidence intervals associated with the statistical model.
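The headline result above is the absence of a significant negative correlation between Ne and predicted estrogen exposure across the 28 sites. A minimal sketch of the underlying statistic, Pearson's r computed in pure Python, is shown below on entirely hypothetical numbers (the paper's actual data and its ABC-based Ne estimates are not reproduced):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-site values: predicted estrogen exposure vs. estimated Ne.
exposure = [0.1, 0.4, 0.2, 0.8, 0.6]
ne_est = [900, 700, 1100, 850, 650]
r = pearson_r(exposure, ne_est)
```

A strongly negative r would have supported the population-level-harm hypothesis; the study found no significant correlation.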

Journal ArticleDOI
TL;DR: The authors showed that deliberate practice is not sufficient to explain individual differences in performance in the two most widely studied domains in expertise research (chess and music) and proposed a theoretical framework that takes into account several potentially relevant explanatory constructs.

Journal ArticleDOI
TL;DR: It can be argued that well-designed end-user security education helps thwart phishing threats, as the interaction effect of conceptual and procedural knowledge positively impacts computer users' self-efficacy, which in turn enhances their phishing-threat avoidance behaviour.

Journal ArticleDOI
S. Chatrchyan1, Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1  +2230 moreInstitutions (144)
TL;DR: The observed (expected) upper limit on the invisible branching fraction, 0.58 (0.44), is interpreted in terms of a Higgs-portal model of dark matter interactions.
Abstract: A search for invisible decays of Higgs bosons is performed using the vector boson fusion and associated ZH production modes. In the ZH mode, the Z boson is required to decay to a pair of charged leptons or a bb̄ quark pair. The searches use the 8 TeV pp collision dataset collected by the CMS detector at the LHC, corresponding to an integrated luminosity of up to 19.7 inverse femtobarns. Certain channels include data from 7 TeV collisions corresponding to an integrated luminosity of 4.9 inverse femtobarns. The searches are sensitive to non-standard-model invisible decays of the recently observed Higgs boson, as well as additional Higgs bosons with similar production modes and large invisible branching fractions. In all channels, the observed data are consistent with the expected standard model backgrounds. Limits are set on the production cross section times invisible branching fraction, as a function of the Higgs boson mass, for the vector boson fusion and ZH production modes. By combining all channels, and assuming standard model Higgs boson cross sections and acceptances, the observed (expected) upper limit on the invisible branching fraction at m_H = 125 GeV is found to be 0.58 (0.44) at 95% confidence level. We interpret this limit in terms of a Higgs-portal model of dark matter interactions.

Journal ArticleDOI
TL;DR: Experimental results on numerous datasets show the effectiveness of the NPSVM in both sparseness and classification accuracy, further confirming the conclusions above.
Abstract: We propose a novel nonparallel classifier, called nonparallel support vector machine (NPSVM), for binary classification. Our NPSVM, which is fully different from existing nonparallel classifiers such as the generalized eigenvalue proximal support vector machine (GEPSVM) and the twin support vector machine (TWSVM), has several incomparable advantages: 1) two primal problems are constructed implementing the structural risk minimization principle; 2) the dual problems of these two primal problems have the same advantages as those of standard SVMs, so that the kernel trick can be applied directly, whereas existing TWSVMs have to construct two further primal problems for nonlinear cases based on approximate kernel-generated surfaces, and their nonlinear problems cannot degenerate to the linear case even when the linear kernel is used; 3) the dual problems have the same elegant formulation as those of standard SVMs and can be solved efficiently by the sequential minimal optimization algorithm, whereas existing GEPSVM and TWSVMs are not suitable for large-scale problems; 4) it has the inherent sparseness of standard SVMs; 5) existing TWSVMs are only special cases of the NPSVM when its parameters are appropriately chosen. Experimental results on numerous datasets show the effectiveness of our method in both sparseness and classification accuracy, further confirming the conclusions above. In some sense, our NPSVM is a new starting point for nonparallel classifiers.
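A central claim of the abstract is that NPSVM's duals admit the kernel trick directly. The sketch below illustrates the kernel trick itself rather than NPSVM's own optimization problems (which require solving two quadratic programs): a kernel perceptron with an RBF kernel learns the XOR pattern, which no linear separator can fit.

```python
import math

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(a, b)))

def train_kernel_perceptron(X, y, epochs=100):
    """Dual-form perceptron: only kernel evaluations, no explicit features."""
    alpha = [0] * len(X)
    for _ in range(epochs):
        errors = 0
        for i, xi in enumerate(X):
            f = sum(alpha[j] * y[j] * rbf(X[j], xi) for j in range(len(X)))
            if y[i] * f <= 0:   # misclassified: strengthen this point's weight
                alpha[i] += 1
                errors += 1
        if errors == 0:         # converged on the training set
            break
    return alpha

# XOR: linearly inseparable in the input space, separable in RBF feature space.
X = [(0, 0), (1, 1), (0, 1), (1, 0)]
y = [1, 1, -1, -1]
alpha = train_kernel_perceptron(X, y)
```

The decision function depends on the data only through kernel values, which is exactly what lets NPSVM (like standard SVMs, and unlike the approximate kernel-generated surfaces of TWSVMs) handle nonlinear cases without constructing separate nonlinear primal problems.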