
Showing papers by "University of Grenoble", published in 2012


Journal ArticleDOI
Georges Aad1, T. Abajyan2, Brad Abbott3, Jalal Abdallah4  +2964 moreInstitutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented, which has a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.

9,282 citations


Journal ArticleDOI
Martin Peifer1, Lynnette Fernandez-Cuesta1, Martin L. Sos1, Julie George1, Danila Seidel1, Lawryn H. Kasper, Dennis Plenker1, Frauke Leenders1, Ruping Sun2, Thomas Zander1, Roopika Menon3, Mirjam Koker1, Ilona Dahmen1, Christian Müller1, Vincenzo Di Cerbo2, Hans Ulrich Schildhaus1, Janine Altmüller1, Ingelore Baessmann1, Christian Becker1, Bram De Wilde4, Jo Vandesompele4, Diana Böhm3, Sascha Ansén1, Franziska Gabler1, Ines Wilkening1, Stefanie Heynck1, Johannes M. Heuckmann1, Xin Lu1, Scott L. Carter5, Kristian Cibulskis5, Shantanu Banerji5, Gad Getz5, Kwon-Sik Park6, Daniel Rauh7, Christian Grütter7, Matthias Fischer1, Laura Pasqualucci8, Gavin M. Wright9, Zoe Wainer9, Prudence A. Russell10, Iver Petersen11, Yuan Chen11, Erich Stoelben, Corinna Ludwig, Philipp A. Schnabel, Hans Hoffmann, Thomas Muley, Michael Brockmann, Walburga Engel-Riedel, Lucia Anna Muscarella12, Vito Michele Fazio12, Harry J.M. Groen13, Wim Timens13, Hannie Sietsma13, Erik Thunnissen14, Egber Smit14, Daniëlle A M Heideman14, Peter J.F. Snijders14, Federico Cappuzzo, C. Ligorio15, Stefania Damiani15, John K. Field16, Steinar Solberg17, Odd Terje Brustugun17, Marius Lund-Iversen17, Jörg Sänger, Joachim H. Clement11, Alex Soltermann18, Holger Moch18, Walter Weder18, Benjamin Solomon19, Jean-Charles Soria20, Pierre Validire, Benjamin Besse20, Elisabeth Brambilla21, Christian Brambilla21, Sylvie Lantuejoul21, Philippe Lorimier21, Peter M. Schneider1, Michael Hallek1, William Pao22, Matthew Meyerson5, Matthew Meyerson23, Julien Sage6, Jay Shendure24, Robert Schneider25, Robert Schneider2, Reinhard Büttner1, Jürgen Wolf1, Peter Nürnberg1, Sven Perner3, Lukas C. Heukamp1, Paul K. Brindle, Stefan A. Haas2, Roman K. Thomas1 
TL;DR: This study implicates histone modification as a major feature of SCLC, reveals potentially therapeutically tractable genomic alterations and provides a generalizable framework for the identification of biologically relevant genes in the context of high mutational background.
Abstract: Small-cell lung cancer (SCLC) is an aggressive lung tumor subtype with poor prognosis(1-3). We sequenced 29 SCLC exomes, 2 genomes and 15 transcriptomes and found an extremely high mutation rate of 7.4 +/- 1 protein-changing mutations per million base pairs. Therefore, we conducted integrated analyses of the various data sets to identify pathogenetically relevant mutated genes. In all cases, we found evidence for inactivation of TP53 and RB1 and identified recurrent mutations in the CREBBP, EP300 and MLL genes that encode histone modifiers. Furthermore, we observed mutations in PTEN, SLIT2 and EPHA7, as well as focal amplifications of the FGFR1 tyrosine kinase gene. Finally, we detected many of the alterations found in humans in SCLC tumors from Tp53 and Rb1 double knockout mice(4). Our study implicates histone modification as a major feature of SCLC, reveals potentially therapeutically tractable genomic alterations and provides a generalizable framework for the identification of biologically relevant genes in the context of high mutational background.

1,177 citations


Journal ArticleDOI
01 Nov 2012
TL;DR: In this paper, an attempt has been made to develop a theoretical framework and then to study the framework by means of an empirical study using perceptions and practices of selected French companies, and a summary of findings and conclusions is reported.
Abstract: Sustainable business development has received much attention over the past decade owing to the significant attention given by governments and both profit and not-for-profit organizations to environmental, social and corporate responsibility. The emergence of a changing economic order has also made companies around the world seriously think about manufacturing and service sustainability. Global markets and operations have prompted companies to revisit their corporate, business and functional strategies in addition to focusing on outsourcing, virtual enterprise and supply chain management. Sustainability research on supply management has received limited attention. Nevertheless, considering the physically dispersed enterprise environment, supply management is critical for organizational competitiveness. Realizing the importance of sustainability in supply management, an attempt has been made to develop a theoretical framework and then to study the framework by means of an empirical study using perceptions and practices of selected French companies. Finally, a summary of findings and conclusions is reported.

720 citations


Posted Content
TL;DR: In this paper, an attempt has been made to develop a theoretical framework and then to study the framework by means of an empirical study using perceptions and practices of selected French companies, and a summary of findings and conclusions is reported.
Abstract: Sustainable business development has received much attention over the past decade owing to the significant attention given by governments and both profit and not-for-profit organizations to environmental, social and corporate responsibility. The emergence of a changing economic order has also made companies around the world seriously think about manufacturing and service sustainability. Global markets and operations have prompted companies to revisit their corporate, business and functional strategies in addition to focusing on outsourcing, virtual enterprise and supply chain management. Sustainability research on supply management has received limited attention. Nevertheless, considering the physically dispersed enterprise environment, supply management is critical for organizational competitiveness. Realizing the importance of sustainability in supply management, an attempt has been made to develop a theoretical framework and then to study the framework by means of an empirical study using perceptions and practices of selected French companies. Finally, a summary of findings and conclusions is reported.

629 citations


Journal ArticleDOI
TL;DR: A new classification framework for brain-computer interface (BCI) based on motor imagery using spatial covariance matrices as EEG signal descriptors and relying on Riemannian geometry to directly classify these matrices using the topology of the manifold of symmetric and positive definite matrices.
Abstract: This paper presents a new classification framework for brain-computer interface (BCI) based on motor imagery. This framework involves the concept of Riemannian geometry in the manifold of covariance matrices. The main idea is to use spatial covariance matrices as EEG signal descriptors and to rely on Riemannian geometry to directly classify these matrices using the topology of the manifold of symmetric and positive definite (SPD) matrices. This framework makes it possible to extract the spatial information contained in EEG signals without using spatial filtering. Two methods are proposed and compared with a reference method [multiclass Common Spatial Pattern (CSP) and Linear Discriminant Analysis (LDA)] on the multiclass dataset IIa from the BCI Competition IV. The first method, named minimum distance to Riemannian mean (MDRM), is an implementation of the minimum distance to mean (MDM) classification algorithm using the Riemannian distance and Riemannian mean. This simple method shows results comparable with those of the reference method. The second method, named tangent space LDA (TSLDA), maps the covariance matrices onto the Riemannian tangent space, where matrices can be vectorized and treated as Euclidean objects. Then, a variable selection procedure is applied in order to decrease dimensionality, and a classification by LDA is performed. This latter method outperforms the reference method, increasing the mean classification accuracy from 65.1% to 70.2%.
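The minimum-distance-to-mean idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses the affine-invariant Riemannian distance between SPD matrices and the standard fixed-point iteration for the Riemannian (Karcher) mean; all function names are ours.

```python
import numpy as np
from scipy.linalg import eigvalsh, logm, expm, fractional_matrix_power

def riemannian_distance(A, B):
    # Affine-invariant distance between SPD matrices:
    # d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, computed from the
    # generalized eigenvalues w of the pencil (B, A).
    w = eigvalsh(B, A)
    return np.sqrt(np.sum(np.log(w) ** 2))

def riemannian_mean(mats, n_iter=30):
    # Karcher (Frechet) mean via the usual fixed-point iteration:
    # M <- M^{1/2} exp( mean_i log(M^{-1/2} C_i M^{-1/2}) ) M^{1/2}
    M = np.mean(mats, axis=0)
    for _ in range(n_iter):
        Mh = fractional_matrix_power(M, 0.5)
        Mih = fractional_matrix_power(M, -0.5)
        T = np.mean([np.real(logm(Mih @ C @ Mih)) for C in mats], axis=0)
        M = np.real(Mh @ expm(T) @ Mh)
    return M

def mdm_classify(C, class_means):
    # Assign a trial covariance matrix to the class whose Riemannian
    # mean is closest in the Riemannian metric.
    return min(class_means, key=lambda k: riemannian_distance(C, class_means[k]))
```

In the paper's setting, each class mean would be estimated from the spatial covariance matrices of the training trials of that motor-imagery class; here any dictionary of SPD class means works.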

591 citations


Journal ArticleDOI
TL;DR: This article showed that, despite projected reductions in tropical cyclone frequency, projected increases in demographic pressure and tropical cyclone intensity can be expected to exacerbate disaster risk.
Abstract: Assessments of tropical cyclone risk trends are typically based on reported losses, which are biased by improvements in information access. Now research based on thousands of physically observed events and contextual factors shows that, despite projected reductions in tropical cyclone frequency, projected increases in demographic pressure and tropical cyclone intensity can be expected to exacerbate disaster risk.

526 citations


Journal ArticleDOI
TL;DR: Sustainable business development (SBD) in manufacturing and services (M&S) has become a crucial issue in recent years owing to the impact of global warming, terrorism, earthquakes, hurricanes, and carbon footprint awareness, to cite but a few causes, as mentioned in this paper.

522 citations


Journal ArticleDOI
TL;DR: In this article, it was shown that a holomorphic line bundle on a projective manifold is pseudo-effective if and only if its degree on any member of a covering family of curves is non-negative.
Abstract: We prove that a holomorphic line bundle on a projective manifold is pseudo-effective if and only if its degree on any member of a covering family of curves is non-negative. This is a consequence of a duality statement between the cone of pseudo-effective divisors and the cone of "movable curves", which is obtained from a general theory of movable intersections and approximate Zariski decomposition for closed positive (1, 1)-currents. As a corollary, a projective manifold has a pseudo-effective canonical bundle if and only if it is not uniruled. We also prove that a 4-fold with a canonical bundle which is pseudo-effective and of numerical class zero in restriction to curves of a good covering family, has non-negative Kodaira dimension.

461 citations


Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek4  +3073 moreInstitutions (193)
TL;DR: In this paper, a Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>.

Abstract: Differential measurements of charged particle azimuthal anisotropy are presented for lead-lead collisions at root sNN = 2.76 TeV with the ATLAS detector at the LHC, based on an integrated luminosity of approximately 8 mu b(-1). This anisotropy is characterized via a Fourier expansion of the distribution of charged particles in azimuthal angle relative to the reaction plane, with the coefficients v(n) denoting the magnitude of the anisotropy. Significant v(2)-v(6) values are obtained as a function of transverse momentum (0.5 < p(T) < 20 GeV), pseudorapidity (|eta| < 2.5), and centrality. The v(n) values for n >= 3 are found to vary weakly with both eta and centrality, and their p(T) dependencies are found to follow an approximate scaling relation, v(n)(1/n)(p(T)) proportional to v(2)(1/2)(p(T)), except in the top 5% most central collisions. A Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>. For pairs of charged particles with a large pseudorapidity gap (|Delta eta| = |eta(a) - eta(b)| > 2) and one particle with p(T) < 3 GeV, the v(2,2)-v(6,6) values are found to factorize as v(n,n)(p(T)(a), p(T)(b)) approximate to v(n)(p(T)(a))v(n)(p(T)(b)) in central and midcentral events. Such factorization suggests that these values of v(2,2)-v(6,6) are primarily attributable to the response of the created matter to the fluctuations in the geometry of the initial state. A detailed study shows that the v(1,1)(p(T)(a), p(T)(b)) data are consistent with the combined contributions from a rapidity-even v(1) and global momentum conservation. A two-component fit is used to extract the v(1) contribution. The extracted v(1) is observed to cross zero at p(T) approximate to 1.0 GeV, reaches a maximum at 4-5 GeV with a value comparable to that for v(3), and decreases at higher p(T).
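The factorization property discussed in the abstract can be illustrated with a toy calculation (ours, not the collaboration's analysis code): sample azimuthal angles from a distribution with a known v(2), form the two-particle coefficient v(2,2) = <cos 2(phi_a - phi_b)> with the standard Q-vector trick to exclude self-pairs, and compare sqrt(v(2,2)) with the input v(2).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_phi(n, v2=0.1):
    # Accept-reject sampling from dN/dphi ~ 1 + 2 v2 cos(2 phi),
    # with the reaction plane fixed at phi = 0.
    out = []
    while len(out) < n:
        phi = rng.uniform(0.0, 2 * np.pi, n)
        keep = rng.uniform(0.0, 1 + 2 * v2, n) < 1 + 2 * v2 * np.cos(2 * phi)
        out.extend(phi[keep])
    return np.array(out[:n])

def vnn(phis, n):
    # Two-particle coefficient v(n,n) = <cos n(phi_a - phi_b)> over
    # distinct pairs, via |Q_n|^2 with self-pairs removed:
    # (|Q_n|^2 - M) / (M (M - 1))
    Qc, Qs = np.cos(n * phis).sum(), np.sin(n * phis).sum()
    M = len(phis)
    return (Qc**2 + Qs**2 - M) / (M * (M - 1))

phis = sample_phi(200_000, v2=0.1)
v22 = vnn(phis, 2)
v2_est = np.sqrt(v22)  # factorization: v(2,2) ~ v(2)^2 for independent emission
```

For independently emitted particles correlated only through the common reaction plane, v(2,2) reduces to v(2)^2, which is exactly the factorization the measurement tests against the data.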

435 citations


Journal ArticleDOI
TL;DR: Even if HFOs are promising biomarkers of epileptic tissue, there are still uncertainties about mechanisms of generation, methods of analysis, and clinical applicability, and large multicenter prospective studies are needed prior to widespread clinical application.

382 citations


Journal ArticleDOI
TL;DR: In this paper, a maturity model is built to assess the current PSS design across 20 dimensions, highlighting that only three dimensions are strongly treated: design processes for integrating products and services, definitions of new terminologies and considerations concerning planning and designing life-cycle phases.
Abstract: Product–service systems (PSS), motivated to fulfil customers’ needs, are seen as good strategies to face today's competitive business environment. The field of PSS research is however not fully mature and many different methodologies are proposed for PSS design. This paper seeks to understand the directions taken in eight state-of-the-art methodologies so as to identify common needs in future research. The methodologies are studied across their authors’ views and definitions of services, PSS and their objectives and challenges, along with the tools that have been developed. A maturity model is built to assess the current PSS design across 20 dimensions. The model highlights that only three dimensions are strongly treated: design processes for integrating products and services, definitions of new terminologies and considerations concerning planning and designing life-cycle phases. To enhance the industrial application, collaboration between researchers and practitioners can be spurred through two challenges…

Journal ArticleDOI
TL;DR: The recent development of analytical techniques and methods enables accurate selenium measurements of environmental concentrations, which will lead to a better understanding of biogeochemical processes, which may enable us to predict the distribution of Se health hazards in areas where this is currently unknown.
Abstract: Selenium is a natural trace element that is of fundamental importance to human health. The extreme geographical variation in selenium concentrations in soils and food crops has resulted in significant health problems related to deficient or excess levels of selenium in the environment. To deal with these kinds of problems in the future it is essential to get a better understanding of the processes that control the global distribution of selenium. The recent development of analytical techniques and methods enables accurate selenium measurements of environmental concentrations, which will lead to a better understanding of biogeochemical processes. This improved understanding may enable us to predict the distribution of selenium in areas where this is currently unknown. These predictions are essential to prevent future Se health hazards in a world that is increasingly affected by human activities.

Journal ArticleDOI
02 Mar 2012-Science
TL;DR: The study shows the presence of a rare mitochondrial DNA haplotype of spruce that appears unique to Scandinavia, with its highest frequency in the west (an area believed to have sustained ice-free refugia during most of the last ice age), challenging current views on the survival and spread of trees in response to climate change.
Abstract: It is commonly believed that trees were absent in Scandinavia during the last glaciation and first recolonized the Scandinavian Peninsula with the retreat of its ice sheet some 9000 years ago. Here, we show the presence of a rare mitochondrial DNA haplotype of spruce that appears unique to Scandinavia and with its highest frequency to the west—an area believed to sustain ice-free refugia during most of the last ice age. We further show the survival of DNA from this haplotype in lake sediments and pollen of Trøndelag in central Norway dating back ~10,300 years and chloroplast DNA of pine and spruce in lake sediments adjacent to the ice-free Andøya refugium in northwestern Norway as early as ~22,000 and 17,700 years ago, respectively. Our findings imply that conifer trees survived in ice-free refugia of Scandinavia during the last glaciation, challenging current views on survival and spread of trees as a response to climate changes.

Journal ArticleDOI
TL;DR: Flexible-meccano will be useful for researchers who wish to compare experimental data with those expected from a fully disordered protein, researchers who see experimental evidence of deviation from 'random coil' behaviour in their protein, or researchers who are interested in working with a broad ensemble of conformers representing the flexibility of the IDP of interest.
Abstract: Motivation: Intrinsically disordered proteins (IDPs) represent a significant fraction of the human proteome. The classical structure function paradigm that has successfully underpinned our understanding of molecular biology breaks down when considering proteins that have no stable tertiary structure in their functional form. One convenient approach is to describe the protein in terms of an equilibrium of rapidly inter-converting conformers. Currently, tools to generate such ensemble descriptions are extremely rare, and poorly adapted to the prediction of experimental data. Results: We present flexible-meccano—a highly efficient algorithm that generates ensembles of molecules, on the basis of amino acid-specific conformational potentials and volume exclusion. Conformational sampling depends uniquely on the primary sequence, with the possibility of introducing additional local or long-range conformational propensities at an amino acid-specific resolution. The algorithm can also be used to calculate expected values of experimental parameters measured at atomic or molecular resolution, such as nuclear magnetic resonance (NMR) and small angle scattering, respectively. We envisage that flexible-meccano will be useful for researchers who wish to compare experimental data with those expected from a fully disordered protein, researchers who see experimental evidence of deviation from ‘random coil’ behaviour in their protein, or researchers who are interested in working with a broad ensemble of conformers representing the flexibility of the IDP of interest. Availability: A fully documented multi-platform executable is provided, with examples, at http://www.ibs.fr/science-213/scientific-output/software/flexible-meccano/ Contact: martin.blackledge@ibs.fr

Journal ArticleDOI
TL;DR: Under natural hypotheses on the set of all solutions to the problem obtained when the parameter varies, it is proved that three greedy algorithms converge; the last algorithm, based on the use of an a posteriori estimator, is the approach actually employed in the calculations.
Abstract: The convergence and efficiency of the reduced basis method used for the approximation of the solutions to a class of problems written as a parametrized PDE depends heavily on the choice of the elements that constitute the "reduced basis". The purpose of this paper is to analyze the a priori convergence for one of the approaches used for the selection of these elements, the greedy algorithm. Under natural hypotheses on the set of all solutions to the problem obtained when the parameter varies, we prove that three greedy algorithms converge; the last algorithm, based on the use of an a posteriori estimator, is the approach actually employed in the calculations.
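The greedy selection analyzed in the paper can be sketched in its simplest "strong greedy" form (our illustration with hypothetical names, not the paper's algorithm, and using plain vectors with the Euclidean inner product in place of PDE solutions and an a posteriori estimator): at each step, add the snapshot worst approximated by the current reduced space and orthonormalize it into the basis.

```python
import numpy as np

def strong_greedy(snapshots, tol=1e-10, max_basis=None):
    """Select a reduced basis from a finite snapshot set: at each step,
    add the snapshot with the largest projection error onto the span of
    the basis built so far."""
    S = [np.asarray(s, dtype=float) for s in snapshots]
    basis, max_errors = [], []

    def projection_error(u):
        r = u.copy()
        for b in basis:          # basis is kept orthonormal
            r -= (r @ b) * b
        return np.linalg.norm(r)

    while True:
        errs = [projection_error(u) for u in S]
        i = int(np.argmax(errs))
        max_errors.append(errs[i])
        if errs[i] < tol or (max_basis is not None and len(basis) >= max_basis):
            break
        b = S[i].copy()
        for q in basis:          # Gram-Schmidt against the current basis
            b -= (b @ q) * q
        basis.append(b / np.linalg.norm(b))
    return np.array(basis), max_errors
```

In the reduced basis method proper, the exact projection error is replaced by a cheap a posteriori error estimator maximized over the parameter domain, which is precisely the third algorithm whose convergence the paper establishes.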

Journal ArticleDOI
TL;DR: A review of non-invasive methods for exploring cutaneous microcirculation, discussing the advantages and drawbacks of optical microscopy and laser Doppler techniques, along with emerging tools such as laser speckle contrast imaging.
Abstract: For more than two decades, methods for the non-invasive exploration of cutaneous microcirculation have been mainly based on optical microscopy and laser Doppler techniques. In this review, we discuss the advantages and drawbacks of these techniques. Although optical microscopy-derived techniques, such as nailfold videocapillaroscopy, have found clinical applications, they mainly provide morphological information about the microvessels. Laser Doppler techniques coupled with reactivity tests are widespread in the field of microvascular function research, but many technical issues need to be taken into account when performing these tests. Post-occlusive reactive hyperemia and local thermal hyperemia have been shown to be reliable tests, although their underlying mechanisms are not yet fully understood. Acetylcholine and sodium nitroprusside iontophoresis, despite their wide use as specific tests of endothelium-dependent and -independent function, respectively, show limitations. The influence of the skin site, recording conditions, and the way of expressing data are also reviewed. Finally, we focus on promising tools such as laser speckle contrast imaging.

Journal ArticleDOI
01 Mar 2012
TL;DR: The findings of the empirical study suggest that effective SCRM is based on collaboration (collaborative meetings, timely and relevant information exchanges) and the establishment of joint and common transverse processes with industrial partners.
Abstract: Risk is not a new theme in management, but it is a recent and growing subject in supply chain management. Supply Chain Risk Management (SCRM) plays a major role in successfully managing business processes in a proactive manner. Supply chain risk has multiple sources, including process, control, demand, supply and environment. Supply chain management, faced with these risks, requires specific and adequate responses such as techniques, attitudes and strategies for risk management. This paper is based on an empirical study of 142 general managers and logistics and supply chain managers in 50 different French companies. It demonstrates that for organizations to be effective, SCRM must be a management function that is inter-organizational in nature and closely related to strategic and operational realities of the activity in question. Moreover, the findings of our empirical study suggest that effective SCRM is based on collaboration (collaborative meetings, timely and relevant information exchanges) and the establishment of joint and common transverse processes with industrial partners.

Journal ArticleDOI
T. Aaltonen1, V. M. Abazov2, Brad Abbott3, Bobby Samir Acharya4  +868 moreInstitutions (117)
TL;DR: An excess of events in the data is interpreted as evidence for the presence of a new particle consistent with the standard model Higgs boson, which is produced in association with a weak vector boson and decays to a bottom-antibottom quark pair.
Abstract: We combine searches by the CDF and D0 Collaborations for the associated production of a Higgs boson with a W or Z boson and subsequent decay of the Higgs boson to a bottom-antibottom quark pair. The data, originating from Fermilab Tevatron proton-antiproton collisions at root s = 1.96 TeV, correspond to integrated luminosities of up to 9.7 fb(-1). The searches are conducted for a Higgs boson with mass in the range 100-150 GeV/c(2). We observe an excess of events in the data compared with the background predictions, which is most significant in the mass range between 120 and 135 GeV/c(2). The largest local significance is 3.3 standard deviations, corresponding to a global significance of 3.1 standard deviations. We interpret this as evidence for the presence of a new particle consistent with the standard model Higgs boson, which is produced in association with a weak vector boson and decays to a bottom-antibottom quark pair.

Journal ArticleDOI
TL;DR: The findings suggest that low-level exposure to PCB (or correlated exposures) impairs fetal growth, but that exposure to p,p´-DDE does not.
Abstract: Objectives: Exposure to high concentrations of persistent organochlorines may cause fetal toxicity, but the evidence at low exposure levels is limited. Large studies with substantial exposure contr...

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah, A. A. Abdelalim3  +3002 moreInstitutions (178)
TL;DR: In this article, the authors describe the measurement of elliptic flow of charged particles in lead-lead collisions at root s(NN) = 2.76 TeV using the ATLAS detector at the Large Hadron Collider (LHC).

Journal ArticleDOI
TL;DR: In this article, the authors used high resolution and energy filtered transmission electron microscopy (HRTEM, EFTEM) to study mineral-fluid interfaces using TEM foils cut directly across the reaction boundaries, which allowed measurements to be made directly in cross section at nanometer to sub-nanometer resolution.

Journal ArticleDOI
TL;DR: In this paper, a compilation of the X-ray absorption near-edge structure (XANES) spectra of most naturally occurring manganates, synthetic analogs of known structure and chemical composition, and pure-valence phase species is presented and made available as an open source.
Abstract: The valence states of Mn in mixed-valent layer and tunnel structure manganese dioxides (MnO2), usually referred to as phyllomanganates and tectomanganates, can be measured by X-ray absorption near-edge structure (XANES) spectroscopy with a precision and accuracy that are difficult to estimate owing to the paucity of well-characterized standards. A compilation of the Mn K-edge XANES spectra of most naturally occurring manganates, synthetic analogs of known structure and chemical composition, and pure-valence phase species is presented and made available as an open source. We intend this compilation to serve as a basis for the spectroscopic determination of the fractions of the Mn 2+, 3+, and 4+ valences in mixed-valent manganates and phase mixtures. The XANES derivatives of tectomanganates and phyllomanganates with no or little Mn3+ in the MnO2 layer exhibit intensities, shapes, and relative energy positions of the main features that are characteristic of a particular valence composition. For these compounds, valence fractions can be derived using linear combination fitting analysis. Best quantitative results are obtained when the unknown spectrum is fit to a weighted sum of all reference spectra in the database with the fractions of species constrained to be non-negative (Combo method). The accuracy of the average valence is estimated at 0.04 v.u. in the range of 3+ to 4+, and decreases when the proportion of divalent Mn is higher than 15%. The accuracy of the method is also lower in (layer Mn3+, Mn4+) manganates, because the XANES features are affected non-additively by the amount and distribution of the Jahn-Teller Mn3+ cations. The merit of the Combo method for the determination of manganese valence sums relative to the methods based on calibration curves is discussed.
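The Combo fit described above is, at its core, a non-negative least-squares problem. A minimal sketch (ours; it assumes all spectra are edge-step normalized on a common energy grid, and the function names are illustrative, not from the paper's software):

```python
import numpy as np
from scipy.optimize import nnls

def combo_fit(unknown, references, valences):
    """Fit an unknown Mn K-edge XANES spectrum as a non-negative
    weighted sum of reference spectra, then return the normalized
    species fractions and the resulting average Mn valence."""
    A = np.column_stack(references)       # shape (n_energies, n_references)
    weights, residual = nnls(A, unknown)  # weights constrained to be >= 0
    fractions = weights / weights.sum()
    average_valence = float(fractions @ np.asarray(valences, dtype=float))
    return fractions, average_valence, residual
```

With real data, the quality of the recovered fractions depends on how distinct the reference spectra are, which is why the paper reports lower accuracy when Mn3+ features overlap non-additively.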

Journal ArticleDOI
TL;DR: In this paper, LiDAR, laser profilometer, and white light interferometer were used to measure the 3D topography of the same objects, i.e., five exhumed slip surfaces (Vuache-Sillingy, Bolu, Corona Heights, Dixie Valley, Magnola).
Abstract: We report on the topographic roughness measurements of five exhumed faults and thirteen surface earthquake ruptures over a large range of scales: from 50 μm to 50 km. We used three scanner devices (LiDAR, laser profilometer, white light interferometer), spanning complementary scale ranges from 50 μm to 10 m, to measure the 3-D topography of the same objects, i.e., five exhumed slip surfaces (Vuache-Sillingy, Bolu, Corona Heights, Dixie Valley, Magnola). A consistent geometrical property, i.e., self-affinity, emerges as the morphology of the slip surfaces shows, at first order, a linear behavior on a log-log plot where the axes are fault roughness and spatial length scale, covering five decades of length scales. The observed fault roughness is scale dependent, with an anisotropic self-affine behavior described by four parameters: two power law exponents H, constant among all the faults studied but slightly anisotropic (H∥ = 0.58 ± 0.07 in the slip direction and H⊥ = 0.81 ± 0.04 perpendicular to it), and two pre-factors showing variability over the faults studied. For larger scales between 200 m and 50 km, we have analyzed the 2-D roughness of the surface rupture of thirteen major continental earthquakes. These ruptures show geometrical properties consistent with the slip-perpendicular behavior of the smaller-scale measurements. Our analysis suggests that the inherent non-alignment between the exposed traces and the along- or normal-slip direction results in sampling the slip-perpendicular geometry. Although a data gap exists between the scanned fault scarps and rupture traces, the measurements are consistent within the error bars with a single geometrical description, i.e., consistent dimensionality, over nine decades of length scales.
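Self-affine scaling of the kind reported here means the RMS of height increments grows as a power law of the lag, std[z(x + δ) − z(x)] ∝ δ^H. A toy estimate of the Hurst exponent H from a 1-D profile (our sketch, not the authors' processing chain; the spectral-synthesis generator is a standard way to fabricate a test profile with a prescribed H):

```python
import numpy as np

rng = np.random.default_rng(1)

def hurst_exponent(z, lags=(1, 2, 4, 8, 16, 32)):
    # Structure-function estimate: the slope of log(std of increments)
    # versus log(lag) gives the self-affinity (Hurst) exponent H.
    s = [np.std(z[l:] - z[:-l]) for l in lags]
    H, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return H

def synthetic_profile(n, H):
    # Random-phase spectral synthesis: a 1-D profile with power
    # spectrum P(k) ~ k^(-1-2H) is self-affine with exponent H.
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-(0.5 + H))
    phases = np.exp(1j * rng.uniform(0.0, 2 * np.pi, k.size))
    return np.fft.irfft(amp * phases, n)
```

A Brownian profile (cumulative sum of white noise) should yield H ≈ 0.5, while a profile synthesized with H = 0.8 mimics the slip-perpendicular roughness reported above.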

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah, A. A. Abdelalim3  +3034 moreInstitutions (195)
TL;DR: In this paper, the production cross sections of the inclusive Drell-Yan processes W-+/- -> l nu and Z/gamma* -> ll (l = e, mu) are measured in proton-proton collisions at root s = 7 TeV with the ATLAS detector.
Abstract: The production cross sections of the inclusive Drell-Yan processes W-+/- -> l nu and Z/gamma* -> ll (l = e, mu) are measured in proton-proton collisions at root s = 7 TeV with the ATLAS detector. The cross sections are reported integrated over a fiducial kinematic range, extrapolated to the full range, and also evaluated differentially as a function of the W decay lepton pseudorapidity and the Z boson rapidity, respectively. Based on an integrated luminosity of about 35 pb(-1) collected in 2010, the precision of these measurements reaches a few percent. The integrated and the differential W-+/- and Z/gamma* cross sections in the e and mu channels are combined, and compared with perturbative QCD calculations, based on a number of different parton distribution sets available at next-to-next-to-leading order.

Book ChapterDOI
18 Apr 2012
TL;DR: Recent improvements of the Grid’5000 software and services stack are presented to support large-scale experiments using virtualization technologies as building blocks, and to help with the management of applications dealing with tremendous amounts of data.
Abstract: Almost ten years after its inception, the Grid’5000 testbed has become one of the most complete testbeds for designing or evaluating large-scale distributed systems. Initially dedicated to the study of High Performance Computing, the infrastructure has evolved to address wider concerns related to Desktop Computing, the Internet of Services and, more recently, the Cloud Computing paradigm. This paper presents recent improvements of the Grid’5000 software and services stack to support large-scale experiments using virtualization technologies as building blocks. Such contributions include the deployment of customized software environments, the reservation of dedicated network domains and the possibility to isolate them from the others, and the automation of experiments with a REST API. We illustrate the interest of these contributions by describing three different use-cases of large-scale experiments on the Grid’5000 testbed. The first one leverages virtual machines to conduct larger experiments spread over 4000 peers. The second one describes the deployment of 10000 KVM instances over 4 Grid’5000 sites. Finally, the last use case introduces a one-click deployment tool to easily deploy major IaaS solutions. The conclusion highlights some important challenges of Grid’5000 related to the use of OpenFlow and to the management of applications dealing with tremendous amounts of data.

Journal ArticleDOI
Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3, Andrew Marshall Adare4  +999 moreInstitutions (83)
TL;DR: The ALICE experiment has measured the inclusive J/ψ production in Pb-Pb collisions at √s_NN = 2.76 TeV down to zero transverse momentum in the rapidity range 2.5 < y < 4.
Abstract: The ALICE experiment has measured the inclusive J/ψ production in Pb-Pb collisions at √s_NN = 2.76 TeV down to zero transverse momentum in the rapidity range 2.5 < y < 4. A suppression of the inclusive J/ψ yield in Pb-Pb is observed with respect to the one measured in pp collisions scaled by the number of binary nucleon-nucleon collisions. The nuclear modification factor, integrated over the 0%-80% most central collisions, is 0.545 ± 0.032 (stat) ± 0.083 (syst) and does not exhibit a significant dependence on the collision centrality. These features appear significantly different from measurements at lower collision energies. Models including J/ψ production from charm quarks in a deconfined partonic phase can describe our data.
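When a single total uncertainty is wanted for a result like the nuclear modification factor above, the statistical and systematic components are conventionally added in quadrature, assuming they are uncorrelated. A minimal sketch using the quoted values:

```python
from math import hypot

def combine_in_quadrature(stat: float, syst: float) -> float:
    """Total uncertainty for uncorrelated statistical and
    systematic components: sqrt(stat**2 + syst**2)."""
    return hypot(stat, syst)

r_aa = 0.545
total_unc = combine_in_quadrature(0.032, 0.083)  # ~0.089
print(f"R_AA = {r_aa:.3f} +/- {total_unc:.3f}")
```

The quadrature sum is dominated here by the systematic component, which is why the two contributions are quoted separately in the abstract.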

Journal ArticleDOI
TL;DR: The gold-standard treatment for OSA, nasal continuous positive airway pressure (CPAP), might improve cardiac symptoms and hemodynamic parameters in patients with the disease, however, large clinical trials are required to improve the understanding of the cardiac consequences of OSA.
Abstract: Obstructive sleep apnea (OSA) is associated with cardiovascular morbidity and mortality, largely as a result of myocardial anomalies. Numerous mechanisms cause OSA-related myocardial damage. The majority are initiated as a result of OSA-induced, chronic, intermittent hypoxia. The most-important mechanisms that lead to myocardial damage are increased sympathetic activity, endothelial dysfunction, systemic inflammation, oxidative stress, and metabolic anomalies. All these mechanisms promote the development of hypertension, which is common in patients with OSA. Hypertensive cardiomyopathy and coronary heart disease, as well as obesity-related, diabetic, and tachycardia-induced cardiomyopathies, are also associated with OSA. Left ventricular hypertrophy, myocardial fibrosis, atrial dilatation, and left ventricular systolic and diastolic dysfunction in patients with OSA explain the association of the disease with these clinical outcomes. The gold-standard treatment for OSA, nasal continuous positive airway pressure (CPAP), might improve cardiac symptoms and hemodynamic parameters in patients with the disease. However, large clinical trials are required to improve our understanding of the cardiac consequences of OSA, and determine the effect of treatment, particularly CPAP, on myocardial damage in symptomatic patients and primary prevention of cardiovascular disorders.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek  +3097 moreInstitutions (195)
TL;DR: In this article, a search for the standard model Higgs boson is performed in the diphoton decay channel, and the largest excess with respect to the background-only hypothesis is observed at 126.5 GeV, with a local significance of 2.8 standard deviations.
Abstract: A search for the standard model Higgs boson is performed in the diphoton decay channel. The data used correspond to an integrated luminosity of 4.9 fb-1 collected with the ATLAS detector at the Large Hadron Collider in proton-proton collisions at a center-of-mass energy of √s=7 TeV. In the diphoton mass range 110–150 GeV, the largest excess with respect to the background-only hypothesis is observed at 126.5 GeV, with a local significance of 2.8 standard deviations. Taking the look-elsewhere effect into account in the range 110–150 GeV, this significance becomes 1.5 standard deviations. The standard model Higgs boson is excluded at 95% confidence level in the mass ranges of 113–115 GeV and 134.5–136 GeV.
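A "local significance of 2.8 standard deviations" corresponds to the one-sided Gaussian tail probability conventionally used in these searches: the probability that the background alone fluctuates up at least that far. A minimal sketch of the conversion:

```python
from math import erfc, sqrt

def one_sided_p_value(z: float) -> float:
    """One-sided Gaussian tail probability for a significance
    of z standard deviations: p = 1 - Phi(z)."""
    return 0.5 * erfc(z / sqrt(2.0))

p_local = one_sided_p_value(2.8)  # roughly 2.6e-3
```

Under the same convention, the 5-sigma discovery threshold corresponds to a tail probability of about 2.9e-7, which is why a 2.8-sigma excess (further reduced by the look-elsewhere correction) is reported as an excess rather than an observation.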

Journal ArticleDOI
K. Aamodt1, Betty Abelev2, A. Abrahantes Quintana, Dagmar Adamová3  +931 moreInstitutions (76)
TL;DR: In this paper, the shape of the pair correlation distributions is studied in a variety of collision centrality classes between 0% and 50% of the total hadronic cross section, for particles within the central pseudorapidity acceptance in Pb-Pb collisions at √s_NN = 2.76 TeV and for transverse momenta p_T^(a) down to 0.25 GeV/c.

Journal ArticleDOI
TL;DR: The Perception-for-Action-Control Theory (PACT) proposes a theoretical framework connecting perceptual shaping and motor procedural knowledge in multisensory speech processing in the human brain.