
Showing papers by "University of Grenoble published in 2011"


Journal ArticleDOI
TL;DR: Focal cortical dysplasias (FCD) are localized regions of malformed cerebral cortex and are very frequently associated with epilepsy in both children and adults.
Abstract: Purpose Focal cortical dysplasias (FCD) are localized regions of malformed cerebral cortex and are very frequently associated with epilepsy in both children and adults. A broad spectrum of histopathology has been included in the diagnosis of FCD. An ILAE task force proposes an international consensus classification system to better characterize specific clinicopathological FCD entities. Methods Thirty-two Task Force members have reevaluated available data on electroclinical presentation, imaging, neuropathological examination of surgical specimens, as well as postsurgical outcome. Key findings The ILAE Task Force proposes a three-tiered classification system. FCD Type I refers to isolated lesions, which present either as radial (FCD Type Ia) or tangential (FCD Type Ib) dyslamination of the neocortex, microscopically identified in one or multiple lobes. FCD Type II is an isolated lesion characterized by cortical dyslamination and dysmorphic neurons without (Type IIa) or with balloon cells (Type IIb). The major change from the previous classification is the introduction of FCD Type III, which occurs in combination with hippocampal sclerosis (FCD Type IIIa), or with epilepsy-associated tumors (FCD Type IIIb). FCD Type IIIc is found adjacent to vascular malformations, whereas FCD Type IIId can be diagnosed in association with epileptogenic lesions acquired in early life (i.e., traumatic injury, ischemic injury or encephalitis). Significance This three-tiered classification system will be an important basis to evaluate imaging, electroclinical features, and postsurgical seizure control as well as to explore underlying molecular pathomechanisms in FCD.

1,395 citations


Book ChapterDOI
14 Jul 2011
TL;DR: A scalable reachability algorithm for hybrid systems with piecewise affine, non-deterministic dynamics that combines polyhedra and support function representations of continuous sets to compute an over-approximation of the reachable states is presented.
Abstract: We present a scalable reachability algorithm for hybrid systems with piecewise affine, non-deterministic dynamics. It combines polyhedra and support function representations of continuous sets to compute an over-approximation of the reachable states. The algorithm improves over previous work by using variable time steps to guarantee a given local error bound. In addition, we propose an improved approximation model, which drastically improves the accuracy of the algorithm. The algorithm is implemented as part of SpaceEx, a new verification platform for hybrid systems, available at spaceex.imag.fr. Experimental results of full fixed-point computations with hybrid systems with more than 100 variables illustrate the scalability of the approach.
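To make the support-function idea concrete, here is a minimal Python sketch. It is not the SpaceEx algorithm itself (no inputs, no variable time steps with error control, no clustering of convex sets); it only shows how support functions propagate under a linear map, which is the core trick the abstract relies on. The toy system, the box of initial states, and all names are illustrative assumptions.

```python
# Minimal sketch (not the SpaceEx algorithm): over-approximating the reachable
# states of a purely continuous linear system x' = A x, x(0) in X0, by sampling
# support functions along a fixed set of template directions.
import numpy as np
from scipy.linalg import expm

def support_box(center, radius, d):
    """Support function of an axis-aligned box {c + r*u : |u|_inf <= 1}."""
    return center @ d + radius @ np.abs(d)

def reach_support(A, center, radius, directions, dt, steps):
    """For each step k, return h_{X_k}(d) for every template direction d, where
    X_k = e^{A k dt} X0.  A linearly mapped set satisfies h_{M X0}(d) = h_{X0}(M^T d)."""
    bounds = []
    M = expm(A * dt)
    Mk = np.eye(A.shape[0])
    for _ in range(steps):
        bounds.append([support_box(center, radius, Mk.T @ d) for d in directions])
        Mk = M @ Mk          # advance one time step
    return np.array(bounds)  # shape (steps, n_directions)

# Toy example: a lightly damped oscillator with a small box of initial states.
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
X0_center, X0_radius = np.array([1.0, 0.0]), np.array([0.1, 0.1])
dirs = [np.array([1.0, 0.0]), np.array([-1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.0, -1.0])]
print(reach_support(A, X0_center, X0_radius, dirs, dt=0.05, steps=5))
```

Each row of the output gives outer bounds on the reachable set in the chosen directions, i.e. a template polyhedron over-approximating the states at that step.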

901 citations


Journal ArticleDOI
29 Jul 2011 - Science
TL;DR: An analysis of protein-protein interactions in Arabidopsis identifies the plant interactome and demonstrates plant immune system functions for 15 of 17 tested host proteins that interact with effectors from both pathogens.
Abstract: Plants generate effective responses to infection by recognizing both conserved and variable pathogen-encoded molecules. Pathogens deploy virulence effector proteins into host cells, where they interact physically with host proteins to modulate defense. We generated an interaction network of plant-pathogen effectors from two pathogens spanning the eukaryote-eubacteria divergence, three classes of Arabidopsis immune system proteins, and ~8000 other Arabidopsis proteins. We noted convergence of effectors onto highly interconnected host proteins and indirect, rather than direct, connections between effectors and plant immune receptors. We demonstrated plant immune system functions for 15 of 17 tested host proteins that interact with effectors from both pathogens. Thus, pathogens from different kingdoms deploy independently evolved virulence proteins that interact with a limited set of highly connected cellular hubs to facilitate their diverse life-cycle strategies.

739 citations


Journal ArticleDOI
TL;DR: In this paper, the centrality dependence of the charged-particle multiplicity density at midrapidity in Pb-Pb collisions at root s(NN) = 2.76 TeV is presented.
Abstract: The centrality dependence of the charged-particle multiplicity density at midrapidity in Pb-Pb collisions at root s(NN) = 2.76 TeV is presented. The charged-particle density normalized per participating nucleon pair increases by about a factor of 2 from peripheral (70%-80%) to central (0%-5%) collisions. The centrality dependence is found to be similar to that observed at lower collision energies. The data are compared with models based on different mechanisms for particle production in nuclear collisions.

553 citations


Journal ArticleDOI
01 Sep 2011
TL;DR: Non-small cell lung carcinomas (NSCLC), in patients with advanced stage disease, are to be classified into more specific types, such as adenocarcinoma or squamous cell carcinoma, whenever possible.
Abstract: Introduction: The American Thoracic Society is a cosponsor of a newly published lung adenocarcinoma classification.Methods: An international multidisciplinary panel of experts was formed. A systematic review was performed and recommendations were graded by strength and quality of the evidence using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) approach.Results: The classification addresses both resection specimens and small biopsies/cytology. The terms bronchioloalveolar carcinoma and mixed subtype adenocarcinoma are no longer used. For resection specimens, new concepts are introduced such as adenocarcinoma in situ and minimally invasive adenocarcinoma for small solitary adenocarcinomas with pure lepidic growth and predominant lepidic growth with ≤ 5 mm invasion, respectively. Invasive adenocarcinomas are classified by predominant pattern after using comprehensive histologic subtyping with lepidic, acinar, papillary, and solid patterns; micropapillary is added. In the new ...

536 citations


Journal ArticleDOI
TL;DR: In this article, the authors measured the transverse momentum spectra of primary charged particles in Pb-Pb collisions at root s(NN) = 2.76 TeV with the ALICE detector at the LHC.

519 citations


Journal ArticleDOI
K. Aamodt, Betty Abelev, A. Abrahantes Quintana, Dagmar Adamová, +972 more (84 institutions)
11 Jul 2011
TL;DR: The first measurement of the triangular v3, quadrangular v4, and pentagonal v5 charged particle flow in Pb-Pb collisions is reported, and a double peaked structure in the two-particle azimuthal correlations is observed, which can be naturally explained from the measured anisotropic flow Fourier coefficients.
Abstract: We report on the first measurement of the triangular v3, quadrangular v4, and pentagonal v5 charged particle flow in Pb-Pb collisions at root s(NN) = 2.76 TeV measured with the ALICE detector at the CERN Large Hadron Collider. We show that the triangular flow can be described in terms of the initial spatial anisotropy and its fluctuations, which provides strong constraints on its origin. In the most central events, where the elliptic flow v2 and triangular flow v3 have similar magnitude, a double peaked structure in the two-particle azimuthal correlations is observed, which is often interpreted as a Mach cone response to fast partons. We show that this structure can be naturally explained from the measured anisotropic flow Fourier coefficients.
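For readers unfamiliar with flow harmonics, the toy sketch below (synthetic events, not ALICE analysis code) illustrates how coefficients such as v2 and v3 are commonly estimated from azimuthal angles with the two-particle Q-cumulant; these are the kind of Fourier coefficients from which the two-particle correlation structure discussed above can be reconstructed. All parameter values are assumptions.

```python
# Illustrative sketch only: estimating flow harmonics v_n from azimuthal angles
# with the two-particle Q-cumulant, v_n{2}^2 = << cos n(phi_i - phi_j) >>.
import numpy as np

rng = np.random.default_rng(0)

def sample_event(mult, v2=0.08, v3=0.03):
    """Draw angles from dN/dphi ~ 1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi) by
    rejection sampling (reaction-plane angle fixed to zero for simplicity)."""
    phis = []
    while len(phis) < mult:
        phi = rng.uniform(0, 2 * np.pi)
        if rng.uniform(0, 1 + 2 * v2 + 2 * v3) < 1 + 2 * v2 * np.cos(2 * phi) + 2 * v3 * np.cos(3 * phi):
            phis.append(phi)
    return np.array(phis)

def vn_two_particle(events, n):
    """Average the single-event cumulant <2>_n = (|Q_n|^2 - M) / (M (M - 1))."""
    c2 = []
    for phis in events:
        M = len(phis)
        Qn = np.sum(np.exp(1j * n * phis))
        c2.append((np.abs(Qn) ** 2 - M) / (M * (M - 1)))
    return np.sqrt(max(np.mean(c2), 0.0))

events = [sample_event(500) for _ in range(200)]
for n in (2, 3, 4):
    print(f"v{n}{{2}} estimate: {vn_two_particle(events, n):.3f}")
```

Running this recovers roughly the injected v2 and v3 and a v4 consistent with zero, up to statistical fluctuations.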

515 citations


Journal ArticleDOI
TL;DR: This paper summarizes the new proposals from ECIL 3, based on the results of studies published after the ECIL 2 meeting; among other changes, the prophylactic recommendations for hematopoietic stem cell transplant recipients were reformulated.
Abstract: In 2005, several groups, including the European Group for Blood and Marrow Transplantation, the European Organisation for Research and Treatment of Cancer, the European LeukemiaNet and the Immunocompromised Host Society created the European Conference on Infections in Leukemia (ECIL). The main goal of ECIL is to elaborate guidelines, or recommendations, for the management of infections in leukemia and stem cell transplant patients. The first sets of ECIL slides about the management of invasive fungal disease were made available on the web in 2006 and the papers were published in 2007. The third meeting of the group (ECIL 3) was held in September 2009 and the group updated its previous recommendations. The goal of this paper is to summarize the new proposals from ECIL 3, based on the results of studies published after the ECIL 2 meeting: (1) the prophylactic recommendations for hematopoietic stem cell transplant recipients were formulated differently, by splitting the neutropenic and the GVHD phases and taking into account recent data on voriconazole; (2) micafungin was introduced as an alternative drug for empirical antifungal therapy; (3) although several studies were published on preemptive antifungal approaches in neutropenic patients, the group decided not to propose any recommendation, as the only randomized study comparing an empirical versus a preemptive approach showed a significant excess of fungal disease in the preemptive group.

467 citations


Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, A. A. Abdelalim, +3034 more (179 institutions)
TL;DR: In this article, a search for squarks and gluinos in final states containing jets, missing transverse momentum and no electrons or muons is presented, and the data were recorded by the ATLAS experiment in sqrt(s) = 7 TeV proton-proton collisions at the Large Hadron Collider.

452 citations


Journal ArticleDOI
TL;DR: It is shown that up to 86% of terrestrial and 83% of freshwater ecoregions will be exposed to average monthly temperature patterns >2 SDs (2σ) of the 1961–1990 baseline, including 82% of critically endangered ecoregions, and many Global 200 ecoregions may be under substantial climatic stress by 2100.
Abstract: The current rate of warming due to increases in greenhouse gas (GHG) emissions is very likely unprecedented over the last 10,000 y. Although the majority of countries have adopted the view that global warming must be limited to less than 2 °C, using 600 realizations from climate model ensembles we show that up to 86% of terrestrial and 83% of freshwater ecoregions will be exposed to average monthly temperature patterns >2 SDs (2σ) of the 1961-1990 baseline, including 82% of critically endangered ecoregions. The entire range of 89 ecoregions will experience extreme monthly temperatures with a local warming of <2 °C. Tropical and subtropical ecoregions, and mangroves, face extreme conditions earliest, some with <1 °C warming. In contrast, few ecoregions within Boreal Forests and Tundra biomes will experience such extremes this century. On average, precipitation regimes do not exceed 2σ of the baseline period, although considerable variability exists across the climate realizations. Further, the strength of the correlation between seasonal temperature and precipitation changes over numerous ecoregions. These results suggest many Global 200 ecoregions may be under substantial climatic stress by 2100.
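The exceedance metric used above can be illustrated with a short sketch on synthetic data. This is not the paper's ensemble, baseline data, or ecoregion masks; it only shows the mechanics of flagging months that lie more than 2 SDs above a 1961-1990 month-by-month climatology.

```python
# Sketch of the ">2 SD of the 1961-1990 baseline" metric on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n_base, n_fut = 30, 30   # 1961-1990 and an assumed future 30-year window

# Synthetic monthly temperatures, shape (n_years, 12): a seasonal cycle plus
# noise, with a uniform +3 degC shift applied to the "future" period.
seasonal = 10 + 8 * np.sin(2 * np.pi * (np.arange(12) - 3) / 12)
base = seasonal + rng.normal(0, 1.0, size=(n_base, 12))
future = seasonal + 3.0 + rng.normal(0, 1.0, size=(n_fut, 12))

clim_mean = base.mean(axis=0)        # baseline mean per calendar month
clim_sd = base.std(axis=0, ddof=1)   # baseline SD per calendar month

exceed = future > clim_mean + 2 * clim_sd
print(f"fraction of future months beyond 2 sigma: {exceed.mean():.2f}")
```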

334 citations


Journal ArticleDOI
TL;DR: In a superconductor, the electrons are paired up and scanning tunnelling microscopy shows that the pairs localize together rather than breaking up and forming localized single electrons in the insulating state as discussed by the authors.
Abstract: Disorder leads to localization of electrons at low temperatures, changing metals to insulators. In a superconductor the electrons are paired up, and scanning tunnelling microscopy shows that the pairs localize together rather than breaking up and forming localized single electrons in the insulating state.

Journal ArticleDOI
TL;DR: An improvement of an algorithm of Aurenhammer, Hoffmann and Aronov to find a least-squares matching between a probability density and a finite set of sites with mass constraints, in the Euclidean plane, is proposed.
Abstract: In this paper, we propose an improvement of an algorithm of Aurenhammer, Hoffmann and Aronov to find a least-squares matching between a probability density and a finite set of sites with mass constraints, in the Euclidean plane. Our algorithm exploits the multiscale nature of this optimal transport problem. We iteratively simplify the target using Lloyd's algorithm, and use the solution of the simplified problem as a rough initial solution to the more complex one. This approach allows for fast estimation of distances between measures related to optimal transport (known as Earth-mover or Wasserstein distances). We also discuss the implementation of these algorithms, and compare the original one to its multiscale counterpart.
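The coarsening step mentioned above can be illustrated with a plain Lloyd (k-means style) iteration. The sketch below is a toy version, not the authors' implementation: it only shows how a discrete target measure might be simplified into a small set of weighted sites whose transport solution could then seed the finer problem. Function names and parameters are assumptions.

```python
# Minimal sketch of coarsening a weighted point set with Lloyd's algorithm.
import numpy as np

def lloyd_simplify(sites, masses, k, iters=50, seed=0):
    """Cluster `sites` (n, 2) with masses (n,) into k weighted representatives."""
    rng = np.random.default_rng(seed)
    centers = sites[rng.choice(len(sites), size=k, replace=False)]
    for _ in range(iters):
        # assign each site to its nearest center
        d2 = ((sites[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # move each center to the mass-weighted barycenter of its cluster
        for j in range(k):
            sel = labels == j
            if sel.any():
                w = masses[sel]
                centers[j] = (sites[sel] * w[:, None]).sum(0) / w.sum()
    cluster_mass = np.bincount(labels, weights=masses, minlength=k)
    return centers, cluster_mass

# Toy usage: 2000 random sites with uniform mass, simplified to 20 sites.
pts = np.random.default_rng(2).random((2000, 2))
m = np.full(len(pts), 1.0 / len(pts))
coarse_sites, coarse_mass = lloyd_simplify(pts, m, k=20)
print(coarse_sites.shape, coarse_mass.sum())  # (20, 2), total mass preserved
```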

Journal ArticleDOI
V. M. Abazov, Brad Abbott, Bobby Samir Acharya, Mark Raymond Adams, +432 more (83 institutions)
TL;DR: In this paper, the forward-backward asymmetry in top quark-antiquark production in proton-antiproton collisions in the final state containing a lepton and at least four jets was measured.
Abstract: We present a measurement of forward-backward asymmetry in top quark-antiquark production in proton-antiproton collisions in the final state containing a lepton and at least four jets. Using a dataset corresponding to an integrated luminosity of 5.4 fb^-1, collected by the D0 experiment at the Fermilab Tevatron Collider, we measure the ttbar forward-backward asymmetry to be (9.2 ± 3.7)% at the reconstruction level. When corrected for detector acceptance and resolution, the asymmetry is found to be (19.6 ± 6.5)%. We also measure a corrected asymmetry based on the lepton from a top quark decay, found to be (15.2 ± 4.0)%. The results are compared to predictions based on the next-to-leading-order QCD generator MC@NLO. The sensitivity of the measured and predicted asymmetries to the modeling of gluon radiation is discussed.
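As a reminder of the quantity being measured, the toy sketch below (synthetic numbers, not D0 analysis code) computes a reconstruction-level forward-backward asymmetry from the rapidity difference dy = y_t - y_tbar; the assumed distribution is purely illustrative.

```python
# Toy illustration of A_fb = (N(dy > 0) - N(dy < 0)) / (N(dy > 0) + N(dy < 0)).
import numpy as np

rng = np.random.default_rng(3)
dy = rng.normal(loc=0.05, scale=0.8, size=100_000)  # assumed toy dy distribution

n_fwd = np.count_nonzero(dy > 0)
n_bwd = np.count_nonzero(dy < 0)
a_fb = (n_fwd - n_bwd) / (n_fwd + n_bwd)
print(f"A_fb = {a_fb:.3f}")
```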

Journal ArticleDOI
TL;DR: The performance of newly synthesized chitosan-coated silver nanotriangles (Chit-AgNTs) with strong resonances in the near-infrared (NIR), operating as photothermal agents against a line of human non-small cell lung cancer cells, is reported.

Journal ArticleDOI
G. Aad, Brad Abbott, +5600 more (187 institutions)
TL;DR: In this article, measurements of luminosity obtained using the ATLAS detector during early running of the Large Hadron Collider (LHC) at root s = 7 TeV are presented, independently determined using several detectors and multiple algorithms, each having different acceptances, systematic uncertainties and sensitivity to background.
Abstract: Measurements of luminosity obtained using the ATLAS detector during early running of the Large Hadron Collider (LHC) at root s = 7 TeV are presented. The luminosity is independently determined using several detectors and multiple algorithms, each having different acceptances, systematic uncertainties and sensitivity to background. The ratios of the luminosities obtained from these methods are monitored as a function of time and of mu, the average number of inelastic interactions per bunch crossing. Residual time- and mu-dependence between the methods is less than 2% for 0 < mu < 2.5. Absolute luminosity calibrations, performed using beam separation scans, have a common systematic uncertainty of +/- 11%, dominated by the measurement of the LHC beam currents. After calibration, the luminosities obtained from the different methods differ by at most +/- 2%. The visible cross sections measured using the beam scans are compared to predictions obtained with the PYTHIA and PHOJET event generators and the ATLAS detector simulation.

Journal ArticleDOI
TL;DR: Replacing compact subsets by measures, a notion of distance function to a probability distribution in ℝ^d is introduced and it is shown that it is possible to reconstruct offsets of sampled shapes with topological guarantees even in the presence of outliers.
Abstract: Data often comes in the form of a point cloud sampled from an unknown compact subset of Euclidean space. The general goal of geometric inference is then to recover geometric and topological features (e.g., Betti numbers, normals) of this subset from the approximating point cloud data. It appears that the study of distance functions allows one to address many of these questions successfully. However, one of the main limitations of this framework is that it does not cope well with outliers or with background noise. In this paper, we show how to extend the framework of distance functions to overcome this problem. Replacing compact subsets by measures, we introduce a notion of distance function to a probability distribution in ℝ^d. These functions share many properties with classical distance functions, which make them suitable for inference purposes. In particular, by considering appropriate level sets of these distance functions, we show that it is possible to reconstruct offsets of sampled shapes with topological guarantees even in the presence of outliers. Moreover, in settings where empirical measures are considered, these functions can be easily evaluated, making them of particular practical interest.
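The easy evaluation mentioned in the last sentence has a simple form for empirical measures: with mass parameter m = k/n, the distance to the measure at a point x is the root mean squared distance from x to its k nearest sample points. The sketch below evaluates it with a k-d tree; the parameter choices and the toy point cloud are assumptions, not the authors' code.

```python
# Sketch of the empirical distance-to-measure via k nearest neighbors.
import numpy as np
from scipy.spatial import cKDTree

def distance_to_measure(samples, queries, m):
    """samples: (n, d) point cloud; queries: (q, d); m in (0, 1] mass parameter."""
    n = len(samples)
    k = max(1, int(np.ceil(m * n)))
    tree = cKDTree(samples)
    dists, _ = tree.query(queries, k=k)            # distances to k nearest samples
    dists = np.asarray(dists).reshape(len(queries), k)
    return np.sqrt((dists ** 2).mean(axis=1))      # root mean squared k-NN distance

# Toy usage: a noisy circle plus two far outliers.  The distance-to-measure is
# large at the outliers but barely perturbed by them elsewhere.
rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, 500)
cloud = np.c_[np.cos(theta), np.sin(theta)] + 0.02 * rng.normal(size=(500, 2))
cloud = np.vstack([cloud, [[5.0, 5.0], [-4.0, 6.0]]])   # outliers
print(distance_to_measure(cloud, np.array([[1.0, 0.0], [5.0, 5.0]]), m=0.05))
```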

Journal ArticleDOI
TL;DR: The first measurement of two-pion Bose-Einstein correlations in central Pb-Pb collisions at root s(NN) = 2.76 TeV at the Large Hadron Collider is presented in this article.

Journal ArticleDOI
TL;DR: In this article, the relative impact of a long-term brand management instrument (brand personality) and a short-term marketing mix instrument (sales promotions) on brand equity formation is assessed.

Journal ArticleDOI
Peter A. R. Ade, Nabila Aghanim, R. Ansari, M. Arnaud, +188 more (43 institutions)
TL;DR: The Planck High Frequency Instrument (HFI) is designed to measure the temperature and polarization anisotropies of the Cosmic Microwave Background and galactic foregrounds in six wide bands at an angular resolution of 10' (100 GHz), 7' (143 GHz), and 5' (217 GHz) as mentioned in this paper.
Abstract: The Planck High Frequency Instrument (HFI) is designed to measure the temperature and polarization anisotropies of the Cosmic Microwave Background and galactic foregrounds in six wide bands centered at 100, 143, 217, 353, 545 and 857 GHz at an angular resolution of 10' (100 GHz), 7' (143 GHz), and 5' (217 GHz and higher). HFI has been operating flawlessly since launch on 14 May 2009. The bolometers cooled to 100 mK as planned. The settings of the readout electronics, such as the bolometer bias current, that optimize HFI's noise performance on orbit are nearly the same as the ones chosen during ground testing. Observations of Mars, Jupiter, and Saturn verified both the optical system and the time response of the detection chains. The optical beams are close to predictions from physical optics modeling. The time response of the detection chains is close to pre-launch measurements. The detectors suffer from an unexpectedly high flux of cosmic rays related to low solar activity. Due to the redundancy of Planck's observation strategy, the removal of a few percent of data contaminated by glitches does not significantly affect the sensitivity. The cosmic rays significantly heat up the bolometer plate, and the modulation of this heat load on timescales of days to months creates a common drift of all bolometer signals, which does not affect the scientific capabilities. Only high-energy cosmic-ray showers induce inhomogeneous heating, which is a probable source of low-frequency noise.

Journal ArticleDOI
TL;DR: A high-fat diet results in both a decrease in the mitochondrial quinone pool and a profound modification in mitochondrial lipid composition, which appears to play a key role in the resulting inhibition of fatty acid oxidation and of mitochondrial oxidative phosphorylation, associated with increased mitochondrial ROS production.

Journal ArticleDOI
TL;DR: In this paper, the mechanical behavior of the 6061-T6 aluminium alloy at room temperature for various prior thermal histories representative of electron beam welding was described, and a fast-heating device was designed to control and apply thermal loadings on tensile specimens.
Abstract: This paper describes the mechanical behavior of the 6061-T6 aluminium alloy at room temperature for various prior thermal histories representative of electron beam welding. A fast-heating device was designed to control and apply thermal loadings on tensile specimens. Tensile tests show that the yield stress at ambient temperature decreases as the maximum temperature reached increases or as the heating rate decreases. This variation of the mechanical properties is the result of microstructural changes, which have been observed by transmission electron microscopy (TEM).

Journal ArticleDOI
TL;DR: It is shown how even small amounts of uncertainty can create consumer confusion that reduces or eliminates the value to firms of adopting voluntary labels.
Abstract: Labels certify that a product meets some standard for quality, but often consumers are unsure of the exact standard that the label represents. Focusing on the case of ecolabels for environmental quality, we show how even small amounts of uncertainty can create consumer confusion that reduces or eliminates the value to firms of adopting voluntary labels. First, consumers are most suspicious of a label when a product with a bad reputation has it, so labels are often unpersuasive at showing that a seemingly bad product is actually good. Second, label proliferation aggravates the effect of uncertainty, causing the informativeness of labels to decrease rather than increase. Third, uncertainty makes labeling and nonlabeling equilibria more likely to coexist as the number of labels increases, so consumers face greater strategic uncertainty over how to interpret the presence or absence of a label. Finally, a label can be legitimized or spoiled for other products when a product with a good or bad reputation displays it, so firms may adopt labels strategically to manipulate such information spillovers, which further exacerbates label confusion. Managers can reduce label confusion by supporting mandatory labeling or by undertaking investments to make certain labels “focal.” This paper was accepted by Pradeep Chintagunta and Preyas Desai, special issue editors.

Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, A. A. Abdelalim, +3163 more (177 institutions)
TL;DR: In this article, the anti-kt algorithm is used to identify jets, with two jet resolution parameters, R = 0.4 and 0.6, and the dominant uncertainty comes from the jet energy scale, which is determined to within 7% for central jets above 60 GeV transverse momentum.
Abstract: Jet cross sections have been measured for the first time in proton-proton collisions at a centre-of-mass energy of 7 TeV using the ATLAS detector. The measurement uses an integrated luminosity of 17 nb^-1 recorded at the Large Hadron Collider. The anti-kt algorithm is used to identify jets, with two jet resolution parameters, R = 0.4 and 0.6. The dominant uncertainty comes from the jet energy scale, which is determined to within 7% for central jets above 60 GeV transverse momentum. Inclusive single-jet differential cross sections are presented as functions of jet transverse momentum and rapidity. Dijet cross sections are presented as functions of dijet mass and the angular variable χ. The results are compared to expectations based on next-to-leading-order QCD, which agree with the data, providing a validation of the theory in a new kinematic regime.
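For illustration only, the toy Python sketch below implements the anti-kt distance measure with E-scheme (four-momentum sum) recombination on a handful of synthetic particles. It is not the FastJet implementation used in the measurement; the event, the units, and all parameter choices are assumptions.

```python
# Toy anti-kt clustering: d_ij = min(1/pt_i^2, 1/pt_j^2) * dR_ij^2 / R^2,
# d_iB = 1/pt_i^2, merge or promote whichever distance is smallest.
import numpy as np

def _kin(p):
    """Return (pt, rapidity, phi) of a four-momentum p = (E, px, py, pz)."""
    E, px, py, pz = p
    return np.hypot(px, py), 0.5 * np.log((E + pz) / (E - pz)), np.arctan2(py, px)

def _dR2(a, b):
    pa, ya, fa = _kin(a)
    pb, yb, fb = _kin(b)
    dphi = (fa - fb + np.pi) % (2 * np.pi) - np.pi
    return (ya - yb) ** 2 + dphi ** 2

def antikt(particles, R=0.4):
    """particles: list of four-momenta (E, px, py, pz).  Returns jet momenta."""
    objs = [np.asarray(p, float) for p in particles]
    jets = []
    while objs:
        n = len(objs)
        diB = [1.0 / _kin(p)[0] ** 2 for p in objs]
        best, pair = min((d, (i, None)) for i, d in enumerate(diB))
        for i in range(n):
            for j in range(i + 1, n):
                dij = min(diB[i], diB[j]) * _dR2(objs[i], objs[j]) / R ** 2
                if dij < best:
                    best, pair = dij, (i, j)
        i, j = pair
        if j is None:                      # closest to the beam: declare a jet
            jets.append(objs.pop(i))
        else:                              # merge the pair (E-scheme)
            merged = objs[i] + objs[j]
            objs = [p for k, p in enumerate(objs) if k not in (i, j)] + [merged]
    return jets

# Toy event: two collimated bunches of massless particles, back to back in phi.
rng = np.random.default_rng(5)
def massless(pt, y, phi):
    return np.array([pt * np.cosh(y), pt * np.cos(phi), pt * np.sin(phi), pt * np.sinh(y)])
event = [massless(rng.uniform(5, 20), 0.1 * rng.normal(), c + 0.1 * rng.normal())
         for c in (0.0, np.pi) for _ in range(10)]
for jet in antikt(event, R=0.4):
    print("jet pt = %.1f (toy units)" % _kin(jet)[0])
```

On this synthetic event the clustering collects essentially all particles into two hard jets, which is the hallmark of anti-kt growing jets around the hardest particles first.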

Journal ArticleDOI
Georges Aad, Brad Abbott, J. Abdallah, A. A. Abdelalim, +3139 more (192 institutions)
TL;DR: In this paper, a measurement of the cross section for the inclusive production of isolated prompt photons in pp collisions at a center-of-mass energy root s = 7 TeV is presented.
Abstract: A measurement of the cross section for the inclusive production of isolated prompt photons in pp collisions at a center-of-mass energy root s = 7 TeV is presented. The measurement covers the pseudorapidity ranges |eta(gamma)| < 1.37 and 1.52 <= |eta(gamma)| < 1.81 in the transverse energy range 15 <= E_T(gamma) < 100 GeV. The results are based on an integrated luminosity of 880 nb^-1, collected with the ATLAS detector at the Large Hadron Collider. Photon candidates are identified by combining information from the calorimeters and from the inner tracker. Residual background in the selected sample is estimated from data based on the observed distribution of the transverse isolation energy in a narrow cone around the photon candidate. The results are compared to predictions from next-to-leading-order perturbative QCD calculations.

Journal ArticleDOI
TL;DR: In this paper, the first results obtained by detecting the J/psi through its dilepton decays into e+e- and mu+mu- pairs in the rapidity ranges |y| < 0.9 and 2.5 < y < 4, respectively, and with acceptance down to zero pT.

Journal ArticleDOI
TL;DR: In this article, a precession electron diffraction (PED) was used to obtain LiFePO4 and FePO 4 phase mapping at the nanometer-scale level on a large number of particles of sizes between 50 and 300 nm in a partially charged cathode.
Abstract: A recent transmission electron microscopy (TEM) method using precession electron diffraction (PED) was used to obtain LiFePO4 and FePO 4 phase mapping at the nanometer-scale level on a large number of particles of sizes between 50 and 300 nm in a partially charged cathode. Despite the similarity of the two phases (the difference of lattice parameters is <5%), the method gives clear results that have been confirmed using high-resolution transmission electron microscopy (HRTEM) and energy-filtered transmission electron microscopy/electron energy loss spectroscopy (EFTEM/EELS) experiments. The PED maps show that the particles are either fully lithiated or fully delithiated and, therefore, bring a strong support to the domino-cascade model at the nanoscale level (scale of a particle). A core-shell model or spinodal decomposition at mesoscale (scale of agglomerates of particles) is possible. Size effects on the transformation are also discussed. © 2011 American Chemical Society.

Journal ArticleDOI
TL;DR: In this paper, U-tube measurements of instantaneous velocities, concentrations, and fluxes for a well-sorted, medium-sized sand in oscillatory sheet flow are analyzed.
Abstract: U-tube measurements of instantaneous velocities, concentrations, and fluxes for a well-sorted, medium-sized sand in oscillatory sheet flow are analyzed. The experiments involved two velocity-asymmetric flows, the same two flows with an opposing current of 0.4 m/s, and a mixed skewed-asymmetric flow, all with a velocity amplitude of 1.2 m/s and flow period of 7 s. We find that the net positive transport rate beneath velocity-asymmetric oscillatory flow results from large, but opposing sand fluxes during the positive and negative flow phase. With an increase in velocity asymmetry and, in particular, velocity skewness, the difference in the magnitude of the fluxes in the two half cycles increases, leading to larger net transport rates. This trend is consistent with the observed increase in skewness of the oscillatory bed shear stress. Phase-lag effects, whereby sand stirred during the negative flow phase has not settled by the time of the negative-to-positive flow reversal and is subsequently transported during the positive flow phase, are notable but of minor importance to the net transport rate compared to earlier experiments with finer sands. In the vertical, the oscillatory flux is positive above the no-flow bed. Within the sheet flow pick-up layer, the oscillatory flux is negative and similar in magnitude to the positive flux induced by the residual flow. The 0.4 m/s opposing current causes more sand to be picked up during the negative than during the positive flow phase. Above the no-flow bed the resulting negative oscillatory flux is comparable in magnitude to the current-related flux.

Journal ArticleDOI
Georges Aad, Brad Abbott, J. Abdallah, A. A. Abdelalim, +3207 more (193 institutions)
TL;DR: The first search for supersymmetry in final states containing one isolated electron or muon, jets, and missing transverse momentum from 7 TeV proton-proton collisions at the LHC was presented in this article.
Abstract: This Letter presents the first search for supersymmetry in final states containing one isolated electron or muon, jets, and missing transverse momentum from sqrt{s} = 7 TeV proton-proton collisions at the LHC. The data were recorded by the ATLAS experiment during 2010 and correspond to a total integrated luminosity of 35 pb-1. No excess above the standard model background expectation is observed. Limits are set on the parameters of the minimal supergravity framework, extending previous limits. For A_0 = 0 GeV, tan beta = 3, mu > 0 and for equal squark and gluino masses, gluino masses below 700 GeV are excluded at 95% confidence level.

Journal ArticleDOI
TL;DR: In this paper, synchrotron X-ray diffraction analysis suggested that this was due to a reduction in the kinetics of twin formation, and the strongest strengthening coefficient in cold strips was obtained with Ti additions ≤ 0.1 wt.% (+1380 MPa/wt.% Ti), but the effect quickly saturated after an increase of ∼+150 MPa.
Abstract: At low strains (∊ 0.3) the work hardening rate decreased slightly. Synchrotron X-ray diffraction analysis suggested that this was due to a reduction in the kinetics of twin formation. The highest strengthening coefficient in cold strips was obtained with Ti additions ≤0.1 wt.% (+1380 MPa/wt.% Ti), but the effect quickly saturated after an increase of ∼+150 MPa. With Nb additions only modest hardening (+187 MPa/wt.%) could be achieved. The strengthening due to V was >530 MPa/wt.% for V additions ≤0.4 wt.%. Saturation effects are less critical with V additions and yield stress increases of +375 MPa were demonstrated.

Journal ArticleDOI
Peter A. R. Ade, Nabila Aghanim, M. Arnaud, M. Ashdown, +231 more (58 institutions)
TL;DR: In this paper, the authors investigated the source of the millimetre excess in the spectral energy distribution of the Large Magellanic Cloud (LMC) and Small Magellan Cloud (SMC) using the Planck data.
Abstract: The integrated spectral energy distributions (SED) of the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) appear significantly flatter than expected from dust models based on their far-infrared and radio emission. The still unexplained origin of this millimetre excess is investigated here using the Planck data. The integrated SEDs of the two galaxies before subtraction of the foreground (Milky Way) and background (CMB fluctuations) emission are in good agreement with previous determinations, confirming the presence of the millimetre excess. In the context of this preliminary analysis we do not propose a full multi-component fitting of the data, but instead subtract contributions unrelated to the galaxies and to dust emission. The background CMB contribution is subtracted using an internal linear combination (ILC) method performed locally around the galaxies. The foreground emission from the Milky Way is subtracted using a Galactic HI template, and the dust emissivity is derived in a region surrounding the two galaxies and dominated by Milky Way emission. After subtraction, the remaining emission of both galaxies correlates closely with the atomic and molecular gas emission of the LMC and SMC. The millimetre excess in the LMC can be explained by CMB fluctuations, but a significant excess is still present in the SMC SED. The Planck and IRAS–IRIS data at 100 μm are combined to produce thermal dust temperature and optical depth maps of the two galaxies. The LMC temperature map shows the presence of a warm inner arm already found with the Spitzer data, but also shows the existence of a previously unidentified cold outer arm. Several cold regions are found along this arm, some of which are associated with known molecular clouds. The dust optical depth maps are used to constrain the thermal dust emissivity power-law index (β). The average spectral index is found to be consistent with β = 1.5 and β = 1.2 below 500 μm for the LMC and SMC respectively, significantly flatter than the values observed in the Milky Way. Also, there is evidence in the SMC of a further flattening of the SED in the sub-mm, unlike for the LMC where the SED remains consistent with β = 1.5. The spatial distribution of the millimetre dust excess in the SMC follows the gas and thermal dust distribution. Different models are explored in order to fit the dust emission in the SMC. It is concluded that the millimetre excess is unlikely to be caused by very cold dust emission and that it could be due to a combination of spinning dust emission and thermal dust emission by more amorphous dust grains than those present in our Galaxy.
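The β values discussed above come from fits of a modified blackbody, S_ν ∝ τ (ν/ν0)^β B_ν(T), to far-infrared/submillimetre photometry. The sketch below fits that functional form to synthetic flux points, not to the Planck/IRAS data and not with the paper's pipeline, simply to show how an amplitude, β and T are constrained together; the wavelength grid and noise level are assumptions.

```python
# Sketch: fit a single-temperature modified blackbody to synthetic photometry.
import numpy as np
from scipy.optimize import curve_fit

h, k_B, c = 6.626e-34, 1.381e-23, 2.998e8
nu0 = c / 250e-6                              # reference frequency at 250 microns

def planck_bnu(nu, T):
    """Planck function B_nu(T), converted from SI to MJy/sr."""
    return 1e20 * 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k_B * T))

def mbb(nu, tau, beta, T):
    """Modified blackbody: optical depth at nu0 times (nu/nu0)^beta times B_nu(T)."""
    return tau * (nu / nu0) ** beta * planck_bnu(nu, T)

# Synthetic flux points at far-IR/submm-like wavelengths (100 um to 3 mm).
wavelengths = np.array([100, 160, 250, 350, 550, 850, 1382, 2096, 3000]) * 1e-6
nu = c / wavelengths
truth = (1e-4, 1.5, 20.0)                     # tau, beta, T = 20 K (assumed)
flux = mbb(nu, *truth) * (1 + 0.03 * np.random.default_rng(6).normal(size=nu.size))

popt, _ = curve_fit(mbb, nu, flux, p0=(1e-4, 2.0, 15.0))
print("tau = %.2e, beta = %.2f, T = %.1f K" % tuple(popt))
```

With clean data the fit recovers the injected β ≈ 1.5; in practice β, T and the amplitude are partially degenerate, which is why the paper quotes band-limited average indices rather than pixel-by-pixel values alone.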