
Showing papers published by "ETH Zurich" in 2012


Journal ArticleDOI
TL;DR: Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis that facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system.
Abstract: Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

43,540 citations


Journal ArticleDOI
TL;DR: Results are presented from searches for the standard model Higgs boson in proton-proton collisions at 7 and 8 TeV in the CMS experiment at the LHC, using data samples corresponding to integrated luminosities of up to 5.1 inverse femtobarns at 7 TeV and 5.3 inverse femtobarns at 8 TeV; an excess of events is observed at a mass near 125 GeV with a local significance of 5.0 standard deviations.

8,857 citations


Journal ArticleDOI
TL;DR: A new superpixel algorithm is introduced, simple linear iterative clustering (SLIC), which adapts a k-means clustering approach to efficiently generate superpixels and is faster and more memory efficient, improves segmentation performance, and is straightforward to extend to supervoxel generation.
Abstract: Computer vision applications have come to rely increasingly on superpixels in recent years, but it is not always clear what constitutes a good superpixel algorithm. In an effort to understand the benefits and drawbacks of existing methods, we empirically compare five state-of-the-art superpixel algorithms for their ability to adhere to image boundaries, speed, memory efficiency, and their impact on segmentation performance. We then introduce a new superpixel algorithm, simple linear iterative clustering (SLIC), which adapts a k-means clustering approach to efficiently generate superpixels. Despite its simplicity, SLIC adheres to boundaries as well as or better than previous methods. At the same time, it is faster and more memory efficient, improves segmentation performance, and is straightforward to extend to supervoxel generation.
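As a usage sketch, SLIC is implemented in scikit-image as skimage.segmentation.slic; the snippet below segments a sample image into superpixels. The parameter values are illustrative, not settings from the paper.

```python
# Minimal SLIC usage via scikit-image; parameter values are illustrative.
from skimage import data
from skimage.segmentation import slic, mark_boundaries

img = data.astronaut()                  # sample RGB image bundled with scikit-image
labels = slic(
    img,
    n_segments=250,                     # approximate number of superpixels (k-means seeds)
    compactness=10.0,                   # trade-off: color proximity vs. spatial proximity
    start_label=1,
)
print(labels.shape, labels.max())       # one integer superpixel label per pixel
overlay = mark_boundaries(img, labels)  # superpixel boundaries drawn over the image
```

Larger compactness values yield more regular, grid-like superpixels; smaller values let segments follow image boundaries more closely.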

7,849 citations


Journal ArticleDOI
Kaoru Hagiwara, Ken Ichi Hikasa1, Koji Nakamura, Masaharu Tanabashi1, M. Aguilar-Benitez, Claude Amsler2, R. M. Barnett3, P. R. Burchat4, C. D. Carone5, C. Caso6, G. Conforto7, Olav Dahl3, Michael Doser8, Semen Eidelman9, Jonathan L. Feng10, L. K. Gibbons11, M. C. Goodman12, Christoph Grab13, D. E. Groom3, Atul Gurtu14, Atul Gurtu8, K. G. Hayes15, J.J. Hernández-Rey16, K. Honscheid17, Christopher Kolda18, Michelangelo L. Mangano8, D. M. Manley19, Aneesh V. Manohar20, John March-Russell8, Alberto Masoni, Ramon Miquel3, Klaus Mönig, Hitoshi Murayama3, Hitoshi Murayama21, S. Sánchez Navas13, Keith A. Olive22, Luc Pape8, C. Patrignani6, A. Piepke23, Matts Roos24, John Terning25, Nils A. Tornqvist24, T. G. Trippe3, Petr Vogel26, C. G. Wohl3, Ron L. Workman27, W-M. Yao3, B. Armstrong3, P. S. Gee3, K. S. Lugovsky, S. B. Lugovsky, V. S. Lugovsky, Marina Artuso28, D. Asner29, K. S. Babu30, E. L. Barberio8, Marco Battaglia8, H. Bichsel31, O. Biebel32, P. Bloch8, Robert N. Cahn3, Ariella Cattai8, R.S. Chivukula33, R. Cousins34, G. A. Cowan35, Thibault Damour36, K. Desler, R. J. Donahue3, D. A. Edwards, Victor Daniel Elvira37, Jens Erler38, V. V. Ezhela, A Fassò8, W. Fetscher13, Brian D. Fields39, B. Foster40, Daniel Froidevaux8, Masataka Fukugita41, Thomas K. Gaisser42, L. A. Garren37, H J Gerber13, Frederick J. Gilman43, Howard E. Haber44, C. A. Hagmann29, J.L. Hewett4, Ian Hinchliffe3, Craig J. Hogan31, G. Höhler45, P. Igo-Kemenes46, John David Jackson3, Kurtis F Johnson47, D. Karlen48, B. Kayser37, S. R. Klein3, Konrad Kleinknecht49, I.G. Knowles50, P. Kreitz4, Yu V. Kuyanov, R. Landua8, Paul Langacker38, L. S. Littenberg51, Alan D. Martin52, Tatsuya Nakada8, Tatsuya Nakada53, Meenakshi Narain33, Paolo Nason, John A. Peacock54, H. R. Quinn55, Stuart Raby17, Georg G. Raffelt32, E. A. Razuvaev, B. Renk49, L. Rolandi8, Michael T Ronan3, L.J. Rosenberg54, C.T. Sachrajda55, A. I. Sanda56, Subir Sarkar57, Michael Schmitt58, O. Schneider53, Douglas Scott59, W. G. Seligman60, M. H. Shaevitz60, Torbjörn Sjöstrand61, George F. Smoot3, Stefan M Spanier4, H. Spieler3, N. J. C. Spooner62, Mark Srednicki63, Achim Stahl, Todor Stanev42, M. Suzuki3, N. P. Tkachenko, German Valencia64, K. van Bibber29, Manuella Vincter65, D. R. Ward66, Bryan R. Webber66, M R Whalley52, Lincoln Wolfenstein43, J. Womersley37, C. L. Woody51, Oleg Zenin 
Tohoku University1, University of Zurich2, Lawrence Berkeley National Laboratory3, Stanford University4, College of William & Mary5, University of Genoa6, University of Urbino7, CERN8, Budker Institute of Nuclear Physics9, University of California, Irvine10, Cornell University11, Argonne National Laboratory12, ETH Zurich13, Tata Institute of Fundamental Research14, Hillsdale College15, Spanish National Research Council16, Ohio State University17, University of Notre Dame18, Kent State University19, University of California, San Diego20, University of California, Berkeley21, University of Minnesota22, University of Alabama23, University of Helsinki24, Los Alamos National Laboratory25, California Institute of Technology26, George Washington University27, Syracuse University28, Lawrence Livermore National Laboratory29, Oklahoma State University–Stillwater30, University of Washington31, Max Planck Society32, Boston University33, University of California, Los Angeles34, Royal Holloway, University of London35, Université Paris-Saclay36, Fermilab37, University of Pennsylvania38, University of Illinois at Urbana–Champaign39, University of Bristol40, University of Tokyo41, University of Delaware42, Carnegie Mellon University43, University of California, Santa Cruz44, Karlsruhe Institute of Technology45, Heidelberg University46, Florida State University47, Carleton University48, University of Mainz49, University of Edinburgh50, Brookhaven National Laboratory51, Durham University52, University of Lausanne53, Massachusetts Institute of Technology54, University of Southampton55, Nagoya University56, University of Oxford57, Northwestern University58, University of British Columbia59, Columbia University60, Lund University61, University of Sheffield62, University of California, Santa Barbara63, Iowa State University64, University of Alberta65, University of Cambridge66
TL;DR: The Particle Data Group's biennial Review summarizes much of particle physics: using data from previous editions plus 2658 new measurements from 644 papers, it lists, evaluates, and averages measured properties of gauge bosons, leptons, quarks, mesons, and baryons.
Abstract: This biennial Review summarizes much of particle physics. Using data from previous editions, plus 2658 new measurements from 644 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. Among the 112 reviews are many that are new or heavily revised including those on Heavy-Quark and Soft-Collinear Effective Theory, Neutrino Cross Section Measurements, Monte Carlo Event Generators, Lattice QCD, Heavy Quarkonium Spectroscopy, Top Quark, Dark Matter, Vcb & Vub, Quantum Chromodynamics, High-Energy Collider Parameters, Astrophysical Constants, Cosmological Parameters, and Dark Matter. A booklet is available containing the Summary Tables and abbreviated versions of some of the other sections of this full Review. All tables, listings, and reviews (and errata) are also available on the Particle Data Group website: http://pdg.lbl.gov.

4,465 citations


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
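The distinction the guidelines draw between static counts and flux can be made concrete with a toy calculation: flux is commonly estimated as the amount of a marker such as LC3-II that accumulates when lysosomal degradation is blocked, compared with the untreated steady-state level. The sketch below uses hypothetical densitometry readings, not values from the guidelines.

```python
# Toy autophagic-flux estimate: marker turnover unmasked by a lysosomal
# inhibitor (e.g., bafilomycin A1). All numbers are hypothetical.
def autophagic_flux(marker_with_inhibitor: float, marker_baseline: float) -> float:
    """Flux proxy: marker accumulated only when degradation is blocked."""
    return marker_with_inhibitor - marker_baseline

# Two samples with the same static LC3-II level but very different flux:
print(autophagic_flux(3.0, 1.5))  # 1.5 -> active flux (marker piles up when blocked)
print(autophagic_flux(1.6, 1.5))  # 0.1 -> trafficking already blocked upstream
```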

4,316 citations


Journal ArticleDOI
TL;DR: In this comparative study, missForest outperforms other imputation methods, especially in data settings where complex interactions and non-linear relations are suspected, and its out-of-bag imputation error estimates prove to be adequate in all settings.
Abstract: Motivation: Modern data acquisition based on high-throughput technology is often facing the problem of missing data. Algorithms commonly used in the analysis of such large-scale data often depend on a complete set. Missing value imputation offers a solution to this problem. However, the majority of available imputation methods are restricted to one type of variable only: continuous or categorical. For mixed-type data, the different types are usually handled separately. Therefore, these methods ignore possible relations between variable types. We propose a non-parametric method which can cope with different types of variables simultaneously. Results: We compare several state-of-the-art methods for the imputation of missing values. We propose and evaluate an iterative imputation method (missForest) based on a random forest. By averaging over many unpruned classification or regression trees, random forest intrinsically constitutes a multiple imputation scheme. Using the built-in out-of-bag error estimates of random forest, we are able to estimate the imputation error without the need of a test set. Evaluation is performed on multiple datasets coming from a diverse selection of biological fields with artificially introduced missing values ranging from 10% to 30%. We show that missForest can successfully handle missing values, particularly in datasets including different types of variables. In our comparative study, missForest outperforms other methods of imputation especially in data settings where complex interactions and non-linear relations are suspected. The out-of-bag imputation error estimates of missForest prove to be adequate in all settings. Additionally, missForest exhibits attractive computational efficiency and can cope with high-dimensional data. Availability: The package missForest is freely available from http://stat.ethz.ch/CRAN/. Contact: stekhoven@stat.math.ethz.ch; buhlmann@stat.math.ethz.ch
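For continuous variables, the missForest idea can be approximated with scikit-learn's IterativeImputer wrapping a random forest; this is a sketch, not the R package (which also handles categorical variables and reports out-of-bag error estimates). The data below are synthetic.

```python
# missForest-style iterative imputation, approximated with scikit-learn.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables the import below)
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] += 2.0 * X[:, 0]                 # dependence the forest can exploit
mask = rng.random(X.shape) < 0.2         # ~20% missing, within the paper's 10-30% range
X_missing = np.where(mask, np.nan, X)

imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=100, random_state=0),
    max_iter=10,
    random_state=0,
)
X_imputed = imputer.fit_transform(X_missing)
print(np.abs(X_imputed[mask] - X[mask]).mean())  # error vs. the held-back truth
```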

2,928 citations


Journal ArticleDOI
TL;DR: A new strategy that systematically queries sample sets for the presence and quantity of essentially any protein of interest is presented, using the information available in fragment ion spectral libraries to mine the complete fragment ion maps generated using a data-independent acquisition method.

2,358 citations


Journal ArticleDOI
TL;DR: Starting from a conceptual model of an ideal-typical transdisciplinary research process, this paper synthesizes and structures a set of design principles from various strands of the literature and from empirical experience, examining challenges and coping strategies encountered in transdisciplinary sustainability projects in Europe, North America, South America, Africa, and Asia.
Abstract: There is emerging agreement that sustainability challenges require new ways of knowledge production and decision-making. One key aspect of sustainability science, therefore, is the involvement of actors from outside academia into the research process in order to integrate the best available knowledge, reconcile values and preferences, as well as create ownership for problems and solution options. Transdisciplinary, community-based, interactive, or participatory research approaches are often suggested as appropriate means to meet both the requirements posed by real-world problems as well as the goals of sustainability science as a transformational scientific field. Dispersed literature on these approaches and a variety of empirical projects applying them make it difficult for interested researchers and practitioners to review and become familiar with key components and design principles of how to do transdisciplinary sustainability research. Starting from a conceptual model of an ideal–typical transdisciplinary research process, this article synthesizes and structures such a set of principles from various strands of the literature and empirical experiences. We then elaborate on them, looking at challenges and some coping strategies as experienced in transdisciplinary sustainability projects in Europe, North America, South America, Africa, and Asia. The article concludes with future research needed in order to further enhance the practice of transdisciplinary sustainability research.

1,927 citations


Journal ArticleDOI
TL;DR: This research showed that, while many white-colored products contained titanium, it was not a prerequisite and testing should focus on food-grade TiO2 (E171) rather than that adopted in many environmental health and safety tests (i.e., P25), which is used in much lower amounts in products less likely to enter the environment.
Abstract: Titanium dioxide is a common additive in many food, personal care, and other consumer products used by people, which after use can enter the sewage system and, subsequently, enter the environment as treated effluent discharged to surface waters or biosolids applied to agricultural land, incinerated wastes, or landfill solids. This study quantifies the amount of titanium in common food products, derives estimates of human exposure to dietary (nano-)TiO2, and discusses the impact of the nanoscale fraction of TiO2 entering the environment. The foods with the highest content of TiO2 included candies, sweets, and chewing gums. Among personal care products, toothpastes and select sunscreens contained 1% to >10% titanium by weight. While some other cremes contained titanium, despite being colored white, most shampoos, deodorants, and shaving creams contained the lowest levels of titanium (<0.01 μg/mg). For several high-consumption pharmaceuticals, the titanium content ranged from below the instrument detection limit (0.0001 μg Ti/mg) to a high of 0.014 μg Ti/mg. Electron microscopy and stability testing of food-grade TiO2 (E171) suggests that approximately 36% of the particles are less than 100 nm in at least one dimension and that it readily disperses in water as fairly stable colloids. However, filtration of water solubilized consumer products and personal care products indicated that less than 5% of the titanium was able to pass through 0.45 or 0.7 μm pores. Two white paints contained 110 μg Ti/mg while three sealants (i.e., prime coat paint) contained less titanium (25 to 40 μg Ti/mg). This research showed that, while many white-colored products contained titanium, it was not a prerequisite. Although several of these product classes contained low amounts of titanium, their widespread use and disposal down the drain and eventually to wastewater treatment plants (WWTPs) deserves attention. A Monte Carlo human exposure analysis to TiO2 through foods identified children as having the highest exposures because TiO2 content of sweets is higher than other food products and that a typical exposure for a US adult may be on the order of 1 mg Ti per kilogram body weight per day. Thus, because of the millions of tons of titanium-based white pigment used annually, testing should focus on food-grade TiO2 (E171) rather than that adopted in many environmental health and safety tests (i.e., P25), which is used in much lower amounts in products less likely to enter the environment (e.g., catalyst supports, photocatalytic coatings).
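The Monte Carlo exposure logic can be sketched by sampling intake, titanium content, and body weight from assumed distributions and propagating to mg Ti per kg body weight per day. All distribution parameters below are hypothetical placeholders, not the study's fitted values.

```python
# Toy Monte Carlo dietary-exposure propagation; every parameter is a placeholder.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
intake_g_per_day = rng.lognormal(mean=np.log(20.0), sigma=0.8, size=n)  # hypothetical sweets intake
ti_mg_per_g      = rng.lognormal(mean=np.log(0.5),  sigma=0.5, size=n)  # hypothetical Ti content
body_weight_kg   = rng.normal(70.0, 10.0, size=n).clip(40.0, 120.0)

exposure = intake_g_per_day * ti_mg_per_g / body_weight_kg  # mg Ti / kg bw / day
print(np.percentile(exposure, [50, 95]))                    # median and 95th percentile
```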

1,767 citations


Journal ArticleDOI
Julia A. Vorholt1
TL;DR: Insights into the underlying structural principles of indigenous microbial phyllosphere populations will help to develop a deeper understanding of the phyllospheric microbiota and will have applications in the promotion of plant growth and plant protection.
Abstract: Our knowledge of the microbiology of the phyllosphere, or the aerial parts of plants, has historically lagged behind our knowledge of the microbiology of the rhizosphere, or the below-ground habitat of plants, particularly with respect to fundamental questions such as which microorganisms are present and what they do there. In recent years, however, this has begun to change. Cultivation-independent studies have revealed that a few bacterial phyla predominate in the phyllosphere of different plants and that plant factors are involved in shaping these phyllosphere communities, which feature specific adaptations and exhibit multipartite relationships both with host plants and among community members. Insights into the underlying structural principles of indigenous microbial phyllosphere populations will help us to develop a deeper understanding of the phyllosphere microbiota and will have applications in the promotion of plant growth and plant protection.

1,450 citations


Journal ArticleDOI
TL;DR: Differentiation of both epidermal and mesenchymal stem cells was unaffected by PDMS stiffness but regulated by the elastic modulus of PAAm; it is concluded that stem cells exert a mechanical force on collagen fibres and gauge the feedback to make cell-fate decisions.
Abstract: To investigate how substrate properties influence stem-cell fate, we cultured single human epidermal stem cells on polydimethylsiloxane (PDMS) and polyacrylamide (PAAm) hydrogel surfaces, 0.1 kPa-2.3 MPa in stiffness, with a covalently attached collagen coating. Cell spreading and differentiation were unaffected by polydimethylsiloxane stiffness. However, cells on polyacrylamide of low elastic modulus (0.5 kPa) could not form stable focal adhesions and differentiated as a result of decreased activation of the extracellular-signal-related kinase (ERK)/mitogen-activated protein kinase (MAPK) signalling pathway. The differentiation of human mesenchymal stem cells was also unaffected by PDMS stiffness but regulated by the elastic modulus of PAAm. Dextran penetration measurements indicated that polyacrylamide substrates of low elastic modulus were more porous than stiff substrates, suggesting that the collagen anchoring points would be further apart. We then changed collagen crosslink concentration and used hydrogel-nanoparticle substrates to vary anchoring distance at constant substrate stiffness. Lower collagen anchoring density resulted in increased differentiation. We conclude that stem cells exert a mechanical force on collagen fibres and gauge the feedback to make cell-fate decisions.

Journal ArticleDOI
TL;DR: In this paper, a generic objectness measure is proposed to quantify how likely an image window is to contain an object of any class, such as cows and telephones, from amorphous background elements such as grass and road.
Abstract: We present a generic objectness measure, quantifying how likely it is for an image window to contain an object of any class. We explicitly train it to distinguish objects with a well-defined boundary in space, such as cows and telephones, from amorphous background elements, such as grass and road. The measure combines in a Bayesian framework several image cues measuring characteristics of objects, such as appearing different from their surroundings and having a closed boundary. These include an innovative cue to measure the closed boundary characteristic. In experiments on the challenging PASCAL VOC 07 dataset, we show this new cue to outperform a state-of-the-art saliency measure, and the combined objectness measure to perform better than any cue alone. We also compare to interest point operators, a HOG detector, and three recent works aiming at automatic object segmentation. Finally, we present two applications of objectness. In the first, we sample a small number of windows according to their objectness probability and give an algorithm to employ them as location priors for modern class-specific object detectors. As we show experimentally, this greatly reduces the number of windows evaluated by the expensive class-specific model. In the second application, we use objectness as a complementary score in addition to the class-specific model, which leads to fewer false positives. As shown in several recent papers, objectness can act as a valuable focus of attention mechanism in many other applications operating on image windows, including weakly supervised learning of object categories, unsupervised pixelwise segmentation, and object tracking in video. Computing objectness is very efficient and takes only about 4 sec. per image.
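The Bayesian cue combination can be illustrated with a naive-Bayes sketch: if the cue responses are treated as independent given the window label, the posterior probability that a window contains an object is a normalized product of per-cue likelihoods. The likelihoods and prior below are hypothetical, not the paper's learned values.

```python
# Naive-Bayes combination of per-window cue likelihoods; numbers are hypothetical.
import numpy as np

def objectness_posterior(lik_obj, lik_bg, prior_obj=0.05):
    """P(object | cues) assuming cue independence given the label."""
    num = prior_obj * np.prod(lik_obj)
    den = num + (1.0 - prior_obj) * np.prod(lik_bg)
    return num / den

# e.g. likelihoods from saliency, color contrast, edge density, closed-boundary cues
print(objectness_posterior(lik_obj=[0.8, 0.6, 0.7, 0.9],
                           lik_bg =[0.2, 0.5, 0.4, 0.1]))
```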

Journal ArticleDOI
TL;DR: How SRM is applied in proteomics is described, recent advances are reviewed, present selected applications and a perspective on the future of this powerful technology is provided.
Abstract: Selected reaction monitoring (SRM) is a targeted mass spectrometry technique that is emerging in the field of proteomics as a complement to untargeted shotgun methods. SRM is particularly useful when predetermined sets of proteins, such as those constituting cellular networks or sets of candidate biomarkers, need to be measured across multiple samples in a consistent, reproducible and quantitatively precise manner. Here we describe how SRM is applied in proteomics, review recent advances, present selected applications and provide a perspective on the future of this powerful technology.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated how Model Predictive Control and weather predictions can increase the energy efficiency in Integrated Room Automation (IRA) while respecting occupant comfort.
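The approach can be sketched as a toy model predictive control problem: a first-order room-temperature model driven by an outside-temperature forecast, minimizing heating energy subject to a comfort band. The dynamics and every number below are hypothetical, not the paper's IRA model (cvxpy is used for the optimization).

```python
# Toy building MPC with a weather forecast; model and numbers are hypothetical.
import cvxpy as cp
import numpy as np

H = 24                                    # prediction horizon (hours)
a, b, c = 0.9, 0.5, 0.1                   # hypothetical discrete-time zone dynamics
T_out = 10 + 5 * np.sin(np.linspace(0, 2 * np.pi, H))  # forecast outside temperature

u = cp.Variable(H)                        # heating power
T = cp.Variable(H + 1)                    # room temperature
constraints = [T[0] == 18]
for k in range(H):
    constraints += [
        T[k + 1] == a * T[k] + b * u[k] + c * T_out[k],
        u[k] >= 0, u[k] <= 10,            # actuator limits
        T[k + 1] >= 20, T[k + 1] <= 24,   # occupant comfort band
    ]
problem = cp.Problem(cp.Minimize(cp.sum(u)), constraints)  # minimize energy use
problem.solve()
print(round(problem.value, 2), np.round(u.value[:4], 2))
```

In the receding-horizon scheme only the first control move would be applied before re-solving with an updated forecast.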

Journal ArticleDOI
TL;DR: It is shown that experimental manipulations of the abiotic or biotic environment, assessments of trait-phylogeny-environment relationships, and investigations of frequency-dependent population growth all suggest strong influences of stabilizing niche differences and fitness differences on the outcome of plant community assembly.
Abstract: Although research on the role of competitive interactions during community assembly began decades ago, a recent revival of interest has led to new discoveries and research opportunities. Using contemporary coexistence theory that emphasizes stabilizing niche differences and relative fitness differences, we evaluate three empirical approaches for studying community assembly. We show that experimental manipulations of the abiotic or biotic environment, assessments of trait-phylogeny-environment relationships, and investigations of frequency-dependent population growth all suggest strong influences of stabilizing niche differences and fitness differences on the outcome of plant community assembly. Nonetheless, due to the limitations of these approaches applied in isolation, we still have a poor understanding of which niche axes and which traits determine the outcome of competition and community structure. Combining current approaches represents our best chance of achieving this goal, which is fundamental to conceptual ecology and to the management of plant communities under global change.
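The frequency-dependent population growth logic can be made concrete with a mutual-invasibility check under Lotka-Volterra competition: stable coexistence is expected when each species has positive growth while rare at the other's equilibrium, which holds when stabilizing niche differences make intraspecific competition exceed interspecific competition. The coefficients below are hypothetical.

```python
# Mutual-invasibility check for two-species Lotka-Volterra competition.
# dN_i/dt = r_i * N_i * (1 - alpha_ii * N_i - alpha_ij * N_j); coefficients hypothetical.
import numpy as np

r = np.array([1.0, 1.0])            # intrinsic growth rates
alpha = np.array([[1.0, 0.7],       # alpha[i, j]: effect of species j on species i
                  [0.6, 1.0]])

def invasion_growth_rate(invader: int) -> float:
    resident = 1 - invader
    N_resident = 1.0 / alpha[resident, resident]   # resident monoculture equilibrium
    return r[invader] * (1.0 - alpha[invader, resident] * N_resident)

igr = [invasion_growth_rate(i) for i in (0, 1)]
print(igr, "coexistence expected" if min(igr) > 0 else "exclusion expected")
```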

Journal ArticleDOI
TL;DR: In this article, the chiral and deconfinement properties of the QCD transition at finite temperature were investigated using the p4, asqtad, and HISQ/tree actions.
Abstract: We present results on the chiral and deconfinement properties of the QCD transition at finite temperature. Calculations are performed with $2+1$ flavors of quarks using the p4, asqtad, and HISQ/tree actions. Lattices with temporal extent $N_\tau = 6$, 8, and 12 are used to understand and control discretization errors and to reliably extrapolate estimates obtained at finite lattice spacings to the continuum limit. The chiral transition temperature is defined in terms of the phase transition in a theory with two massless flavors and analyzed using $O(N)$ scaling fits to the chiral condensate and susceptibility. We find consistent estimates from the HISQ/tree and asqtad actions and our main result is $T_c = 154 \pm 9~\mathrm{MeV}$.
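The continuum-extrapolation step can be sketched numerically: for these improved actions the leading discretization errors scale like $a^2 \propto 1/N_\tau^2$ at fixed temperature, so pseudo-critical temperatures measured at $N_\tau = 6$, 8, and 12 are fit to $T_c(N_\tau) = T_c + B/N_\tau^2$ and the intercept is the continuum value. The three input values below are hypothetical placeholders, not the paper's data.

```python
# Continuum extrapolation T_c(N_tau) = T_c + B / N_tau^2 via least squares.
import numpy as np

N_tau  = np.array([6.0, 8.0, 12.0])
Tc_lat = np.array([166.0, 160.0, 156.0])    # hypothetical pseudo-critical T in MeV

A = np.vstack([np.ones_like(N_tau), 1.0 / N_tau**2]).T  # columns: [1, 1/N_tau^2]
sol, *_ = np.linalg.lstsq(A, Tc_lat, rcond=None)
print(f"continuum estimate: T_c ~ {sol[0]:.1f} MeV")    # intercept at N_tau -> infinity
```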

Journal ArticleDOI
TL;DR: A simple and general fabrication method for helical swimming micromachines by direct laser writing and e-beam evaporation is demonstrated and the magnetic helical devices exhibit varying magnetic shape anisotropy, yet always generate corkscrew motion using a rotating magnetic field.
Abstract: A simple and general fabrication method for helical swimming micromachines by direct laser writing and e-beam evaporation is demonstrated. The magnetic helical devices exhibit varying magnetic shape anisotropy, yet always generate corkscrew motion using a rotating magnetic field. They also exhibit good swimming performance and are capable of pick-and-place micromanipulation in 3D. Cytotoxicity of the devices was investigated using mouse myoblasts.

Journal ArticleDOI
Gunnar Jeschke1
TL;DR: To characterize structure or structural changes, experimental protocols were optimized and techniques for artifact suppression were introduced, and it was realized that interpretation of the distance distributions must take into account the conformational distribution of spin labels.
Abstract: Distance distributions between paramagnetic centers in the range of 1.8 to 6 nm in membrane proteins and up to 10 nm in deuterated soluble proteins can be measured by the DEER technique. The number of paramagnetic centers and their relative orientation can be characterized. DEER does not require crystallization and is not limited with respect to the size of the protein or protein complex. Diamagnetic proteins are accessible by site-directed spin labeling. To characterize structure or structural changes, experimental protocols were optimized and techniques for artifact suppression were introduced. Data analysis programs were developed, and it was realized that interpretation of the distance distributions must take into account the conformational distribution of spin labels. First methods have appeared for deriving structural models from a small number of distance constraints. The present scope and limitations of the technique are illustrated.

Journal ArticleDOI
26 Apr 2012-Nature
TL;DR: These findings demonstrate that, by eliciting different cytokines, C. albicans and S. aureus prime TH17 cells that produce either IFN-γ or IL-10, and identify IL-1β and IL-2 as pro- and anti-inflammatory regulators of TH17 cells both at priming and in the effector phase.
Abstract: IL-17-producing CD4+ T helper cells (TH17) have been extensively investigated in mouse models of autoimmunity. However, the requirements for differentiation and the properties of pathogen-induced human TH17 cells remain poorly defined. Using an approach that combines the in vitro priming of naive T cells with the ex vivo analysis of memory T cells, we describe here two types of human TH17 cells with distinct effector function and differentiation requirements. Candida albicans-specific TH17 cells produced IL-17 and IFN-γ, but no IL-10, whereas Staphylococcus aureus-specific TH17 cells produced IL-17 and could produce IL-10 upon restimulation. IL-6, IL-23 and IL-1β contributed to TH17 differentiation induced by both pathogens, but IL-1β was essential in C. albicans-induced TH17 differentiation to counteract the inhibitory activity of IL-12 and to prime IL-17/IFN-γ double-producing cells. In addition, IL-1β inhibited IL-10 production in differentiating and in memory TH17 cells, whereas blockade of IL-1β in vivo led to increased IL-10 production by memory TH17 cells. We also show that, after restimulation, TH17 cells transiently downregulated IL-17 production through a mechanism that involved IL-2-induced activation of STAT5 and decreased expression of ROR-γt. Taken together these findings demonstrate that by eliciting different cytokines C. albicans and S. aureus prime TH17 cells that produce either IFN-γ or IL-10, and identify IL-1β and IL-2 as pro- and anti-inflammatory regulators of TH17 cells both at priming and in the effector phase.

Journal ArticleDOI
02 Mar 2012-Science
TL;DR: The transcriptomes of Bacillus subtilis exposed to a wide range of environmental and nutritional conditions that the organism might encounter in nature are reported, offering an initial understanding of why certain regulatory strategies may be favored during evolution of dynamic control systems.
Abstract: Bacteria adapt to environmental stimuli by adjusting their transcriptomes in a complex manner, the full potential of which has yet to be established for any individual bacterial species. Here, we report the transcriptomes of Bacillus subtilis exposed to a wide range of environmental and nutritional conditions that the organism might encounter in nature. We comprehensively mapped transcription units (TUs) and grouped 2935 promoters into regulons controlled by various RNA polymerase sigma factors, accounting for ~66% of the observed variance in transcriptional activity. This global classification of promoters and detailed description of TUs revealed that a large proportion of the detected antisense RNAs arose from potentially spurious transcription initiation by alternative sigma factors and from imperfect control of transcription termination.

Journal ArticleDOI
R. Prins1

Journal ArticleDOI
29 Mar 2012
TL;DR: In this article, the authors reported results from searches for the standard model Higgs boson in proton-proton collisions at sqrt(s) = 7 TeV in five decay modes: gamma pair, b-quark pair, tau lepton pair, W pair, and Z pair.
Abstract: Combined results are reported from searches for the standard model Higgs boson in proton-proton collisions at sqrt(s)=7 TeV in five Higgs boson decay modes: gamma pair, b-quark pair, tau lepton pair, W pair, and Z pair. The explored Higgs boson mass range is 110-600 GeV. The analysed data correspond to an integrated luminosity of 4.6-4.8 inverse femtobarns. The expected excluded mass range in the absence of the standard model Higgs boson is 118-543 GeV at 95% CL. The observed results exclude the standard model Higgs boson in the mass range 127-600 GeV at 95% CL, and in the mass range 129-525 GeV at 99% CL. An excess of events above the expected standard model background is observed at the low end of the explored mass range making the observed limits weaker than expected in the absence of a signal. The largest excess, with a local significance of 3.1 sigma, is observed for a Higgs boson mass hypothesis of 124 GeV. The global significance of observing an excess with a local significance greater than 3.1 sigma anywhere in the search range 110-600 (110-145) GeV is estimated to be 1.5 sigma (2.1 sigma). More data are required to ascertain the origin of this excess.
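The gap between the local and global significances quoted here reflects the look-elsewhere effect, which can be illustrated with a back-of-the-envelope conversion: the local p-value is inflated by a trials factor (the effective number of independent mass windows searched) before converting back to a Z-score. The trials factor below is illustrative, not the paper's estimate.

```python
# Local -> global significance via an illustrative trials factor.
from scipy.stats import norm

z_local = 3.1
p_local = norm.sf(z_local)                  # one-sided local p-value
trials_factor = 25                          # hypothetical number of independent windows
p_global = min(1.0, trials_factor * p_local)
z_global = norm.isf(p_global)
print(f"local {z_local} sigma -> global ~{z_global:.1f} sigma")
```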

Journal ArticleDOI
TL;DR: In this paper, statistical transformations for post-processing regional climate models (RCMs) are reviewed and classified into distribution derived transformations, parametric transformations and nonparametric transformations, each differing with respect to their underlying assumptions.
Abstract: The impact of climate change on water resources is usually assessed at the local scale. However, regional climate models (RCMs) are known to exhibit systematic biases in precipitation. Hence, RCM simulations need to be post-processed in order to produce reliable estimates of local scale climate. Popular post-processing approaches are based on statistical transformations, which attempt to adjust the distribution of modelled data such that it closely resembles the observed climatology. However, the diversity of suggested methods renders the selection of optimal techniques difficult and therefore there is a need for clarification. In this paper, statistical transformations for post-processing RCM output are reviewed and classified into (1) distribution derived transformations, (2) parametric transformations and (3) nonparametric transformations, each differing with respect to their underlying assumptions. A real world application, using observations of 82 precipitation stations in Norway, showed that nonparametric transformations have the highest skill in systematically reducing biases in RCM precipitation.
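A minimal sketch of the nonparametric case: empirical quantile mapping replaces each modelled value with the observed value at the same empirical quantile of a common calibration period. The precipitation series below are synthetic.

```python
# Empirical (nonparametric) quantile mapping for bias correction; data synthetic.
import numpy as np

def quantile_map(model_cal, obs_cal, model_new):
    q = np.linspace(0.0, 1.0, 101)
    mod_q = np.quantile(model_cal, q)    # modelled quantiles, calibration period
    obs_q = np.quantile(obs_cal, q)      # observed quantiles, calibration period
    # Locate each new modelled value on the modelled CDF, map to the observed scale.
    return np.interp(model_new, mod_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=2.0, size=1000)   # synthetic "observed" precipitation
mod = rng.gamma(shape=2.0, scale=3.0, size=1000)   # synthetic wet-biased model output
print(quantile_map(mod, obs, mod[:5]))             # corrected values, observed scale
```

Values beyond the calibrated range are clamped by np.interp, which is one reason parametric or extrapolating variants are sometimes preferred for extremes.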

Journal ArticleDOI
TL;DR: DebtRank, a novel measure of systemic impact inspired by feedback centrality, is introduced; applied to the FED emergency loans program, it shows that a group of 22 institutions, which received most of the funds, form a strongly connected graph in which each node becomes systemically important at the peak of the crisis.
Abstract: Systemic risk, here meant as the risk of default of a large portion of the financial system, depends on the network of financial exposures among institutions. However, there is no widely accepted methodology to determine the systemically important nodes in a network. To fill this gap, we introduce, DebtRank, a novel measure of systemic impact inspired by feedback-centrality. As an application, we analyse a new and unique dataset on the USD 1.2 trillion FED emergency loans program to global financial institutions during 2008–2010. We find that a group of 22 institutions, which received most of the funds, form a strongly connected graph where each of the nodes becomes systemically important at the peak of the crisis. Moreover, a systemic default could have been triggered even by small dispersed shocks. The results suggest that the debate on too-big-to-fail institutions should include the even more serious issue of too-central-to-fail.
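A simplified sketch of the propagation behind DebtRank: distress spreads through an impact matrix W, each node propagates at most once (avoiding the infinite reverberations of naive feedback centralities), and the final measure is the value-weighted distress induced beyond the initial shock. The three-bank network below is hypothetical, and the bookkeeping is abbreviated relative to the published algorithm.

```python
# Simplified DebtRank-style propagation on a hypothetical 3-node network.
import numpy as np

def debt_rank(W, v, h0):
    """W[i, j]: impact of i's distress on j; v: economic values; h0: initial distress."""
    h = h0.astype(float)
    active = h0 > 0                        # distressed nodes that have yet to propagate
    spent = np.zeros(len(v), dtype=bool)   # nodes that already propagated once
    while active.any():
        dh = (h * active) @ W              # distress transmitted this round
        spent |= active
        h = np.minimum(1.0, h + dh)        # distress is capped at full default
        active = (dh > 0) & ~spent
    return (h - h0) @ v                    # induced distress, weighted by value

W = np.array([[0.0, 0.4, 0.1],
              [0.3, 0.0, 0.5],
              [0.2, 0.2, 0.0]])
v = np.array([0.5, 0.3, 0.2])              # relative economic values (sum to 1)
print(debt_rank(W, v, h0=np.array([1.0, 0.0, 0.0])))  # systemic impact of node 0
```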

Journal ArticleDOI
TL;DR: In this paper, the authors review the factors and processes that are known to influence the hydrogen-isotopic compositions of lipids from photosynthesizing organisms, and provide a framework for interpreting their D/H ratios from ancient sediments and identify future research opportunities.
Abstract: Hydrogen-isotopic abundances of lipid biomarkers are emerging as important proxies in the study of ancient environments and ecosystems. A decade ago, pioneering studies made use of new analytical methods and demonstrated that the hydrogen-isotopic composition of individual lipids from aquatic and terrestrial organisms can be related to the composition of their growth (i.e., environmental) water. Subsequently, compound-specific deuterium/hydrogen (D/H) ratios of sedimentary biomarkers have been increasingly used as paleohydrological proxies over a range of geological timescales. Isotopic fractionation observed between hydrogen in environmental water and hydrogen in lipids, however, is sensitive to biochemical, physiological, and environmental influences on the composition of hydrogen available for biosynthesis in cells. Here we review the factors and processes that are known to influence the hydrogen-isotopic compositions of lipids, especially n-alkanes, from photosynthesizing organisms, and we provide a framework for interpreting their D/H ratios from ancient sediments and identify future research opportunities.
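The notation underlying these proxies merits a one-line worked example: δD expresses a sample's D/H ratio as the per-mil deviation from the VSMOW standard, δD = (R_sample / R_VSMOW − 1) × 1000.

```python
# Delta notation for hydrogen isotopes; the sample ratio below is illustrative.
R_VSMOW = 155.76e-6                     # D/H ratio of Vienna Standard Mean Ocean Water

def delta_D(R_sample: float) -> float:
    """Per-mil deviation of a D/H ratio from VSMOW."""
    return (R_sample / R_VSMOW - 1.0) * 1000.0

print(delta_D(131.0e-6))                # ~ -159 per mil, a D-depleted lipid ratio
```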

Journal ArticleDOI
TL;DR: In this paper, the authors presented new improved constraints on the Hubble parameter H(z) in the redshift range 0.15 < z < 1.1, obtained from the differential spectroscopic evolution of early-type galaxies as a function of redshift.
Abstract: We present new improved constraints on the Hubble parameter H(z) in the redshift range 0.15 < z < 1.1, obtained from the differential spectroscopic evolution of early-type galaxies as a function of redshift. We extract a large sample of early-type galaxies (~11,000) from several spectroscopic surveys, spanning almost 8 billion years of cosmic lookback time (0.15 < z < 1.42). We select the most massive, red elliptical galaxies, passively evolving and without signature of ongoing star formation. Those galaxies can be used as standard cosmic chronometers, as first proposed by Jimenez & Loeb (2002), whose relative age evolution as a function of cosmic time directly probes H(z). We analyze the 4000 angstrom break (D4000) as a function of redshift, use stellar population synthesis models to theoretically calibrate the dependence of the differential age evolution on the differential D4000, and estimate the Hubble parameter taking into account both statistical and systematic errors. We provide 8 new measurements of H(z) (see table 4), and determine its change in H(z) to a precision of 5-12%, mapping homogeneously the redshift range up to z ~ 1.1; for the first time, we place a constraint on H(z) at z ≠ 0 with a precision comparable with the one achieved for the Hubble constant (about 5-6% at z ~ 0.2), and cover a redshift range (0.5 < z < 0.8) which is crucial to distinguish many different quintessence cosmologies. These measurements have been tested to best match a ΛCDM model, clearly providing a statistically robust indication that the Universe is undergoing an accelerated expansion. This method shows the potential to open a new avenue for constraining a variety of alternative cosmologies, especially when future surveys (e.g. Euclid) open the possibility to extend it up to z ~ 2.
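The chronometer relation behind these measurements is H(z) = −(1/(1+z)) dz/dt, so a differential age dt between galaxy populations in two nearby redshift bins yields H(z) directly, with no assumed cosmology. The bin edges and age difference below are illustrative numbers, not the paper's measurements.

```python
# Differential-age ("cosmic chronometer") estimate of H(z); inputs illustrative.
SEC_PER_GYR = 3.156e16   # seconds per gigayear
KM_PER_MPC  = 3.086e19   # kilometres per megaparsec

z1, z2 = 0.40, 0.48          # two nearby redshift bins (hypothetical)
age_diff_gyr = 0.6           # chronometers at z1 are ~0.6 Gyr older (hypothetical)

dz_dt = (z2 - z1) / (-age_diff_gyr * SEC_PER_GYR)   # cosmic time decreases with z
z_mid = 0.5 * (z1 + z2)
H_inv_s = -dz_dt / (1.0 + z_mid)                    # H(z) in 1/s
print(f"H(z~{z_mid:.2f}) ~ {H_inv_s * KM_PER_MPC:.0f} km/s/Mpc")
```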

Journal ArticleDOI
TL;DR: As climate models improve, decision-makers' expectations for accurate climate predictions grow; natural climate variability, however, limits climate predictability and hampers the ability to guide adaptation in many regions, such as North America.
Abstract: As climate models improve, decision-makers' expectations for accurate climate predictions are growing. Natural climate variability, however, limits climate predictability and hampers the ability to guide adaptation in many regions such as North America. Scientists, policymakers and the public need to improve communication and avoid raising expectations for accurate regional predictions everywhere.

Journal ArticleDOI
15 Mar 2012-Nature
TL;DR: In this paper, the authors exploit the unique tunability of a honeycomb optical lattice to adjust the effective mass of the Dirac fermions by breaking inversion symmetry and changing the lattice anisotropy.
Abstract: Dirac points are central to many phenomena in condensed-matter physics, from massless electrons in graphene to the emergence of conducting edge states in topological insulators. At a Dirac point, two energy bands intersect linearly and the electrons behave as relativistic Dirac fermions. In solids, the rigid structure of the material determines the mass and velocity of the electrons, as well as their interactions. A different, highly flexible means of studying condensed-matter phenomena is to create model systems using ultracold atoms trapped in the periodic potential of interfering laser beams. Here we report the creation of Dirac points with adjustable properties in a tunable honeycomb optical lattice. Using momentum-resolved interband transitions, we observe a minimum bandgap inside the Brillouin zone at the positions of the two Dirac points. We exploit the unique tunability of our lattice potential to adjust the effective mass of the Dirac fermions by breaking inversion symmetry. Moreover, changing the lattice anisotropy allows us to change the positions of the Dirac points inside the Brillouin zone. When the anisotropy exceeds a critical limit, the two Dirac points merge and annihilate each other, a situation that has recently attracted considerable theoretical interest but that is extremely challenging to observe in solids. We map out this topological transition in lattice parameter space and find excellent agreement with ab initio calculations. Our results not only pave the way to model materials in which the topology of the band structure is crucial, but also provide an avenue to exploring many-body phases resulting from the interplay of complex lattice geometries with interactions.
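The band-structure picture can be reproduced with the standard nearest-neighbour tight-binding model of the honeycomb lattice, whose two bands E(k) = ±|Σ_j t_j exp(i k·δ_j)| touch at two Dirac points and gap out once one hopping amplitude exceeds the sum of the other two. This is a sketch of the physics with the anisotropy put in by hand, not the paper's ab initio calculation.

```python
# Tight-binding honeycomb bands: Dirac points exist while the hopping
# amplitudes (t1, t2, t3) satisfy the triangle inequality.
import numpy as np

delta = np.array([[1.0, 0.0],
                  [-0.5,  np.sqrt(3) / 2],
                  [-0.5, -np.sqrt(3) / 2]])   # nearest-neighbour bond vectors

def min_gap(t, n=400):
    """Minimum of the band gap 2|f(k)| over a Brillouin-zone-covering grid."""
    ks = np.linspace(-2 * np.pi, 2 * np.pi, n)
    kx, ky = np.meshgrid(ks, ks)
    f = sum(tj * np.exp(1j * (kx * d[0] + ky * d[1])) for tj, d in zip(t, delta))
    return 2 * np.abs(f).min()

print(min_gap([1.0, 1.0, 1.0]))   # ~0 (up to grid resolution): Dirac points present
print(min_gap([1.0, 1.0, 2.5]))   # ~1: t3 > t1 + t2, Dirac points merged, gap opens
```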

Journal ArticleDOI
13 Dec 2012-Nature
TL;DR: A compact, broadband, semiconductor frequency comb generator that operates in the mid-infrared, and it is demonstrated that the modes of a continuous-wave, free-running, broadband quantum cascade laser are phase-locked.
Abstract: Optical frequency combs act as rulers in the frequency domain and have opened new avenues in many fields such as fundamental time metrology, spectroscopy and frequency synthesis. In particular, spectroscopy by means of optical frequency combs has surpassed the precision and speed of Fourier spectrometers. Such a spectroscopy technique is especially relevant for the mid-infrared range, where the fundamental rotational-vibrational bands of most light molecules are found. Most mid-infrared comb sources are based on down-conversion of near-infrared, mode-locked, ultrafast lasers using nonlinear crystals. Their use in frequency comb spectroscopy applications has resulted in an unequalled combination of spectral coverage, resolution and sensitivity. Another means of comb generation is pumping an ultrahigh-quality-factor microresonator with a continuous-wave laser. However, these combs depend on a chain of optical components, which limits their use. Therefore, to widen the spectroscopic applications of such mid-infrared combs, a more direct and compact generation scheme, using electrical injection, is preferable. Here we present a compact, broadband, semiconductor frequency comb generator that operates in the mid-infrared. We demonstrate that the modes of a continuous-wave, free-running, broadband quantum cascade laser are phase-locked. Combining mode proliferation based on four-wave mixing with gain provided by the quantum cascade laser leads to a phase relation similar to that of a frequency-modulated laser. The comb centre carrier wavelength is 7 micrometres. We identify a narrow drive current range with intermode beat linewidths narrower than 10 hertz. We find comb bandwidths of 4.4 per cent with an intermode stability of less than or equal to 200 hertz. The intermode beat can be varied over a frequency range of 65 kilohertz by radio-frequency injection. The large gain bandwidth and independent control over the carrier frequency offset and the mode spacing open the way to broadband, compact, all-solid-state mid-infrared spectrometers.

Journal ArticleDOI
16 Mar 2012-Science
TL;DR: It is shown that when analog climates are compared between regions, fewer than 15% of species have more than 10% of their invaded distribution outside their native climatic niche, revealing that substantial niche shifts are rare in terrestrial plant invaders.
Abstract: The assumption that climatic niche requirements of invasive species are conserved between their native and invaded ranges is key to predicting the risk of invasion. However, this assumption has been challenged recently by evidence of niche shifts in some species. Here, we report the first large-scale test of niche conservatism for 50 terrestrial plant invaders between Eurasia, North America, and Australia. We show that when analog climates are compared between regions, fewer than 15% of species have more than 10% of their invaded distribution outside their native climatic niche. These findings reveal that substantial niche shifts are rare in terrestrial plant invaders, providing support for an appropriate use of ecological niche models for the prediction of both biological invasions and responses to climate change.