
Showing papers by "ETH Zurich" published in 2011


Journal ArticleDOI
TL;DR: The Twentieth Century Reanalysis (20CR) dataset as discussed by the authors provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions.
Abstract: The Twentieth Century Reanalysis (20CR) project is an international effort to produce a comprehensive global atmospheric circulation dataset spanning the twentieth century, assimilating only surface pressure reports and using observed monthly sea-surface temperature and sea-ice distributions as boundary conditions. It is chiefly motivated by a need to provide an observational dataset with quantified uncertainties for validations of climate model simulations of the twentieth century on all time-scales, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter data assimilation method with background ‘first guess’ fields supplied by an ensemble of forecasts from a global numerical weather prediction model. This directly yields a global analysis every 6 hours as the most likely state of the atmosphere, and also an uncertainty estimate of that analysis. The 20CR dataset provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions. Intercomparisons with independent radiosonde data indicate that the reanalyses are generally of high quality. The quality in the extratropical Northern Hemisphere throughout the century is similar to that of current three-day operational NWP forecasts. Intercomparisons over the second half-century of these surface-based reanalyses with other reanalyses that also make use of upper-air and satellite data are equally encouraging. It is anticipated that the 20CR dataset will be a valuable resource to the climate research community for both model validations and diagnostic studies. Some surprising results are already evident. For instance, the long-term trends of indices representing the North Atlantic Oscillation, the tropical Pacific Walker Circulation, and the Pacific–North American pattern are weak or non-existent over the full period of record. The long-term trends of zonally averaged precipitation minus evaporation also differ in character from those in climate model simulations of the twentieth century. Copyright © 2011 Royal Meteorological Society and Crown Copyright.
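The assimilation step at the heart of 20CR is an Ensemble Kalman Filter update. Below is a minimal sketch of a stochastic EnKF analysis step in NumPy, for orientation only: the state dimension, ensemble size, linear observation operator and error covariances are toy placeholders, not the 20CR configuration.

```python
import numpy as np

def enkf_analysis(X_f, y, H, R, rng):
    """One stochastic EnKF analysis step.

    X_f : (n, N) forecast ensemble (n state variables, N members)
    y   : (m,)   observation vector (e.g. surface pressure reports)
    H   : (m, n) linear observation operator
    R   : (m, m) observation-error covariance
    """
    _, N = X_f.shape
    A = X_f - X_f.mean(axis=1, keepdims=True)        # ensemble anomalies
    P_f = A @ A.T / (N - 1)                           # sample forecast covariance
    S = H @ P_f @ H.T + R
    K = P_f @ H.T @ np.linalg.solve(S, np.eye(len(y)))   # Kalman gain
    # perturbed observations give each member its own update
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X_f + K @ (Y - H @ X_f)                    # analysis ensemble

rng = np.random.default_rng(0)
X_f = rng.normal(size=(10, 56))                       # toy: 10 variables, 56 members
H = np.zeros((3, 10)); H[[0, 1, 2], [0, 4, 9]] = 1.0  # observe 3 of the 10 variables
R = 0.5 * np.eye(3)
y = rng.normal(size=3)
X_a = enkf_analysis(X_f, y, H, R, rng)
print(X_a.mean(axis=1))   # analysis mean: most likely state; ensemble spread gives the uncertainty
```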

3,043 citations


Journal ArticleDOI
TL;DR: Otto Warburg's observations are re-examined in relation to the current concepts of cancer metabolism as being intimately linked to alterations of mitochondrial DNA, oncogenes and tumour suppressors, and thus readily exploitable for cancer therapy.
Abstract: Otto Warburg pioneered quantitative investigations of cancer cell metabolism, as well as photosynthesis and respiration. Warburg and co-workers showed in the 1920s that, under aerobic conditions, tumour tissues metabolize approximately tenfold more glucose to lactate in a given time than normal tissues, a phenomenon known as the Warburg effect. However, this increase in aerobic glycolysis in cancer cells is often erroneously thought to occur instead of mitochondrial respiration and has been misinterpreted as evidence for damage to respiration instead of damage to the regulation of glycolysis. In fact, many cancers exhibit the Warburg effect while retaining mitochondrial respiration. We re-examine Warburg's observations in relation to the current concepts of cancer metabolism as being intimately linked to alterations of mitochondrial DNA, oncogenes and tumour suppressors, and thus readily exploitable for cancer therapy.

2,312 citations


Journal ArticleDOI
20 Oct 2011-Nature
TL;DR: It is found that biodiversity values were substantially lower in degraded forests, but that this varied considerably by geographic region, taxonomic group, ecological metric and disturbance type.
Abstract: Human-driven land-use changes increasingly threaten biodiversity, particularly in tropical forests where both species diversity and human pressures on natural environments are high. The rapid conversion of tropical forests for agriculture, timber production and other uses has generated vast, human-dominated landscapes with potentially dire consequences for tropical biodiversity. Today, few truly undisturbed tropical forests exist, whereas those degraded by repeated logging and fires, as well as secondary and plantation forests, are rapidly expanding. Here we provide a global assessment of the impact of disturbance and land conversion on biodiversity in tropical forests using a meta-analysis of 138 studies. We analysed 2,220 pairwise comparisons of biodiversity values in primary forests (with little or no human disturbance) and disturbed forests. We found that biodiversity values were substantially lower in degraded forests, but that this varied considerably by geographic region, taxonomic group, ecological metric and disturbance type. Even after partly accounting for confounding colonization and succession effects due to the composition of surrounding habitats, isolation and time since disturbance, we find that most forms of forest degradation have an overwhelmingly detrimental effect on tropical biodiversity. Our results clearly indicate that when it comes to maintaining tropical biodiversity, there is no substitute for primary forests.

1,640 citations


Journal ArticleDOI
TL;DR: This paper presents HypE, a hypervolume estimation algorithm for multi-objective optimization, by which the accuracy of the estimates and the available computing resources can be traded off; thereby, not only do many-objective problems become feasible with hypervolume-based search, but also the runtime can be flexibly adapted.
Abstract: In the field of evolutionary multi-criterion optimization, the hypervolume indicator is the only single set quality measure that is known to be strictly monotonic with regard to Pareto dominance: whenever a Pareto set approximation entirely dominates another one, then the indicator value of the dominant set will also be better. This property is of high interest and relevance for problems involving a large number of objective functions. However, the high computational effort required for hypervolume calculation has so far prevented the full exploitation of this indicator's potential; current hypervolume-based search algorithms are limited to problems with only a few objectives. This paper addresses this issue and proposes a fast search algorithm that uses Monte Carlo simulation to approximate the exact hypervolume values. The main idea is that the actual indicator values are not important; rather, it is the rankings of solutions induced by the hypervolume indicator that matter. In detail, we present HypE, a hypervolume estimation algorithm for multi-objective optimization, by which the accuracy of the estimates and the available computing resources can be traded off; thereby, not only do many-objective problems become feasible with hypervolume-based search, but also the runtime can be flexibly adapted. Moreover, we show how the same principle can be used to statistically compare the outcomes of different multi-objective optimizers with respect to the hypervolume; so far, statistical testing has been restricted to scenarios with few objectives. The experimental results indicate that HypE is highly effective for many-objective problems in comparison to existing multi-objective evolutionary algorithms. HypE is available for download at http://www.tik.ee.ethz.ch/sop/download/supplementary/hype/ .
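For orientation, a minimal sketch of the Monte Carlo idea behind hypervolume estimation: sample points uniformly in a box bounded by a reference point and count the fraction dominated by the Pareto set approximation (minimization assumed). This is not the authors' HypE implementation, which additionally turns such estimates into a ranking used during search.

```python
import numpy as np

def mc_hypervolume(front, ref_point, n_samples=100_000, rng=None):
    """Estimate the hypervolume dominated by `front` w.r.t. `ref_point`.

    front     : (k, d) array of objective vectors (minimization assumed)
    ref_point : (d,)   reference point bounding the region of interest
    """
    rng = rng or np.random.default_rng()
    front = np.asarray(front, dtype=float)
    ref = np.asarray(ref_point, dtype=float)
    lower = front.min(axis=0)                       # box enclosing the dominated region
    samples = rng.uniform(lower, ref, size=(n_samples, front.shape[1]))
    # a sample is dominated if some front member is <= it in every objective
    dominated = (front[None, :, :] <= samples[:, None, :]).all(axis=2).any(axis=1)
    return dominated.mean() * np.prod(ref - lower)

front = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]])
print(mc_hypervolume(front, ref_point=[5.0, 5.0]))   # exact hypervolume here is 11.0
```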

1,560 citations


Journal ArticleDOI
22 Dec 2011-Nature
TL;DR: A new generation of amide-forming reactions is reviewed, along with their potential application to current synthetic challenges, including the development of catalytic amide formation, the synthesis of therapeutic peptides and the preparation of modified peptides and proteins.
Abstract: One of the most important reactions in organic chemistry--amide bond formation--is often overlooked as a contemporary challenge because of the widespread occurrence of amides in modern pharmaceuticals and biologically active compounds. But existing methods are reaching their inherent limits, and concerns about their waste and expense are becoming sharper. Novel chemical approaches to amide formation are therefore being developed. Here we review and summarize a new generation of amide-forming reactions that may contribute to solving these problems. We also consider their potential application to current synthetic challenges, including the development of catalytic amide formation, the synthesis of therapeutic peptides and the preparation of modified peptides and proteins.

1,462 citations


Journal ArticleDOI
TL;DR: In this article, the authors presented new reference values for the NIST SRM 610-617 glasses following ISO guidelines and the International Association of Geoanalysts' protocol, and determined quantitatively possible element inhomogeneities using different test portion masses of 1, 0.1 and 0.02 μg.
Abstract: We present new reference values for the NIST SRM 610–617 glasses following ISO guidelines and the International Association of Geoanalysts’ protocol. Uncertainties at the 95% confidence level (CL) have been determined for bulk- and micro-analytical purposes. In contrast to former compilation procedures, this approach delivers data that consider present-day requirements of data quality. New analytical data and the nearly complete data set of the GeoReM database were used for this study. Data quality was checked by the application of the Horwitz function and by a careful investigation of analytical procedures. We have determined quantitatively possible element inhomogeneities using different test portion masses of 1, 0.1 and 0.02 μg. Although avoiding the rim region of the glass wafers, we found moderate inhomogeneities of several chalcophile/siderophile elements and gross inhomogeneities of Ni, Se, Pd and Pt at small test portion masses. The extent of inhomogeneity was included in the determination of uncertainties. While the new reference values agree with the NIST certified values with the one exception of Mn in SRM 610, they typically differ by as much as 10% from the Pearce et al. (1997) values in current use. In a few cases (P, S, Cl, Ta, Re) the discrepancies are even higher.
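The Horwitz function mentioned above predicts the between-laboratory relative standard deviation expected at a given analyte concentration and is commonly used, as here, to screen compiled data. A minimal sketch of the commonly quoted form, with the concentration expressed as a dimensionless mass fraction:

```python
import math

def horwitz_rsd_percent(mass_fraction):
    """Predicted between-laboratory RSD (%) from the Horwitz function.

    mass_fraction : analyte concentration as a dimensionless mass fraction
                    (e.g. 1 mg/kg -> 1e-6).
    """
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

for c in (1e-2, 1e-4, 1e-6):   # 1%, 100 mg/kg, 1 mg/kg
    print(f"C = {c:.0e}: expected RSD ~ {horwitz_rsd_percent(c):.1f}%")
# the predicted RSD doubles for every hundredfold dilution (4%, 8%, 16%)
```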

1,388 citations


Journal ArticleDOI
TL;DR: Visual odometry is the process of estimating the egomotion of an agent (e.g., vehicle, human, or robot) using only the input of a single camera or of multiple cameras attached to it; application domains include robotics, wearable computing, augmented reality, and automotive.
Abstract: Visual odometry (VO) is the process of estimating the egomotion of an agent (e.g., vehicle, human, or robot) using only the input of a single camera or of multiple cameras attached to it. Application domains include robotics, wearable computing, augmented reality, and automotive. The term VO was coined in 2004 by Nister in his landmark paper. The term was chosen for its similarity to wheel odometry, which incrementally estimates the motion of a vehicle by integrating the number of turns of its wheels over time. Likewise, VO operates by incrementally estimating the pose of the vehicle through examination of the changes that motion induces on the images of its onboard cameras. For VO to work effectively, there should be sufficient illumination in the environment and a static scene with enough texture to allow apparent motion to be extracted. Furthermore, consecutive frames should be captured by ensuring that they have sufficient scene overlap.
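As a rough illustration of the incremental pipeline described above, the sketch below chains OpenCV feature matching, essential-matrix estimation and pose recovery over consecutive frames. The frame paths, camera intrinsics and the unit-scale translation are assumptions made for the example; monocular VO recovers translation only up to scale, and real systems add keyframing, outlier handling and scale estimation.

```python
import cv2
import numpy as np

K = np.array([[718.856, 0.0, 607.193],     # illustrative camera intrinsics
              [0.0, 718.856, 185.216],
              [0.0, 0.0, 1.0]])

orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_pose(img_prev, img_curr):
    """Estimate rotation and unit-scale translation between two consecutive frames."""
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts2, pts1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts2, pts1, K, mask=mask)
    return R, t

# incremental pose concatenation over a hypothetical image sequence
pose_R, pose_t = np.eye(3), np.zeros((3, 1))
frames = [cv2.imread(f"frame_{i:04d}.png", cv2.IMREAD_GRAYSCALE) for i in range(3)]
for prev, curr in zip(frames, frames[1:]):
    R, t = relative_pose(prev, curr)
    pose_t = pose_t + pose_R @ t      # unit scale assumed; conventions for R, t vary
    pose_R = R @ pose_R
    print(pose_t.ravel())
```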

1,371 citations


Journal ArticleDOI
K. Abe, N. Abgrall, Yasuo Ajima, Hiroaki Aihara, +413 more (53 institutions)
TL;DR: The T2K experiment observes indications of ν_μ → ν_e appearance in data accumulated with 1.43×10²⁰ protons on target; under this hypothesis, the probability to observe six or more candidate events is 7×10⁻³, equivalent to 2.5σ significance.
Abstract: The T2K experiment observes indications of ν_μ → ν_e appearance in data accumulated with 1.43 × 10²⁰ protons on target. Six events pass all selection criteria at the far detector. In a three-flavor neutrino oscillation scenario with |Δm²₂₃| = 2.4 × 10⁻³ eV², sin²2θ₂₃ = 1 and sin²2θ₁₃ = 0, the expected number of such events is 1.5 ± 0.3 (syst.). Under this hypothesis, the probability to observe six or more candidate events is 7 × 10⁻³, equivalent to 2.5σ significance. At 90% C.L., the data are consistent with 0.03 (0.04) < sin²2θ₁₃ < 0.28 (0.34) for δ_CP = 0 and a normal (inverted) hierarchy.
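For orientation, these parameters enter the standard leading-order (vacuum, CP-conserving) approximation of the ν_μ → ν_e appearance probability; the sketch below evaluates it for T2K-like numbers. The value of sin²2θ₁₃ used is only an illustrative point inside the quoted 90% C.L. interval, and the actual T2K analysis is more detailed.

```python
import math

def p_numu_to_nue(sin2_2theta13, sin2_theta23, dm2_eV2, L_km, E_GeV):
    """Leading-order nu_mu -> nu_e appearance probability (vacuum, no CP phase)."""
    return sin2_theta23 * sin2_2theta13 * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# T2K-like numbers: L = 295 km, E ~ 0.6 GeV, |dm^2| = 2.4e-3 eV^2, sin^2(2theta23) = 1
print(p_numu_to_nue(sin2_2theta13=0.1,    # illustrative value inside the quoted range
                    sin2_theta23=0.5,
                    dm2_eV2=2.4e-3,
                    L_km=295.0,
                    E_GeV=0.6))           # roughly 5%, near the oscillation maximum
```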

1,361 citations


Journal ArticleDOI
12 Aug 2011-Science
TL;DR: It is concluded that, unlike many other mutualisms, the symbiont cannot be “enslaved,” and the mutualism is evolutionarily stable because control is bidirectional, and partners offering the best rate of exchange are rewarded.
Abstract: Plants and their arbuscular mycorrhizal fungal symbionts interact in complex underground networks involving multiple partners. This increases the potential for exploitation and defection by individuals, raising the question of how partners maintain a fair, two-way transfer of resources. We manipulated cooperation in plants and fungal partners to show that plants can detect, discriminate, and reward the best fungal partners with more carbohydrates. In turn, their fungal partners enforce cooperation by increasing nutrient transfer only to those roots providing more carbohydrates. On the basis of these observations we conclude that, unlike many other mutualisms, the symbiont cannot be "enslaved." Rather, the mutualism is evolutionarily stable because control is bidirectional, and partners offering the best rate of exchange are rewarded.

1,346 citations


Journal ArticleDOI
TL;DR: Key pH regulators in tumour cells include: isoforms 2, 9 and 12 of carbonic anhydrase, isoforms of anion exchangers, Na+/HCO3− co-transporters, Na+/H+ exchangers, monocarboxylate transporters and the vacuolar ATPase.
Abstract: The high metabolic rate of tumours often leads to acidosis and hypoxia in poorly perfused regions. Tumour cells have thus evolved the ability to function in a more acidic environment than normal cells. Key pH regulators in tumour cells include: isoforms 2, 9 and 12 of carbonic anhydrase, isoforms of anion exchangers, Na+/HCO3- co-transporters, Na+/H+ exchangers, monocarboxylate transporters and the vacuolar ATPase. Both small molecules and antibodies targeting these pH regulators are currently at various stages of clinical development. These antitumour mechanisms are not exploited by the classical cancer drugs and therefore represent a new anticancer drug discovery strategy.

1,331 citations


Journal ArticleDOI
08 Apr 2011-Science
TL;DR: Evidence is provided that the anomalous 2010 warmth that caused adverse impacts exceeded the amplitude and spatial extent of the previous hottest summer of 2003, which likely broke the 500-year-long seasonal temperature records over approximately 50% of Europe.
Abstract: The summer of 2010 was exceptionally warm in eastern Europe and large parts of Russia. We provide evidence that the anomalous 2010 warmth that caused adverse impacts exceeded the amplitude and spatial extent of the previous hottest summer of 2003. "Mega-heatwaves" such as the 2003 and 2010 events likely broke the 500-year-long seasonal temperature records over approximately 50% of Europe. According to regional multi-model experiments, the probability of a summer experiencing mega-heatwaves will increase by a factor of 5 to 10 within the next 40 years. However, the magnitude of the 2010 event was so extreme that despite this increase, the likelihood of an analog over the same region remains fairly low until the second half of the 21st century.

Journal ArticleDOI
TL;DR: This review describes a multidimensional treatment of molecular recognition phenomena involving aromatic rings in chemical and biological systems that facilitates the development of new advanced materials and supramolecular systems, and should inspire further utilization of interactions with aromatic rings to control the stereochemical outcome of synthetic transformations.
Abstract: This review describes a multidimensional treatment of molecular recognition phenomena involving aromatic rings in chemical and biological systems. It summarizes new results reported since the appearance of an earlier review in 2003 in host-guest chemistry, biological affinity assays and biostructural analysis, data base mining in the Cambridge Structural Database (CSD) and the Protein Data Bank (PDB), and advanced computational studies. Topics addressed are arene-arene, perfluoroarene-arene, S⋅⋅⋅aromatic, cation-π, and anion-π interactions, as well as hydrogen bonding to π systems. The generated knowledge benefits, in particular, structure-based hit-to-lead development and lead optimization both in the pharmaceutical and in the crop protection industry. It equally facilitates the development of new advanced materials and supramolecular systems, and should inspire further utilization of interactions with aromatic rings to control the stereochemical outcome of synthetic transformations.

Journal ArticleDOI
12 Aug 2011-Science
TL;DR: An antibody able to broadly neutralize both group 1 and group 2 influenza A viruses—and its target epitope—are identified and may be used for passive protection and to inform vaccine design because of its broad specificity and neutralization potency.
Abstract: The isolation of broadly neutralizing antibodies against influenza A viruses has been a long-sought goal for therapeutic approaches and vaccine design. Using a single-cell culture method for screening large numbers of human plasma cells, we isolated a neutralizing monoclonal antibody that recognized the hemagglutinin (HA) glycoprotein of all 16 subtypes and neutralized both group 1 and group 2 influenza A viruses. Passive transfer of this antibody conferred protection to mice and ferrets. Complexes with HAs from the group 1 H1 and the group 2 H3 subtypes analyzed by x-ray crystallography showed that the antibody bound to a conserved epitope in the F subdomain. This antibody may be used for passive protection and to inform vaccine design because of its broad specificity and neutralization potency.

Journal ArticleDOI
TL;DR: In this paper, the continuous-time quantum Monte Carlo (QMC) algorithm is used to solve the local correlation problem in quantum impurity models with high and low energy scales and is effective for wide classes of physically realistic models.
Abstract: Quantum impurity models describe an atom or molecule embedded in a host material with which it can exchange electrons. They are basic to nanoscience as representations of quantum dots and molecular conductors and play an increasingly important role in the theory of "correlated electron" materials as auxiliary problems whose solution gives the "dynamical mean field" approximation to the self energy and local correlation functions. These applications require a method of solution which provides access to both high and low energy scales and is effective for wide classes of physically realistic models. The continuous-time quantum Monte Carlo algorithms reviewed in this article meet this challenge. We present derivations and descriptions of the algorithms in enough detail to allow other workers to write their own implementations, discuss the strengths and weaknesses of the methods, summarize the problems to which the new methods have been successfully applied and outline prospects for future applications.
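Schematically, the continuous-time methods reviewed there all start from an expansion of the partition function in one part V of the Hamiltonian (the interaction or the hybridization terms, depending on the flavour of the algorithm), written with H = H₀ + V and time-ordered imaginary-time integrals; Monte Carlo then samples the expansion orders and integration times. A generic form of that starting point, in textbook notation rather than a quote from the review, is:

```latex
Z = \operatorname{Tr} e^{-\beta H}
  = \sum_{k=0}^{\infty} (-1)^{k}
    \int_{0}^{\beta} \! d\tau_{1} \int_{\tau_{1}}^{\beta} \! d\tau_{2} \cdots
    \int_{\tau_{k-1}}^{\beta} \! d\tau_{k} \;
    \operatorname{Tr}\!\left[ e^{-\beta H_{0}} \, V(\tau_{k}) \cdots V(\tau_{1}) \right],
\qquad V(\tau) = e^{\tau H_{0}} V e^{-\tau H_{0}}
```

Each term of the sum defines the weight of a configuration {τ₁, ..., τ_k}, and the algorithms differ mainly in which operators play the role of V and in how the resulting traces and determinants are evaluated.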

01 Aug 2011
TL;DR: A generic objectness measure is presented, quantifying how likely it is for an image window to contain an object of any class; objectness is also used as a complementary score in addition to the class-specific model, which leads to fewer false positives.
Abstract: We present a generic objectness measure, quantifying how likely it is for an image window to contain an object of any class. We explicitly train it to distinguish objects with a well-defined boundary in space, such as cows and telephones, from amorphous background elements, such as grass and road. The measure combines in a Bayesian framework several image cues measuring characteristics of objects, such as appearing different from their surroundings and having a closed boundary. These include an innovative cue to measure the closed boundary characteristic. In experiments on the challenging PASCAL VOC 07 dataset, we show this new cue to outperform a state-of-the-art saliency measure, and the combined objectness measure to perform better than any cue alone. We also compare to interest point operators, a HOG detector, and three recent works aiming at automatic object segmentation. Finally, we present two applications of objectness. In the first, we sample a small number of windows according to their objectness probability and give an algorithm to employ them as location priors for modern class-specific object detectors. As we show experimentally, this greatly reduces the number of windows evaluated by the expensive class-specific model. In the second application, we use objectness as a complementary score in addition to the class-specific model, which leads to fewer false positives. As shown in several recent papers, objectness can act as a valuable focus of attention mechanism in many other applications operating on image windows, including weakly supervised learning of object categories, unsupervised pixelwise segmentation, and object tracking in video. Computing objectness is very efficient and takes only about 4 sec. per image.
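As a generic illustration of combining per-window cues in a Bayesian (naive-Bayes) way, here is a small sketch; the cue names, likelihood tables and independence assumption are made up for the example and are not the paper's learned model.

```python
def objectness_posterior(cue_values, likelihoods, prior_obj=0.1):
    """Combine per-window cue observations into P(object | cues) with naive Bayes.

    cue_values  : dict cue_name -> observed label for this window
    likelihoods : dict cue_name -> {"obj": {label: p}, "bg": {label: p}}
    """
    p_obj, p_bg = prior_obj, 1.0 - prior_obj
    for cue, value in cue_values.items():
        p_obj *= likelihoods[cue]["obj"][value]
        p_bg *= likelihoods[cue]["bg"][value]
    return p_obj / (p_obj + p_bg)

# illustrative likelihood tables for two cues (e.g. colour contrast, closed boundary)
likelihoods = {
    "color_contrast":  {"obj": {"high": 0.7, "low": 0.3}, "bg": {"high": 0.2, "low": 0.8}},
    "closed_boundary": {"obj": {"yes": 0.8, "no": 0.2},   "bg": {"yes": 0.1, "no": 0.9}},
}
print(objectness_posterior({"color_contrast": "high", "closed_boundary": "yes"}, likelihoods))
```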

Journal ArticleDOI
Marcos Daniel Actis, G. Agnetta, Felix Aharonian, A. G. Akhperjanian, +682 more (109 institutions)
TL;DR: Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes; as described in this paper, CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and the extension to energies well below 100 GeV and above 100 TeV.
Abstract: Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

Journal ArticleDOI
TL;DR: In this paper, the authors upscaled FLUXNET observations of carbon dioxide, water, and energy fluxes to the global scale using the machine learning technique, model tree ensembles (MTE), to predict site-level gross primary productivity (GPP), terrestrial ecosystem respiration (TER), net ecosystem exchange (NEE), latent energy (LE), and sensible heat (H) based on remote sensing indices, climate and meteorological data, and information on land use.
Abstract: We upscaled FLUXNET observations of carbon dioxide, water, and energy fluxes to the global scale using the machine learning technique, model tree ensembles (MTE). We trained MTE to predict site-level gross primary productivity (GPP), terrestrial ecosystem respiration (TER), net ecosystem exchange (NEE), latent energy (LE), and sensible heat (H) based on remote sensing indices, climate and meteorological data, and information on land use. We applied the trained MTEs to generate global flux fields at a 0.5° × 0.5° spatial resolution and a monthly temporal resolution from 1982 to 2008. Cross-validation analyses revealed good performance of MTE in predicting among-site flux variability with modeling efficiencies (MEf) between 0.64 and 0.84, except for NEE (MEf = 0.32). Performance was also good for predicting seasonal patterns (MEf between 0.84 and 0.89, except for NEE (0.64)). By comparison, predictions of monthly anomalies were not as strong (MEf between 0.29 and 0.52). Improved accounting of disturbance and lagged environmental effects, along with improved characterization of errors in the training data set, would contribute most to further reducing uncertainties. Our global estimates of LE (158 ± 7 × 10¹⁸ J yr⁻¹), H (164 ± 15 × 10¹⁸ J yr⁻¹), and GPP (119 ± 6 Pg C yr⁻¹) were similar to independent estimates. Our global TER estimate (96 ± 6 Pg C yr⁻¹) was likely underestimated by 5-10%. Hot spot regions of interannual variability in carbon fluxes occurred in semiarid to semihumid regions and were controlled by moisture supply. Overall, GPP was more important to interannual variability in NEE than TER. Our empirically derived fluxes may be used for calibration and evaluation of land surface process models and for exploratory and diagnostic assessments of the biosphere.
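Model tree ensembles are not a standard library component; purely to illustrate the upscaling workflow described above (train on site-level predictors, then apply the trained model to gridded inputs), the sketch below substitutes a random forest regressor and uses entirely hypothetical data and column names.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# hypothetical site-level training table: predictors plus observed GPP
sites = pd.DataFrame({
    "fpar":     rng.uniform(0.1, 0.9, 300),   # remote sensing index
    "air_temp": rng.uniform(-5, 30, 300),     # meteorology
    "precip":   rng.uniform(0, 300, 300),
    "land_use": rng.integers(0, 4, 300),
})
sites["gpp"] = 2.0 * sites["fpar"] + 0.05 * sites["air_temp"] + rng.normal(0, 0.2, 300)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(sites.drop(columns="gpp"), sites["gpp"])

# "upscaling": apply the trained model to every cell of a gridded predictor table
grid = pd.DataFrame({
    "fpar":     rng.uniform(0.1, 0.9, 10),
    "air_temp": rng.uniform(-5, 30, 10),
    "precip":   rng.uniform(0, 300, 10),
    "land_use": rng.integers(0, 4, 10),
})
grid["gpp_pred"] = model.predict(grid)
print(grid.head())
```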

Journal ArticleDOI
TL;DR: In this article, the authors examine the driving forces behind farmers' decisions to adapt to climate change and the impact of adaptation on farmers' food production, and investigate whether there are differences in the food production functions of farm households that adapted and those that did not adapt.
Abstract: We examine the driving forces behind farmers’ decisions to adapt to climate change, and the impact of adaptation on farmers’ food production. We investigate whether there are differences in the food production functions of farm households that adapted and those that did not adapt. We estimate a simultaneous equations model with endogenous switching to account for the heterogeneity in the decision to adapt or not, and for unobservable characteristics of farmers and their farm. We compare the expected food production under the actual and counterfactual cases that the farm household adapted or not to climate change. We find that the group of farm households that adapted has systematically different characteristics than the group of farm households that did not adapt. The relationship between production and average temperature is inverted U-shaped for farm households that adapted, while it is U-shaped for farm households that did not adapt, and vice versa in the case of precipitation. We find that adaptation increases food production; however, the impact of adaptation on food production is smaller for the farm households that actually did adapt than for the farm households that did not adapt in the counterfactual case that they adapted.
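In its textbook form (generic notation, not necessarily the authors' exact specification), an endogenous switching regression combines a selection equation for adaptation with two outcome regimes whose errors may correlate with the selection error; comparing the observed and counterfactual conditional expectations for the adapters gives the impact of adaptation on their food production:

```latex
A_i^{*} = Z_i \gamma + u_i, \qquad A_i = \mathbf{1}\!\left[ A_i^{*} > 0 \right] \\
Y_{1i} = X_i \beta_1 + \varepsilon_{1i} \;\; \text{if } A_i = 1, \qquad
Y_{0i} = X_i \beta_0 + \varepsilon_{0i} \;\; \text{if } A_i = 0 \\
E\!\left[ Y_{1i} \mid A_i = 1 \right] = X_i \beta_1 + \sigma_1 \rho_1 \, \frac{\phi(Z_i \gamma)}{\Phi(Z_i \gamma)}, \qquad
E\!\left[ Y_{0i} \mid A_i = 1 \right] = X_i \beta_0 + \sigma_0 \rho_0 \, \frac{\phi(Z_i \gamma)}{\Phi(Z_i \gamma)}
```

Here u has unit variance, ρ_j is the correlation between ε_j and u, and φ/Φ is the inverse Mills ratio; the difference between the two conditional expectations is the expected effect of adaptation for the households that actually adapted.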

Journal ArticleDOI
30 Jun 2011-Nature
TL;DR: It is shown that the expression of microRNAs 103 and 107 (miR-103/107) is upregulated in obese mice and that caveolin-1, a critical regulator of the insulin receptor, is a direct target gene of miR-103/107, identifying a new target for the treatment of type 2 diabetes and obesity.
Abstract: Defects in insulin signalling are among the most common and earliest defects that predispose an individual to the development of type 2 diabetes. MicroRNAs have been identified as a new class of regulatory molecules that influence many biological functions, including metabolism. However, the direct regulation of insulin sensitivity by microRNAs in vivo has not been demonstrated. Here we show that the expression of microRNAs 103 and 107 (miR-103/107) is upregulated in obese mice. Silencing of miR-103/107 leads to improved glucose homeostasis and insulin sensitivity. In contrast, gain of miR-103/107 function in either liver or fat is sufficient to induce impaired glucose homeostasis. We identify caveolin-1, a critical regulator of the insulin receptor, as a direct target gene of miR-103/107. We demonstrate that caveolin-1 is upregulated upon miR-103/107 inactivation in adipocytes and that this is concomitant with stabilization of the insulin receptor, enhanced insulin signalling, decreased adipocyte size and enhanced insulin-stimulated glucose uptake. These findings demonstrate the central importance of miR-103/107 to insulin sensitivity and identify a new target for the treatment of type 2 diabetes and obesity.

Journal ArticleDOI
TL;DR: This work demonstrates by experimental evidence that even mild social influence can undermine the wisdom of crowd effect in simple estimation tasks.
Abstract: Social groups can be remarkably smart and knowledgeable when their averaged judgements are compared with the judgements of individuals. Already Galton [Galton F (1907) Nature 75:7] found evidence that the median estimate of a group can be more accurate than estimates of experts. This wisdom of crowd effect was recently supported by examples from stock markets, political elections, and quiz shows [Surowiecki J (2004) The Wisdom of Crowds]. In contrast, we demonstrate by experimental evidence (N = 144) that even mild social influence can undermine the wisdom of crowd effect in simple estimation tasks. In the experiment, subjects could reconsider their response to factual questions after having received average or full information of the responses of other subjects. We compare subjects’ convergence of estimates and improvements in accuracy over five consecutive estimation periods with a control condition, in which no information about others’ responses was provided. Although groups are initially “wise,” knowledge about estimates of others narrows the diversity of opinions to such an extent that it undermines the wisdom of crowd effect in three different ways. The “social influence effect” diminishes the diversity of the crowd without improvements of its collective error. The “range reduction effect” moves the position of the truth to peripheral regions of the range of estimates so that the crowd becomes less reliable in providing expertise for external observers. The “confidence effect” boosts individuals’ confidence after convergence of their estimates despite lack of improved accuracy. Examples of the revealed mechanism range from misled elites to the recent global financial crisis.

Journal ArticleDOI
Christian Franck
TL;DR: In this article, the authors summarize the literature of the last two decades on technology areas that are relevant to HVDC breakers; by comparing the mainly 20+ year old, state-of-the-art HVDC CBs to the new HVDC technology, existing discrepancies become evident.
Abstract: The continuously increasing demand for electric power and the economic access to remote renewable energy sources such as off-shore wind power or solar thermal generation in deserts have revived the interest in high-voltage direct current (HVDC) multiterminal systems (networks). A lot of work was done in this area, especially in the 1980s, but only two three-terminal systems were realized. Since then, HVDC technology has advanced considerably and, despite numerous technical challenges, the realization of large-scale HVDC networks is now seriously discussed and considered. For the acceptance and reliability of these networks, the availability of HVDC circuit breakers (CBs) will be critical, making them one of the key enabling technologies. Numerous ideas for HVDC breaker schemes have been published and patented, but no acceptable solution has been found to interrupt HVDC short-circuit currents. This paper aims to summarize the literature, especially that of the last two decades, on technology areas that are relevant to HVDC breakers. By comparing the mainly 20+ year old, state-of-the-art HVDC CBs to the new HVDC technology, existing discrepancies become evident. Areas where additional research and development are needed are identified and proposed.

Journal ArticleDOI
TL;DR: It is shown that with small changes in the network structure (low cost) the robustness of diverse networks can be improved dramatically whereas their functionality remains unchanged; the approach is useful not only for improving the robustness of existing infrastructures significantly and at low cost, but also for designing economically robust network systems.
Abstract: Terrorist attacks on transportation networks have traumatized modern societies. With a single blast, it has become possible to paralyze airline traffic, electric power supply, ground transportation or Internet communication. How and at which cost can one restructure the network such that it will become more robust against a malicious attack? We introduce a new measure for robustness and use it to devise a method to mitigate economically and efficiently this risk. We demonstrate its efficiency on the European electricity system and on the Internet as well as on complex network models. We show that with small changes in the network structure (low cost) the robustness of diverse networks can be improved dramatically whereas their functionality remains unchanged. Our results are useful not only for improving the robustness of existing infrastructures significantly and at low cost but also for designing economically robust network systems.
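A small sketch of the kind of robustness measure used in this line of work: the size of the largest connected component, averaged over the whole course of a targeted highest-degree attack (so the value lies between 0 and roughly 0.5). The networkx graph below is a toy example, not the European grid or Internet data.

```python
import networkx as nx

def robustness(G):
    """R = (1/N) * sum over attack steps of the largest-component fraction,
    for a targeted attack that always deletes the currently highest-degree node."""
    G = G.copy()
    N = G.number_of_nodes()
    total = 0.0
    for _ in range(N):
        node = max(G.degree, key=lambda kv: kv[1])[0]   # highest remaining degree
        G.remove_node(node)
        largest = max((len(c) for c in nx.connected_components(G)), default=0)
        total += largest / N
    return total / N

G = nx.barabasi_albert_graph(200, 2, seed=42)
print(f"robustness R = {robustness(G):.3f}")   # at most about 0.5 by construction
```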

Journal ArticleDOI
TL;DR: This work provides a quantitative description of the proteome of a commonly used human cell line in two functional states, interphase and mitosis, and shows that these human cultured cells express at least ∼10 000 proteins and that the quantified proteins span a concentration range of seven orders of magnitude up to 20 000 000 copies per cell.
Abstract: The generation of mathematical models of biological processes, the simulation of these processes under different conditions, and the comparison and integration of multiple data sets are explicit goals of systems biology that require the knowledge of the absolute quantity of the system's components. To date, systematic estimates of cellular protein concentrations have been exceptionally scarce. Here, we provide a quantitative description of the proteome of a commonly used human cell line in two functional states, interphase and mitosis. We show that these human cultured cells express at least ∼10 000 proteins and that the quantified proteins span a concentration range of seven orders of magnitude up to 20 000 000 copies per cell. We discuss how protein abundance is linked to function and evolution.

Journal ArticleDOI
S. Chatrchyan, Vardan Khachatryan, Albert M. Sirunyan, A. Tumasyan, +2268 more (158 institutions)
TL;DR: In this article, the transverse momentum balance in dijet and γ/Z+jets events is used to measure the jet energy response in the CMS detector, as well as the transverse momentum resolution.
Abstract: Measurements of the jet energy calibration and transverse momentum resolution in CMS are presented, performed with a data sample collected in proton-proton collisions at a centre-of-mass energy of 7 TeV, corresponding to an integrated luminosity of 36 pb⁻¹. The transverse momentum balance in dijet and γ/Z+jets events is used to measure the jet energy response in the CMS detector, as well as the transverse momentum resolution. The results are presented for three different methods to reconstruct jets: a calorimeter-based approach, the "Jet-Plus-Track" approach, which improves the measurement of calorimeter jets by exploiting the associated tracks, and the "Particle Flow" approach, which attempts to reconstruct individually each particle in the event, prior to the jet clustering, based on information from all relevant subdetectors.
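The dijet balance technique referred to above is conventionally expressed through the transverse-momentum asymmetry of the two leading jets; in its simplest form (a standard textbook relation, not a quote of the paper's exact definitions):

```latex
A = \frac{p_{\mathrm{T}}^{\text{jet 1}} - p_{\mathrm{T}}^{\text{jet 2}}}
         {p_{\mathrm{T}}^{\text{jet 1}} + p_{\mathrm{T}}^{\text{jet 2}}},
\qquad
\frac{\sigma(p_{\mathrm{T}})}{p_{\mathrm{T}}} \approx \sqrt{2}\,\sigma_{A}
```

Here σ_A is the measured width of the asymmetry distribution for two jets of comparable true transverse momentum and assumed equal resolution, so the fractional jet pT resolution follows directly from it.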

Journal ArticleDOI
TL;DR: In this article, the authors introduce a new measure of Social Value Orientation (SVO), the magnitude of the concern people have for others, and provide evidence of its solid psychometric properties.
Abstract: Narrow self-interest is often used as a simplifying assumption when studying people making decisions in social contexts. Nonetheless, people exhibit a wide range of different motivations when choosing unilaterally among interdependent outcomes. Measuring the magnitude of the concern people have for others, sometimes called Social Value Orientation (SVO), has been an interest of many social scientists for decades, and several different measurement methods have been developed so far. Here we introduce a new measure of SVO that has several advantages over existing methods. A detailed description of the new measurement method is presented, along with norming data that provide evidence of its solid psychometric properties. We conclude with a brief discussion of the research streams that would benefit from a more sensitive and higher resolution measure of SVO, and extend an invitation to others to use this new measure, which is freely available.
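In the slider-type measure introduced here, a respondent's orientation is typically summarized as an angle computed from the mean allocations to self and to the other person, measured from the midpoint of the payoff scale. A minimal sketch, assuming a payoff scale centred at 50 as in the published scoring convention; the example allocations are made up:

```python
import math

def svo_angle(alloc_self, alloc_other, midpoint=50.0):
    """SVO angle in degrees from mean allocations to self and to the other person."""
    mean_self = sum(alloc_self) / len(alloc_self)
    mean_other = sum(alloc_other) / len(alloc_other)
    return math.degrees(math.atan2(mean_other - midpoint, mean_self - midpoint))

# made-up allocations over six slider items
alloc_self  = [85, 85, 85, 85, 85, 85]
alloc_other = [85, 76, 68, 59, 50, 41]
angle = svo_angle(alloc_self, alloc_other)
# larger angles indicate greater concern for the other (prosocial),
# angles near zero indicate individualistic orientations
print(f"SVO angle = {angle:.1f} degrees")
```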

Journal ArticleDOI
K. Abe, N. Abgrall, Hiroaki Aihara, Yasuo Ajima, +533 more (53 institutions)
TL;DR: The T2K experiment as discussed by the authors is a long-baseline neutrino oscillation experiment whose main goal is to measure the last unknown lepton sector mixing angle θ₁₃ by observing ν_e appearance in a ν_μ beam generated by the J-PARC accelerator.
Abstract: The T2K experiment is a long-baseline neutrino oscillation experiment. Its main goal is to measure the last unknown lepton sector mixing angle θ₁₃ by observing ν_e appearance in a ν_μ beam. It also aims to make a precision measurement of the known oscillation parameters, Δm²₂₃ and sin²2θ₂₃, via ν_μ disappearance studies. Other goals of the experiment include various neutrino cross section measurements and sterile neutrino searches. The experiment uses an intense proton beam generated by the J-PARC accelerator in Tokai, Japan, and is composed of a neutrino beamline, a near detector complex (ND280), and a far detector (Super-Kamiokande) located 295 km away from J-PARC. This paper provides a comprehensive review of the instrumentation aspect of the T2K experiment and a summary of the vital information for each subsystem.

Journal ArticleDOI
26 Oct 2011-PLOS ONE
TL;DR: It is found that transnational corporations form a giant bow-tie structure and that a large portion of control flows to a small tightly-knit core of financial institutions that can be seen as an economic “super-entity” that raises new important issues both for researchers and policy makers.
Abstract: The structure of the control network of transnational corporations affects global market competition and financial stability. So far, only small national samples were studied and there was no appropriate methodology to assess control globally. We present the first investigation of the architecture of the international ownership network, along with the computation of the control held by each global player. We find that transnational corporations form a giant bow-tie structure and that a large portion of control flows to a small tightly-knit core of financial institutions. This core can be seen as an economic “super-entity” that raises new important issues both for researchers and policy makers.

Journal ArticleDOI
TL;DR: Evidence that more P-efficient plants can be developed by modifying root growth and architecture, by manipulating root exudates, or by managing plant-microbial associations such as arbuscular mycorrhizal fungi and microbial inoculants is critically reviewed.
Abstract: Background: Agricultural production is often limited by low phosphorus (P) availability. In developing countries, which have limited access to P fertiliser, there is a need to develop plants that are more efficient at low soil P. In fertilised and intensive systems, P-efficient plants are required to minimise inefficient use of P-inputs and to reduce potential for loss of P to the environment.

Proceedings ArticleDOI
12 Jun 2011
TL;DR: The design of CrowdDB is described; a major change is that the traditional closed-world assumption for query processing does not hold for human input, and important avenues for future work in the development of crowdsourced query processing systems are outlined.
Abstract: Some queries cannot be answered by machines only. Processing such queries requires human input for providing information that is missing from the database, for performing computationally difficult functions, and for matching, ranking, or aggregating results based on fuzzy criteria. CrowdDB uses human input via crowdsourcing to process queries that neither database systems nor search engines can adequately answer. It uses SQL both as a language for posing complex queries and as a way to model data. While CrowdDB leverages many aspects of traditional database systems, there are also important differences. Conceptually, a major change is that the traditional closed-world assumption for query processing does not hold for human input. From an implementation perspective, human-oriented query operators are needed to solicit, integrate and cleanse crowdsourced data. Furthermore, performance and cost depend on a number of new factors including worker affinity, training, fatigue, motivation and location. We describe the design of CrowdDB, report on an initial set of experiments using Amazon Mechanical Turk, and outline important avenues for future work in the development of crowdsourced query processing systems.
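A toy Python sketch of the open-world, human-in-the-loop idea described above: a query operator that falls back to crowd input when the stored relation is missing a value. The `ask_crowd` stub stands in for posting a microtask (e.g. on Amazon Mechanical Turk); none of this reflects CrowdDB's actual SQL extensions or internals.

```python
def ask_crowd(prompt):
    """Stub standing in for posting a microtask and collecting a worker's answer."""
    print(f"[HIT posted] {prompt}")
    return f"<answer from worker about: {prompt}>"

# a stored relation with missing ("open world") values
professors = [
    {"name": "A. Example", "department": "Computer Science", "email": None},
    {"name": "B. Sample",  "department": None,               "email": "b@uni.example"},
]

def crowd_fill(rows, columns):
    """Human-oriented operator: solicit missing column values from the crowd."""
    for row in rows:
        for col in columns:
            if row[col] is None:
                row[col] = ask_crowd(f"What is the {col} of {row['name']}?")
        yield row

for row in crowd_fill(professors, columns=["department", "email"]):
    print(row)
```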

Journal ArticleDOI
TL;DR: The impact on the system-level performance, i.e., efficiency, power density, etc., of industrial inverter drives and of dc-dc converters resulting from the new SiC devices is evaluated based on analytical optimization procedures and prototype systems.
Abstract: Switching devices based on wide bandgap materials such as silicon carbide (SiC) offer a significant performance improvement on the switch level (specific on-resistance, etc.) compared with Si devices. Well-known examples are SiC diodes employed, for example, in inverter drives with high switching frequencies. In this paper, the impact on the system-level performance, i.e., efficiency, power density, etc., of industrial inverter drives and of dc-dc converters resulting from the new SiC devices is evaluated based on analytical optimization procedures and prototype systems. There, normally-on JFETs by SiCED and normally-off JFETs by SemiSouth are considered.
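To make the switch-level versus system-level trade-off concrete, here is a back-of-the-envelope per-device loss model (conduction plus switching losses) comparing a generic Si and a generic SiC switch across switching frequencies. All parameter values are illustrative placeholders, not datasheet values for the SiCED or SemiSouth devices, and real converter optimization of the kind described in the paper involves far more (gate drive, magnetics, cooling, etc.).

```python
def switch_losses(i_rms, r_on, f_sw, e_on, e_off):
    """Approximate per-device losses: conduction (I_rms^2 * R_on) + switching (f_sw * (E_on + E_off))."""
    p_cond = i_rms ** 2 * r_on
    p_sw = f_sw * (e_on + e_off)
    return p_cond + p_sw

# illustrative parameters only (ohm, J, J); not real device data
si_device  = dict(r_on=0.040, e_on=2.0e-3, e_off=2.5e-3)
sic_device = dict(r_on=0.060, e_on=0.3e-3, e_off=0.2e-3)

for f_sw in (8e3, 16e3, 48e3):                      # switching frequency in Hz
    p_si  = switch_losses(20.0, f_sw=f_sw, **si_device)
    p_sic = switch_losses(20.0, f_sw=f_sw, **sic_device)
    print(f"f_sw = {f_sw/1e3:4.0f} kHz: Si ~ {p_si:5.1f} W, SiC ~ {p_sic:5.1f} W")
# with these toy numbers the SiC switch pulls ahead as the switching frequency rises,
# which is the qualitative point behind the high-switching-frequency applications above
```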