
Showing papers by "Vienna University of Technology published in 2013"


Journal ArticleDOI
TL;DR: The Prediction in Ungauged Basins (PUB) initiative of the International Association of Hydrological Sciences (IAHS) launched in 2003 and concluded by the PUB Symposium 2012 held in Delft (23-25 October 2012), set out to shift the scientific culture of hydrology towards improved scientific understanding of hydrological processes, as well as associated uncertainties and the development of models with increasing realism and predictive power as discussed by the authors.
Abstract: The Prediction in Ungauged Basins (PUB) initiative of the International Association of Hydrological Sciences (IAHS), launched in 2003 and concluded by the PUB Symposium 2012 held in Delft (23–25 October 2012), set out to shift the scientific culture of hydrology towards improved scientific understanding of hydrological processes, as well as associated uncertainties and the development of models with increasing realism and predictive power. This paper reviews the work that has been done under the six science themes of the PUB Decade and outlines the challenges ahead for the hydrological sciences community. Editor: D. Koutsoyiannis. Citation: Hrachowitz, M., Savenije, H.H.G., Bloschl, G., McDonnell, J.J., Sivapalan, M., Pomeroy, J.W., Arheimer, B., Blume, T., Clark, M.P., Ehret, U., Fenicia, F., Freer, J.E., Gelfan, A., Gupta, H.V., Hughes, D.A., Hut, R.W., Montanari, A., Pande, S., Tetzlaff, D., Troch, P.A., Uhlenbrook, S., Wagener, T., Winsemius, H.C., Woods, R.A., Zehe, E., and Cudennec, C., 2013. A d...

848 citations


Journal ArticleDOI
TL;DR: In this article, a CMOS-compatible photodetector based on graphene with multi-gigahertz operation ranging from the O- to the U-band of the telecommunication bands is demonstrated, highlighting the promise of graphene as a new material for integrated photonics.
Abstract: A CMOS-compatible photodetector based on graphene with multi-gigahertz operation ranging from the O- to U-band of telecommunication bands is demonstrated, highlighting the promise of graphene as a new material for integrated photonics.

675 citations


Journal ArticleDOI
TL;DR: In this article, a detailed description is given of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson.
Abstract: A detailed description is reported of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson. The data sample corresponds to integrated luminosities up to 5.1 inverse femtobarns at sqrt(s) = 7 TeV, and up to 5.3 inverse femtobarns at sqrt(s) = 8 TeV. The results for five Higgs boson decay modes gamma gamma, ZZ, WW, tau tau, and bb, which show a combined local significance of 5 standard deviations near 125 GeV, are reviewed. A fit to the invariant mass of the two high resolution channels, gamma gamma and ZZ to 4 ell, gives a mass estimate of 125.3 +/- 0.4 (stat) +/- 0.5 (syst) GeV. The measurements are interpreted in the context of the standard model Lagrangian for the scalar Higgs field interacting with fermions and vector bosons. The measured values of the corresponding couplings are compared to the standard model predictions. The hypothesis of custodial symmetry is tested through the measurement of the ratio of the couplings to the W and Z bosons. All the results are consistent, within their uncertainties, with the expectations for a standard model Higgs boson.

643 citations


Journal ArticleDOI
TL;DR: A variety of algorithms have been developed by CMS to select b-quark jets based on variables such as the impact parameters of charged-particle tracks, the properties of reconstructed decay vertices, and the presence or absence of a lepton as mentioned in this paper.
Abstract: At the Large Hadron Collider, the identification of jets originating from b quarks is important for searches for new physics and for measurements of standard model processes. A variety of algorithms has been developed by CMS to select b-quark jets based on variables such as the impact parameters of charged-particle tracks, the properties of reconstructed decay vertices, and the presence or absence of a lepton, or combinations thereof. The performance of these algorithms has been measured using data from proton-proton collisions at the LHC and compared with expectations based on simulation. The data used in this study were recorded in 2011 at √s = 7 TeV for a total integrated luminosity of 5.0 fb^(-1). The efficiency for tagging b-quark jets has been measured in events from multijet and t-quark pair production. CMS has achieved a b-jet tagging efficiency of 85% for a light-parton misidentification probability of 10% in multijet events. For analyses requiring higher purity, a misidentification probability of only 1.5% has been achieved, for a 70% b-jet tagging efficiency.

631 citations


Journal ArticleDOI
TL;DR: This work proposes a generic and simple framework comprising three steps: 1) constructing a cost volume, 2) fast cost volume filtering, and 3) Winner-Takes-All label selection. It achieves disparity maps in real time whose quality exceeds those of all other fast (local) approaches on the Middlebury stereo benchmark, as well as optical flow fields which contain very fine structures as well as large displacements.
Abstract: Many computer vision tasks can be formulated as labeling problems. The desired solution is often a spatially smooth labeling where label transitions are aligned with color edges of the input image. We show that such solutions can be efficiently achieved by smoothing the label costs with a very fast edge-preserving filter. In this paper, we propose a generic and simple framework comprising three steps: 1) constructing a cost volume, 2) fast cost volume filtering, and 3) Winner-Takes-All label selection. Our main contribution is to show that with such a simple framework state-of-the-art results can be achieved for several computer vision applications. In particular, we achieve 1) disparity maps in real time whose quality exceeds those of all other fast (local) approaches on the Middlebury stereo benchmark, and 2) optical flow fields which contain very fine structures as well as large displacements. To demonstrate robustness, the few parameters of our framework are set to nearly identical values for both applications. Also, competitive results for interactive image segmentation are presented. With this work, we hope to inspire other researchers to leverage this framework to other application areas.
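The three-step framework described in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: it substitutes a plain box filter for the fast edge-preserving (guided) filter the paper uses, and the cost volumes are made up.

```python
import numpy as np

def box_filter(cost, radius):
    """Smooth one cost slice with a simple box mean (stand-in for the
    edge-preserving guided filter used in the paper)."""
    k = 2 * radius + 1
    padded = np.pad(cost, radius, mode="edge")
    out = np.zeros_like(cost, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + cost.shape[0], dx:dx + cost.shape[1]]
    return out / (k * k)

def label_image(costs, radius=1):
    """Generic framework: 1) cost volume (input), 2) cost volume
    filtering, 3) Winner-Takes-All label selection per pixel."""
    filtered = np.stack([box_filter(c, radius) for c in costs])  # step 2
    return np.argmin(filtered, axis=0)                           # step 3

# Toy example: two labels on a 4x4 image. Label 0 is cheaper everywhere
# except at one noisy pixel; filtering smooths the outlier away, so the
# final labeling is spatially consistent.
c0 = np.ones((4, 4)); c0[1, 1] = 5.0   # noisy outlier favouring label 1
c1 = np.full((4, 4), 2.0)
labels = label_image([c0, c1], radius=1)
```

Without step 2, the Winner-Takes-All step alone would pick label 1 at the noisy pixel; the filtering is what aggregates local evidence before the per-pixel decision.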

618 citations


Journal ArticleDOI
TL;DR: The Panta Rhei – Everything Flows initiative, as mentioned in this paper, is dedicated to research activities on change in hydrology and society, and aims to reach an improved interpretation of the processes governing the water cycle by focusing on their changing dynamics in connection with rapidly changing human systems.
Abstract: The new Scientific Decade 2013–2022 of IAHS, entitled Panta Rhei – Everything Flows, is dedicated to research activities on change in hydrology and society. The purpose of Panta Rhei is to reach an improved interpretation of the processes governing the water cycle by focusing on their changing dynamics in connection with rapidly changing human systems. The practical aim is to improve our capability to make predictions of water resources dynamics to support sustainable societal development in a changing environment. The concept implies a focus on hydrological systems as a changing interface between environment and society, whose dynamics are essential to determine water security, human safety and development, and to set priorities for environmental management. The Scientific Decade 2013–2022 will devise innovative theoretical blueprints for the representation of processes including change and will focus on advanced monitoring and data analysis techniques. Interdisciplinarity will be sought by increased efforts to connect with the socio-economic sciences and geosciences in general. This paper presents a summary of the Science Plan of Panta Rhei, its targets, research questions and expected outcomes.

550 citations


Journal ArticleDOI
TL;DR: The Advanced Scatterometer (ASCAT) is a C-band active microwave remote sensing instrument flown on board of the Meteorological Operational (METOP) satellite series as discussed by the authors.
Abstract: Many physical, chemical and biological processes taking place at the land surface are strongly influenced by the amount of water stored within the upper soil layers. Therefore, many scientific disciplines require soil moisture observations for developing, evaluating and improving their models. One of these disciplines is meteorology where soil moisture is important due to its control on the exchange of heat and water between the soil and the lower atmosphere. Soil moisture observations may thus help to improve the forecasts of air temperature, air humidity and precipitation. However, until recently, soil moisture observations had only been available over a limited number of regional soil moisture networks. This has hampered scientific progress as regards the characterisation of land surface processes not just in meteorology but many other scientific disciplines as well. Fortunately, in recent years, satellite soil moisture data have increasingly become available. One of the freely available global soil moisture data sets is derived from the backscatter measurements acquired by the Advanced Scatterometer (ASCAT) that is a C-band active microwave remote sensing instrument flown on board of the Meteorological Operational (METOP) satellite series. ASCAT was designed to observe wind speed and direction over the oceans and was initially not foreseen for monitoring soil moisture over land. Yet, as argued in this review paper, the characteristics of the ASCAT instrument, most importantly its wavelength (5.7 cm), its high radiometric accuracy, and its multiple-viewing capabilities make it an attractive sensor for measuring soil moisture. Moreover, given the operational status of ASCAT, and its promising long-term prospects, many geoscientific applications might benefit from using ASCAT soil moisture data. 
Nonetheless, the ASCAT soil moisture product is relatively complex, requiring a good understanding of its properties before it can be successfully used in applications. To provide a comprehensive overview of the major characteristics and caveats of the ASCAT soil moisture product, this paper describes the ASCAT instrument and the soil moisture processor and near-real-time distribution service implemented by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). A review of the most recent validation studies shows that the quality of the ASCAT soil moisture product is – with the exception of arid environments – comparable to, and over some regions (e.g. Europe) even better than, currently available soil moisture data derived from passive microwave sensors. Further, a review of application studies shows that the use of the ASCAT soil moisture product is particularly advanced in the fields of numerical weather prediction and hydrologic modelling, but first progress can also be noted in other application areas such as yield monitoring, epidemiologic modelling, and societal risk assessment. Considering the generally positive evaluation results, it is expected that the ASCAT soil moisture product will increasingly be used by a growing number of rather diverse land applications.

484 citations


BookDOI
01 Jan 2013
TL;DR: In this book, the authors present frameworks for predictions of runoff in ungauged basins, including a synthesis framework and a data acquisition framework.
Abstract: Table of contents:
List of contributors
Foreword Thomas Dunne
Preface Gunter Bloeschl, Murugesu Sivapalan, Thorsten Wagener, Alberto Viglione and Hubert Savenije
1. Introduction Gunter Bloeschl, Murugesu Sivapalan, Thorsten Wagener, Alberto Viglione and Hubert Savenije
2. A synthesis framework for runoff predictions in ungauged basins Thorsten Wagener, Gunter Bloeschl, David Goodrich, Hoshin V. Gupta, Murugesu Sivapalan, Yasuto Tachikawa, Peter Troch and Markus Weiler
3. A data acquisition framework for predictions of runoff in ungauged basins Brian McGlynn, Gunter Bloeschl, Marco Borga, Helge Bormann, Ruud Hurkmans, Jurgen Komma, Lakshman Nandagiri, Remko Uijlenhoet and Thorsten Wagener
4. Process realism: flow paths and storage Doerthe Tetzlaff, Ghazi Al-Rawas, Gunter Bloeschl, Sean K. Carey, Ying Fan, Markus Hrachowitz, Robert Kirnbauer, Graham Jewitt, Hjalmar Laudon, Kevin J. McGuire, Takahiro Sayama, Chris Soulsby, Erwin Zehe and Thorsten Wagener
5. Prediction of annual runoff in ungauged basins Thomas McMahon, Gregor Laaha, Juraj Parajka, Murray C. Peel, Hubert Savenije, Murugesu Sivapalan, Jan Szolgay, Sally Thompson, Alberto Viglione, Ross Woods and Dawen Yang
6. Prediction of seasonal runoff in ungauged basins R. Weingartner, Gunter Bloeschl, David Hannah, Danny Marks, Juraj Parajka, Charles Pearson, Magdalena Rogger, Jose Luis Salinas, Eric Sauquet, Sri Srikanthan, Sally Thompson and Alberto Viglione
7. Prediction of flow duration curves in ungauged basins Attilio Castellarin, Gianluca Botter, Denis A. Hughes, Suxia Liu, Taha B. M. J. Ouarda, Juraj Parajka, David Post, Murugesu Sivapalan, Christopher Spence, Alberto Viglione and Richard Vogel
8. Prediction of low flows in ungauged basins Gregor Laaha, Siegfried Demuth, Hege Hisdal, Charles N. Kroll, Henny A. J. van Lanen, Thomas Nester, Magdalena Rogger, Eric Sauquet, Lena M. Tallaksen, Ross Woods and Andy Young
9. Prediction of floods in ungauged basins Dan Rosbjerg, Gunter Bloeschl, Donald H. Burn, Attilio Castellarin, Barry Croke, Giuliano Di Baldassarre, Vito Iacobellis, Thomas Kjeldsen, George Kuczera, Ralf Merz, Alberto Montanari, David Morris, Taha B. M. J. Ouarda, Liliang Ren, Magdalena Rogger, Jose Luis Salinas, Elena Toth and Alberto Viglione
10. Predictions of runoff hydrographs in ungauged basins Juraj Parajka, Vazken Andreassian, Stacey Archfield, Andras Bardossy, Francis Chiew, Qingyun Duan, Alexander Gelfan, Kamila Hlavcova, Ralf Merz, Neil McIntyre, Ludovic Oudin, Charles Perrin, Magdalena Rogger, Jose Luis Salinas, Hubert Savenije, Jon Olav Skoien, Thorsten Wagener, Erwin Zehe and Yongqiang Zhang
11. Case studies Hubert Savenije, Murugesu Sivapalan, Trent Biggs, Shaofeng Jia, Leonid M. Korytny, E. A. Ilyichyova, Boris Gartsman, John W. Pomeroy, Kevin Shook, Xing Fang, Tom Brown, Denis A. Hughes, Stacey Archfield, Jos Samuel, Paulin Coulibaly, Robert A. Metcalfe, Attilio Castellarin, Ralf Merz, Gunter Humer, Ataur Rahman, Khaled Haddad, Erwin Weinmann, George Kuczera, Theresa Blume, Armand Crabit, Francois Colin, Roger Moussa, Hessel Winsemius, Hubert Savenije, Jens Liebe, Nick van de Giesen, M. Todd Walter, Tammo S. Steenhuis, Jeffrey R. Kennedy, David Goodrich, Carl L. Unkrich, Dominic Mazvimavi, Neil R. Viney, Kuniyoshi Takeuchi, H. A. P. Hapuarachchi, Anthony S. Kiem, Hiroshi Ishidaira, Tianqi Ao, Jun Magome, Maichun C. Zhou, Mikhail Georgievski, Guoqiang Wang, Chihiro Yoshimura, Berit Arheimer, Goeran Lindstroem and Shijun Lin
12. Synthesis across processes, places and scales Hoshin V. Gupta, Gunter Bloeschl, Jeffrey McDonnell, Hubert Savenije, Murugesu Sivapalan, Alberto Viglione and Thorsten Wagener
13. Recommendations Kuniyoshi Takeuchi, Gunter Bloeschl, Hubert Savenije, John Schaake, Murugesu Sivapalan, Alberto Viglione, Thorsten Wagener and Gordon Young
Appendix: summary of studies used in the comparative assessments
References
Index

446 citations


Journal ArticleDOI
TL;DR: The goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field.
Abstract: This paper provides a comprehensive overview of urban reconstruction. While there exists a considerable body of literature, this topic is still under active research. The work reviewed in this survey stems from the following three research communities: computer graphics, computer vision and photogrammetry and remote sensing. Our goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field. Further, we would like to bring the mentioned research communities to even more interdisciplinary work, since the reconstruction problem itself is by far not solved.

445 citations


Journal ArticleDOI
30 Aug 2013-Science
TL;DR: With scanning tunneling microscopy, O2 molecules on the surface of anatase (titanium oxide, TiO2) doped with niobium are observed, transformed, and, in conjunction with theory, identified.
Abstract: Oxygen (O2) adsorbed on metal oxides is important in catalytic oxidation reactions, chemical sensing, and photocatalysis. Strong adsorption requires transfer of negative charge from oxygen vacancies (V(O)s) or dopants, for example. With scanning tunneling microscopy, we observed, transformed, and, in conjunction with theory, identified the nature of O2 molecules on the (101) surface of anatase (titanium oxide, TiO2) doped with niobium. V(O)s reside exclusively in the bulk, but we pull them to the surface with a strongly negatively charged scanning tunneling microscope tip. O2 adsorbed as superoxo (O2(-)) at fivefold-coordinated Ti sites was transformed to peroxo (O2(2-)) and, via reaction with a VO, placed into an anion surface lattice site as an (O2)O species. This so-called bridging dimer also formed when O2 directly reacted with V(O)s at or below the surface.

441 citations


Journal ArticleDOI
TL;DR: In this paper, a simple, dynamic model is developed to represent the interactions and feedback loops between hydrological and social processes in a human-flood system and the effect of changing individual characteristics, including external forcing such as technological development.
Abstract: Over history, humankind has tended to settle near streams because of the role of rivers as transportation corridors and the fertility of riparian areas. However, human settlements in floodplains have been threatened by the risk of flooding. Possible responses have been to resettle away and/or modify the river system by building flood control structures. This has led to a complex web of interactions and feedback mechanisms between hydrological and social processes in settled floodplains. This paper is an attempt to conceptualise these interplays for hypothetical human-flood systems. We develop a simple, dynamic model to represent the interactions and feedback loops between hydrological and social processes. The model is then used to explore the dynamics of the human-flood system and the effect of changing individual characteristics, including external forcing such as technological development. The results show that the conceptual model is able to reproduce reciprocal effects between floods and people as well as the emergence of typical patterns. For instance, when levees are built or raised to protect floodplain areas, their presence not only reduces the frequency of flooding, but also exacerbates high water levels. Then, because of this exacerbation, higher flood protection levels are required by society. As a result, more and more flooding events are avoided, but rare and catastrophic events take place.
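The feedback loop described above (raising levees reduces flood frequency but exacerbates the high water levels of the floods that do occur) can be caricatured in a few lines of code. All dynamics, parameters, and the function name below are illustrative assumptions, not the model of the paper:

```python
import random

def simulate_flood_levee(years=200, seed=1):
    """Toy human-flood feedback: society raises the levee after every
    flooding event, which makes flooding rarer, while the presence of
    levees exacerbates high water levels (a simple additive term here).
    All parameters are invented for illustration."""
    random.seed(seed)
    levee, floods, damages = 0.0, 0, []
    for _ in range(years):
        w = random.expovariate(1.0)   # natural high-water level this year
        w += 0.3 * levee              # exacerbation caused by the levees
        if w > levee:                 # levee overtopped -> flood event
            floods += 1
            damages.append(w - levee)
            levee = w + 0.5           # society raises protection level
    return levee, floods, damages

levee, floods, damages = simulate_flood_levee()
```

Even this caricature reproduces the qualitative pattern in the abstract: as the levee grows, overtopping becomes rare, so society experiences long flood-free periods punctuated by occasional events that exceed the ever-higher protection level.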

Journal ArticleDOI
TL;DR: The goal of this article is to compare the approaches to QoS description in the literature, where several models and metamodels are included, and to analyze where the need for further research and investigation lies.
Abstract: Quality of service (QoS) can be a critical element for achieving the business goals of a service provider, for the acceptance of a service by the user, or for guaranteeing service characteristics in a composition of services, where a service is defined as either a software or a software-support (i.e., infrastructural) service which is available on any type of network or electronic channel. The goal of this article is to compare the approaches to QoS description in the literature. We consider a large spectrum of models and metamodels to describe service quality, ranging from ontological approaches to define quality measures, metrics, and dimensions, to metamodels enabling the specification of quality-based service requirements and capabilities as well as of SLAs (Service-Level Agreements) and SLA templates for service provisioning. Our survey is performed by inspecting the characteristics of the available approaches to reveal which are the consolidated ones and which are the ones specific to given aspects, and to analyze where the need for further research and investigation lies. The approaches here illustrated have been selected based on a systematic review of conference proceedings and journals spanning various research areas in computer science and engineering, including: distributed, information, and telecommunication systems, networks and security, and service-oriented and grid computing.

Journal ArticleDOI
01 Aug 2013
TL;DR: It is argued that this presents a number of challenges for CSCW research moving forward: in having a greater impact on larger-scale health IT projects; broadening the scope of settings and perspectives that are studied; and reflecting on the relevance of the traditional methods in this field - namely workplace studies - to meet these challenges.
Abstract: CSCW as a field has been concerned since its early days with healthcare, studying how healthcare work is collaboratively and practically achieved and designing systems to support that work. Reviewing literature from the CSCW Journal and related conferences where CSCW work is published, we reflect on the contributions that have emerged from this work. The analysis illustrates a rich range of concepts and findings towards understanding the work of healthcare but the work on the larger policy level is lacking. We argue that this presents a number of challenges for CSCW research moving forward: in having a greater impact on larger-scale health IT projects; broadening the scope of settings and perspectives that are studied; and reflecting on the relevance of the traditional methods in this field - namely workplace studies - to meet these challenges.

Journal ArticleDOI
TL;DR: Weaknesses in GPT/GMF, specifically their limited spatial and temporal variability, are largely eradicated by a new, combined model GPT2, which provides pressure, temperature, lapse rate, water vapor pressure, and mapping function coefficients at any site, resting upon a global 5° grid of mean values, annual, and semi-annual variations in all parameters.
Abstract: Up to now, state-of-the-art empirical slant delay modeling for processing observations from radio space geodetic techniques has been provided by a combination of two empirical models. These are GPT (Global Pressure and Temperature) and GMF (Global Mapping Function), both operating on the basis of long-term averages of surface values from numerical weather models. Weaknesses in GPT/GMF, specifically their limited spatial and temporal variability, are largely eradicated by a new, combined model GPT2, which provides pressure, temperature, lapse rate, water vapor pressure, and mapping function coefficients at any site, resting upon a global 5° grid of mean values, annual, and semi-annual variations in all parameters. Built on ERA-Interim data, GPT2 brings forth improved empirical slant delays for geophysical studies. Compared to GPT/GMF, GPT2 yields a 40% reduction of annual and semi-annual amplitude differences in station heights with respect to a solution based on instantaneous local pressure values and the Vienna mapping functions 1, as shown with a series of global VLBI (Very Long Baseline Interferometry) solutions.
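A GPT2-style evaluation of a gridded parameter from a mean value plus annual and semi-annual harmonics can be sketched as follows. The function name and all coefficient values are placeholders, not actual GPT2 grid entries; a real evaluation would additionally interpolate horizontally between the surrounding 5° grid points and apply a height correction.

```python
import math

def empirical_value(mean, a1, b1, a2, b2, doy):
    """Evaluate a parameter (pressure, temperature, lapse rate, ...)
    at day-of-year doy from a grid cell's mean value plus annual and
    semi-annual harmonic terms, in the style of GPT2."""
    x = 2.0 * math.pi * doy / 365.25
    return (mean
            + a1 * math.cos(x) + b1 * math.sin(x)        # annual
            + a2 * math.cos(2 * x) + b2 * math.sin(2 * x))  # semi-annual

# Placeholder temperature cell: 10 C mean, 12 C annual amplitude, small
# semi-annual term; evaluated in early January and mid-July.
t_jan = empirical_value(10.0, 12.0, 0.0, 1.0, 0.0, doy=1)
t_jul = empirical_value(10.0, 12.0, 0.0, 1.0, 0.0, doy=183)
```

The same five numbers per grid cell and parameter are all the model needs, which is what makes such "blind" empirical models attractive when instantaneous meteorological data are unavailable.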

Journal ArticleDOI
TL;DR: In this article, a new automated quality control system for soil moisture measurements contained in the International Soil Moisture Network (ISMN) is presented, which includes flagging values exceeding a certain threshold and checking validity of soil moisture variations in relation to changes in soil temperature and precipitation.
Abstract: The International Soil Moisture Network (ISMN) was initiated in 2009 to support calibration and validation of remote sensing products and land surface models, and to facilitate studying the behavior of our climate over space and time. The ISMN does this by collecting and harmonizing soil moisture data sets from a large variety of individually operating networks and making them available through a centralized data portal. Due to the diversity of climatological conditions covered by the stations and differences in measurement devices and setup, the quality of the measurements is highly variable. Therefore, appropriate quality characterization is desirable for a correct use of the data sets. This study presents a new, automated quality control system for soil moisture measurements contained in the ISMN. Two types of quality control procedures are presented. The first category is based on the geophysical dynamic range and consistency of the measurements. It includes flagging values exceeding a certain threshold and checking the validity of soil moisture variations in relation to changes in soil temperature and precipitation. In particular, the usability of global model- or remote sensing–based temperature and precipitation data sets were tested for this purpose as an alternative to in situ measurements, which are often not recorded at the soil moisture sites themselves. The second category of procedures analyzes the shape of the soil moisture time series to detect outliers (spikes), positive and negative breaks, saturation of the signal, and unresponsive sensors. All methods were first validated and then applied to all the data sets currently contained in the ISMN. A validation example of an AMSR-E satellite and a GLDAS-Noah model product showed a small but positive impact of the flagging. On the basis of the positive results of this study we will add the flags as a standard attribute to all soil moisture measurements contained in the ISMN.
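The two categories of checks (geophysical range, and time-series shape such as spike detection) might be sketched roughly as below. The thresholds and the spike heuristic are illustrative assumptions, not the ISMN's operational procedures:

```python
def qc_flags(series, lower=0.0, upper=0.6, spike_factor=3.0):
    """Flag a soil-moisture time series (m^3/m^3): values outside a
    plausible geophysical range ("range"), and spikes where a value jumps
    far from the line between its neighbours ("spike"). Thresholds are
    illustrative, not the ISMN's operational values."""
    flags = []
    for i, v in enumerate(series):
        if not (lower <= v <= upper):
            flags.append((i, "range"))
        elif 0 < i < len(series) - 1:
            left, right = series[i - 1], series[i + 1]
            baseline = (left + right) / 2.0
            if abs(v - baseline) > spike_factor * (abs(right - left) + 0.01):
                flags.append((i, "spike"))
    return flags

# Synthetic series with one spike and one physically impossible value.
obs = [0.21, 0.22, 0.55, 0.22, 0.23, -0.05, 0.24]
flags = qc_flags(obs)
```

A real implementation would, as the paper describes, also cross-check soil moisture variations against soil temperature and precipitation, and detect breaks, saturation, and unresponsive sensors.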

01 Apr 2013
TL;DR: In this article, a global-scale observational analysis of the coupling between soil moisture and precipitation is presented, showing that rain falls preferentially over soils that are relatively dry compared to the surrounding area.
Abstract: Analysis of observations on six continents reveals a global preference for afternoon rain to fall on locally drier soils—contrary to the predictions of large-scale climate models, and suggesting that such models may exaggerate the occurrence of droughts. Soil moisture is known to influence precipitation across a range of scales in time and space, and most models suggest that wetter soils promote higher atmospheric moisture content and favour the local development of storms. But this analysis of global precipitation data from a combination of weather satellites shows that — especially in semi-arid regions — afternoon precipitation is more likely over dry soil than over wet soil. The findings suggest that current climate models may be missing fundamental processes regulating convection and land–atmosphere interactions. Land surface properties, such as vegetation cover and soil moisture, influence the partitioning of radiative energy between latent and sensible heat fluxes in daytime hours. During dry periods, soil-water deficit can limit evapotranspiration, leading to warmer and drier conditions in the lower atmosphere1,2. Soil moisture can influence the development of convective storms through such modifications of low-level atmospheric temperature and humidity1,3, which in turn feeds back on soil moisture. Yet there is considerable uncertainty in how soil moisture affects convective storms across the world, owing to a lack of observational evidence and uncertainty in large-scale models4. Here we present a global-scale observational analysis of the coupling between soil moisture and precipitation. We show that across all six continents studied, afternoon rain falls preferentially over soils that are relatively dry compared to the surrounding area. The signal emerges most clearly in the observations over semi-arid regions, where surface fluxes are sensitive to soil moisture, and convective events are frequent. 
Mechanistically, our results are consistent with enhanced afternoon moist convection driven by increased sensible heat flux over drier soils, and/or mesoscale variability in soil moisture. We find no evidence in our analysis of a positive feedback—that is, a preference for rain over wetter soils—at the spatial scale (50–100 kilometres) studied. In contrast, we find that a positive feedback of soil moisture on simulated precipitation does dominate in six state-of-the-art global weather and climate models—a difference that may contribute to excessive simulated droughts in large-scale models.
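The core spatial diagnostic (is the soil under an afternoon rain event drier or wetter than its surroundings?) can be illustrated with a synthetic grid; the function and data below are assumptions for illustration only, not the study's actual procedure.

```python
import numpy as np

def rain_location_anomaly(soil_moisture, rain_y, rain_x, radius=2):
    """Soil moisture at an afternoon rain event minus the mean of the
    surrounding box; negative values mean rain fell on locally drier
    soil. Grid, units and event location are synthetic."""
    y0, y1 = max(0, rain_y - radius), rain_y + radius + 1
    x0, x1 = max(0, rain_x - radius), rain_x + radius + 1
    box = soil_moisture[y0:y1, x0:x1]
    return soil_moisture[rain_y, rain_x] - box.mean()

# Synthetic 5x5 soil-moisture field: the rain event sits on the driest
# cell, so the anomaly is negative (a "rain over dry soil" case).
sm = np.full((5, 5), 0.30)
sm[2, 2] = 0.10
anomaly = rain_location_anomaly(sm, 2, 2)
```

Aggregating the sign of such anomalies over many events per region is, in spirit, how a preference for rain over drier or wetter soils can be quantified.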

Journal ArticleDOI
24 Jul 2013-PLOS ONE
TL;DR: The results show that online images can be used to create reproducible quantitative measures of urban perception and characterize the inequality of different cities, using thousands of geo-tagged images to measure the perception of safety, class and uniqueness.
Abstract: A traveler visiting Rio, Manila or Caracas does not need a report to learn that these cities are unequal; she can see it directly from the taxicab window. This is because in most cities inequality is conspicuous, but also because cities express different forms of inequality that are evident to casual observers. Cities are highly heterogeneous and often unequal with respect to the income of their residents, but also with respect to the cleanliness of their neighborhoods, the beauty of their architecture, and the liveliness of their streets, among many other evaluative dimensions. Until now, however, our ability to understand the effect of a city's built environment on social and economic outcomes has been limited by the lack of quantitative data on urban perception. Here, we build on the intuition that inequality is partly conspicuous to create a quantitative measure of a city's contrasts. Using thousands of geo-tagged images, we measure the perception of safety, class and uniqueness in the cities of Boston and New York in the United States, and Linz and Salzburg in Austria, finding that the range of perceptions elicited by the images of New York and Boston is larger than the range of perceptions elicited by images from Linz and Salzburg. We interpret this as evidence that the cityscapes of Boston and New York are more contrasting, or unequal, than those of Linz and Salzburg. Finally, we validate our measures by exploring the connection between them and homicides, finding a significant correlation between the perceptions of safety and class and the number of homicides in a NYC zip code, after controlling for the effects of income, population, area and age. Our results show that online images can be used to create reproducible quantitative measures of urban perception and characterize the inequality of different cities.

Book ChapterDOI
TL;DR: A simple notation for describing interacting MAPE loops is contributed and used to describe a number of existing patterns of interacting MAPE loops, and numerous remaining research challenges in this area are outlined.
Abstract: Self-adaptation is typically realized using a control loop. One prominent approach for organizing a control loop in self-adaptive systems is by means of four components that are responsible for the primary functions of self-adaptation: Monitor, Analyze, Plan, and Execute, together forming a MAPE loop. When systems are large, complex, and heterogeneous, a single MAPE loop may not be sufficient for managing all adaptation in a system, so multiple MAPE loops may be introduced. In self-adaptive systems with multiple MAPE loops, decisions about how to decentralize each of the MAPE functions must be made. These decisions involve how and whether the corresponding functions from multiple loops are to be coordinated (e.g., planning components coordinating to prepare a plan for an adaptation). To foster comprehension of self-adaptive systems with multiple MAPE loops and support reuse of known solutions, it is crucial that we document common design approaches for engineers. As such systematic knowledge is currently lacking, it is timely to reflect on these systems to: (a) consolidate the knowledge in this area, and (b) to develop a systematic approach for describing different types of control in self-adaptive systems. We contribute with a simple notation for describing interacting MAPE loops, which we believe helps in achieving (b), and we use this notation to describe a number of existing patterns of interacting MAPE loops, to begin to fulfill (a). From our study, we outline numerous remaining research challenges in this area.
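A single MAPE loop of the kind this chapter builds on can be sketched minimally as follows. The managed system, metrics, and thresholds are invented for illustration; the chapter's contribution concerns how multiple such loops coordinate, which this sketch deliberately omits.

```python
class MapeLoop:
    """Minimal single MAPE loop: Monitor, Analyze, Plan, Execute over a
    managed system, here a plain dict with a load and a server count.
    Thresholds and the scale-out policy are illustrative."""

    def __init__(self, target_load=0.7):
        self.target = target_load

    def monitor(self, system):
        # Collect metrics from the managed system.
        return {"load": system["load"] / system["servers"]}

    def analyze(self, metrics):
        # Decide whether adaptation is needed.
        return metrics["load"] > self.target

    def plan(self, system):
        # Prepare an adaptation: scale out by one server.
        return {"servers": system["servers"] + 1}

    def execute(self, system, change):
        # Apply the planned change to the managed system.
        system.update(change)

    def run_once(self, system):
        metrics = self.monitor(system)
        if self.analyze(metrics):
            self.execute(system, self.plan(system))
        return system

system = {"load": 8.0, "servers": 10}
MapeLoop().run_once(system)   # per-server load 0.8 > 0.7, so scale out
```

In a decentralized setting, the patterns the chapter catalogues describe which of these four functions are replicated per loop and which coordinate across loops (e.g. several Plan components agreeing on one joint adaptation).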

Journal ArticleDOI
TL;DR: The Climate Change Initiative (CCI) as discussed by the authors creates new, openly accessible climate data records for essential climate variables and provides a forum to bring the data and modeling communities together, with a climate modeling users' group supplying a climate system perspective.
Abstract: Observations of Earth from space have been made for over 40 years and have contributed to advances in many aspects of climate science. However, attempts to exploit this wealth of data are often hampered by a lack of homogeneity and continuity and by insufficient understanding of the products and their uncertainties. There is, therefore, a need to reassess and reprocess satellite datasets to maximize their usefulness for climate science. The European Space Agency has responded to this need by establishing the Climate Change Initiative (CCI). The CCI will create new climate data records for (currently) 13 essential climate variables (ECVs) and make these open and easily accessible to all. Each ECV project works closely with users to produce time series from the available satellite observations relevant to users' needs. A climate modeling users' group provides a climate system perspective and a forum to bring the data and modeling communities together. This paper presents the CCI program. It outlines its benefit and presents approaches and challenges for each ECV project, covering clouds, aerosols, ozone, greenhouse gases, sea surface temperature, ocean color, sea level, sea ice, land cover, fire, glaciers, soil moisture, and ice sheets. It also discusses how the CCI approach may contribute to defining and shaping future developments in Earth observation for climate science.

Journal ArticleDOI
16 Aug 2013-Science
TL;DR: By stopping a light pulse in an atomic ensemble contained inside an optical resonator, the realization of an all-optical transistor is realized, in which one stored gate photon controls the resonator transmission of subsequently applied source photons.
Abstract: The realization of an all-optical transistor, in which one “gate” photon controls a “source” light beam, is a long-standing goal in optics. By stopping a light pulse in an atomic ensemble contained inside an optical resonator, we realized a device in which one stored gate photon controls the resonator transmission of subsequently applied source photons. A weak gate pulse induces a bimodal transmission distribution, corresponding to zero and one gate photons. One stored gate photon produces a fivefold source attenuation and can be retrieved from the atomic ensemble after switching more than one source photon. Without retrieval, one stored gate photon can switch several hundred source photons. With improved storage and retrieval efficiency, our work may enable various new applications, including photonic quantum gates and deterministic multiphoton entanglement.

Journal ArticleDOI
TL;DR: In this article, the performance of the LHCb Muon system and its stability across the full 2010 data taking with the LHC running at √s = 7 TeV are studied.
Abstract: The performance of the LHCb Muon system and its stability across the full 2010 data taking with the LHC running at √s = 7 TeV are studied. The optimization of the detector settings and the time calibration performed with the first collisions delivered by the LHC are described. Particle rates, measured for the wide range of luminosities and beam operation conditions experienced during the run, are compared with the values expected from simulation. The space and time alignment of the detectors, chamber efficiency, time resolution and cluster size are evaluated. The detector performance is found to be as expected from specifications or better. Notably, the overall efficiency is well above the design requirements.

Journal ArticleDOI
TL;DR: In this article, results of searches for heavy stable charged particles produced in pp collisions at 7 and 8 TeV are presented corresponding to an integrated luminosity of 5.0 and 18.8 inverse femtobarns, respectively.
Abstract: Results of searches for heavy stable charged particles produced in pp collisions at sqrt(s) = 7 and 8 TeV are presented corresponding to an integrated luminosity of 5.0 inverse femtobarns and 18.8 inverse femtobarns, respectively. Data collected with the CMS detector are used to study the momentum, energy deposition, and time-of-flight of signal candidates. Leptons with an electric charge between e/3 and 8e, as well as bound states that can undergo charge exchange with the detector material, are studied. Analysis results are presented for various combinations of signatures in the inner tracker only, inner tracker and muon detector, and muon detector only. Detector signatures utilized are long time-of-flight to the outer muon system and anomalously high (or low) energy deposition in the inner tracker. The data are consistent with the expected background, and upper limits are set on the production cross section of long-lived gluinos, scalar top quarks, and scalar tau leptons, as well as pair produced long-lived leptons. Corresponding lower mass limits, ranging up to 1322 GeV for gluinos, are the most stringent to date.

Journal ArticleDOI
E. Abbas, Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3  +1019 moreInstitutions (91)
TL;DR: The ALICE VZERO system, made of two scintillator arrays at asymmetric positions, one on each side of the interaction point, plays a central role in ALICE and is used to monitor LHC beam conditions, to reject beam-induced backgrounds and to measure basic physics quantities such as luminosity, particle multiplicity, centrality and event plane direction as mentioned in this paper.
Abstract: ALICE is an LHC experiment devoted to the study of strongly interacting matter in proton-proton, proton-nucleus and nucleus-nucleus collisions at ultra-relativistic energies. The ALICE VZERO system, made of two scintillator arrays at asymmetric positions, one on each side of the interaction point, plays a central role in ALICE. In addition to its core function as a trigger source, the VZERO system is used to monitor LHC beam conditions, to reject beam-induced backgrounds and to measure basic physics quantities such as luminosity, particle multiplicity, centrality and event plane direction in nucleus-nucleus collisions. After describing the VZERO system, this publication presents its performance over more than four years of operation at the LHC.

Journal ArticleDOI
TL;DR: In this article, first-principles calculations and a derived tight-binding Hamiltonian show that the largest spin-orbit coupling effect at LaAlO${}_{3}$/SrTiO${}_{3}$ interfaces occurs at the crossing point of the $xy$ and $yz$ (or $zx$) orbitals, and that a Rashba spin splitting with a cubic dependence on the wave vector is possible around the $\Gamma$ point.
Abstract: The theoretical understanding of the spin-orbit coupling (SOC) effects at LaAlO${}_{3}$/SrTiO${}_{3}$ interfaces and SrTiO${}_{3}$ surfaces is still in its infancy. We perform first-principles density-functional-theory calculations and derive from these a simple tight-binding Hamiltonian, through a Wannier function projection and group theoretical analysis. We find striking differences to the standard Rashba theory for spin-orbit coupling in semiconductor heterostructures due to multiorbital effects: By far the biggest SOC effect is at the crossing point of the $xy$ and $yz$ (or $zx$) orbitals, and around the $\ensuremath{\Gamma}$ point a Rashba spin splitting with a cubic dependence on the wave vector $\vec{k}$ is possible.
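As a point of reference (a textbook expression, not one taken from the article), the standard Rashba model for a two-dimensional electron gas gives a splitting linear in the wave vector, whereas the multiorbital case described above admits a cubic dependence:

```latex
% Standard Rashba Hamiltonian for a 2D electron gas (textbook form):
H_{\mathrm{R}} = \alpha_{\mathrm{R}} \left( \sigma_x k_y - \sigma_y k_x \right),
\qquad \Delta E_{\text{linear}} \propto |\vec{k}| ,
% whereas around the \Gamma point of the multiorbital system discussed here:
\Delta E_{\text{cubic}} \propto |\vec{k}|^{3} .
```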

Journal ArticleDOI
TL;DR: It is shown that spin dephasing and relaxation can be largely suppressed, allowing for substantial spin squeezing under realistic experimental conditions.
Abstract: We propose and analyze a novel mechanism for long-range spin-spin interactions in diamond nanostructures. The interactions between electronic spins, associated with nitrogen-vacancy centers in diamond, are mediated by their coupling via strain to the vibrational mode of a diamond mechanical nanoresonator. This coupling results in phonon-mediated effective spin-spin interactions that can be used to generate squeezed states of a spin ensemble. We show that spin dephasing and relaxation can be largely suppressed, allowing for substantial spin squeezing under realistic experimental conditions. Our approach has implications for spin-ensemble magnetometry, as well as phonon-mediated quantum information processing with spin qubits.

Journal ArticleDOI
TL;DR: This genome-wide expression study demonstrates that the initial Trichoderma mycotrophy has differentiated into several alternative ecological strategies ranging from parasitism to predation and saprotrophy, and provides first insights into the mechanisms of interactions between Trichoderma and other fungi that may be exploited for further development of biofungicides.
Abstract: Trichoderma is a genus of mycotrophic filamentous fungi (teleomorph Hypocrea) which possess a wide variety of biotrophic and saprotrophic lifestyles. The ability to parasitize and/or kill other fungi (mycoparasitism) is used in plant protection against soil-borne fungal diseases (biological control, or biocontrol). To investigate mechanisms of mycoparasitism, we compared the transcriptional responses of the cosmopolitan opportunistic species and powerful biocontrol agents Trichoderma atroviride and T. virens with the tropical ecologically restricted species T. reesei during confrontations with the plant pathogenic fungus Rhizoctonia solani. The three Trichoderma spp. exhibited a strikingly different transcriptomic response already before physical contact with alien hyphae. T. atroviride expressed an array of genes involved in production of secondary metabolites, GH16 β-glucanases, various proteases and small secreted cysteine-rich proteins. T. virens, on the other hand, expressed mainly the genes for biosynthesis of gliotoxin, respective precursors and also glutathione, which is necessary for gliotoxin biosynthesis. In contrast, T. reesei increased the expression of genes encoding cellulases and hemicellulases, and of the genes involved in solute transport. The majority of differentially regulated genes were orthologues present in all three species or both in T. atroviride and T. virens, indicating that the regulation of expression of these genes is different in the three Trichoderma spp. The genes expressed in all three fungi exhibited a nonrandom genomic distribution, indicating a possibility for their regulation via chromatin modification. This genome-wide expression study demonstrates that the initial Trichoderma mycotrophy has differentiated into several alternative ecological strategies ranging from parasitism to predation and saprotrophy. It provides first insights into the mechanisms of interactions between Trichoderma and other fungi that may be exploited for further development of biofungicides.

Journal ArticleDOI
TL;DR: A survey, classification, and comparison of various DPF approaches and algorithms available to date are presented, with emphasis on decentralized ANs that do not include a central processing or control unit.
Abstract: Distributed particle filter (DPF) algorithms are sequential state estimation algorithms that are executed by a set of agents. Some or all of the agents perform local particle filtering and interact with other agents to calculate a global state estimate. DPF algorithms are attractive for large-scale, nonlinear, and non-Gaussian distributed estimation problems that often occur in applications involving agent networks (ANs). In this article, we present a survey, classification, and comparison of various DPF approaches and algorithms available to date. Our emphasis is on decentralized ANs that do not include a central processing or control unit.
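To illustrate the building blocks such algorithms combine, here is a minimal sketch of one sequential importance resampling step for a scalar state, run locally by each of three agents, with a single global average standing in for the iterative neighbor-wise consensus a truly decentralized network would use. All model choices (random-walk dynamics, Gaussian likelihood, the fusion rule) are illustrative assumptions, not the algorithms surveyed in the article.

```python
import math
import random

def pf_step(particles, measurement, noise_std=1.0, rng=None):
    """One bootstrap particle filter step: propagate, weight, resample."""
    rng = rng or random.Random(0)
    # Propagate each particle through a toy random-walk motion model.
    particles = [x + rng.gauss(0.0, 0.5) for x in particles]
    # Weight by the Gaussian likelihood of the measurement given the particle.
    weights = [math.exp(-0.5 * ((measurement - x) / noise_std) ** 2)
               for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Multinomial resampling proportional to the normalized weights.
    particles = rng.choices(particles, weights=weights, k=len(particles))
    estimate = sum(particles) / len(particles)
    return particles, estimate

def fuse(estimates):
    # Stand-in for iterative local averaging with neighbors; a decentralized
    # AN would reach this value without any central processing unit.
    return sum(estimates) / len(estimates)

# Three agents, each running a local filter on the same scalar measurement.
local_estimates = []
for agent_id in range(3):
    particles = [0.0] * 200
    _, est = pf_step(particles, measurement=2.0, rng=random.Random(agent_id))
    local_estimates.append(est)

fused = fuse(local_estimates)
```

The design questions the survey classifies — which quantities agents exchange (particles, weights, likelihoods, or estimates) and how the exchange is organized without a fusion center — all live in the `fuse` step that this sketch deliberately oversimplifies.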

Journal ArticleDOI
TL;DR: The interaction between single quantum emitters and non-transversally polarized photons for which the electric field vector amplitude has a significant component in the direction of propagation is investigated.
Abstract: Light is often described as a fully transverse-polarized wave, i.e., with an electric field vector that is orthogonal to the direction of propagation. However, light confined in dielectric structures such as optical waveguides or whispering-gallery-mode microresonators can have a strong longitudinal polarization component. Here, using single $^{85}\mathrm{Rb}$ atoms strongly coupled to a whispering-gallery-mode microresonator, we experimentally and theoretically demonstrate that the presence of this longitudinal polarization fundamentally alters the interaction between light and matter.

Journal ArticleDOI
Carlos Guerrero1, A. Tsinganis1, A. Tsinganis2, E. Berthoumieux3, E. Berthoumieux1, Mario Barbagallo4, Fabio Belloni3, F. Gunsing3, C. Weiß1, C. Weiß5, E. Chiaveri1, E. Chiaveri3, Marco Calviani1, V. Vlachoudis1, S. Altstadt6, S. Andriamonje1, J. Andrzejewski, L. Audouin7, V. Bécares, F. Bečvář8, J. Billowes9, V. Boccone1, Damir Bosnar10, M. Brugger1, F. Calviño11, D. Cano-Ott, C. Carrapiço12, F. Cerutti1, M. P. W. Chin1, Nicola Colonna4, G. Cortes11, M. A. Cortés-Giraldo13, M. Diakaki2, C. Domingo-Pardo14, I. Duran15, Rugard Dressler16, N. Dzysiuk4, C. Eleftheriadis17, Alfredo Ferrari1, K. Fraval3, Srinivasan Ganesan18, A. R. García, G. Giubrone14, Kathrin Göbel6, M. B. Gómez-Hornillos11, I. Goncalves12, E. Gonzalez-Romero, E. Griesmayer5, P. Gurusamy18, A. Hernández-Prieto11, A. Hernández-Prieto1, D. G. Jenkins19, E. Jericha5, Yacine Kadi1, F. Käppeler20, D. Karadimos2, N. Kivel16, P. E. Koehler21, M. Kokkoris2, M. Krtička8, Jeri Kroll8, C. Lampoudis3, C. Langer6, E. Leal-Cidoncha15, C. Lederer6, H. Leeb5, L.S. Leong7, Roberto Losito1, A. Manousos17, J. Marganiec, T. Martinez, Cristian Massimi22, P. F. Mastinu4, M. Mastromarco4, M. Meaze4, E. Mendoza, Alberto Mengoni23, P. M. Milazzo4, F. Mingrone22, M. Mirea, W. Mondalaers, T. Papaevangelou3, C. Paradela15, A. Pavlik24, J. Perkowski, A. J. M. Plompen, Javier Praena13, J. M. Quesada13, Thomas Rauscher25, Rene Reifarth6, A. Riego11, F. Roman1, Carlo Rubbia1, M. Sabaté-Gilarte13, R. Sarmento12, A. K. Saxena18, Peter Schillebeeckx, Stefan Schmidt6, Dorothea Schumann16, Patrick Steinegger16, G. Tagliente4, J. L. Tain14, D. Tarrío15, Laurent Tassan-Got7, S. Valenta8, G. Vannini22, V. Variale4, P. Vaz12, Alberto Ventura23, R. Versaci1, M. J. Vermeulen19, R. Vlastou2, Anton Wallner24, T. Ware9, Mario Weigand6, T. J. Wright9, Petar Žugec10 
TL;DR: In this paper, the authors present the characteristics of the new neutron beam in the currently available configurations, which correspond to two different collimation systems and two choices of neutron moderator, including the intensity and energy dependence of the neutron flux, the spatial profile of the beam, the in-beam background components and the energy resolution/broadening.
Abstract: The neutron time-of-flight facility n_TOF features a white neutron source produced by spallation with 20 GeV/c protons impinging on a lead target. The facility, aiming primarily at the measurement of neutron-induced reaction cross sections, was operating at CERN between 2001 and 2004, and then underwent a major upgrade in 2008. This paper presents in detail all the characteristics of the new neutron beam in the currently available configurations, which correspond to two different collimation systems and two choices of neutron moderator. The characteristics discussed include the intensity and energy dependence of the neutron flux, the spatial profile of the beam, the in-beam background components and the energy resolution/broadening. The discussion of these features is based on dedicated measurements and Monte Carlo simulations, and includes estimations of the systematic uncertainties of the mentioned quantities.