
Showing papers by "Vienna University of Technology" published in 2011


Journal ArticleDOI
TL;DR: The Representative Concentration Pathways (RCPs), as discussed by the authors, are a set of four new pathways developed for the climate modeling community as a basis for long-term and near-term modeling experiments.
Abstract: This paper summarizes the development process and main characteristics of the Representative Concentration Pathways (RCPs), a set of four new pathways developed for the climate modeling community as a basis for long-term and near-term modeling experiments. The four RCPs together span the range of year 2100 radiative forcing values found in the open literature, i.e. from 2.6 to 8.5 W/m2. The RCPs are the product of an innovative collaboration between integrated assessment modelers, climate modelers, terrestrial ecosystem modelers and emission inventory experts. The resulting product forms a comprehensive data set with high spatial and sectoral resolutions for the period extending to 2100. Land use and emissions of air pollutants and greenhouse gases are reported mostly at a 0.5×0.5 degree spatial resolution, with air pollutants also provided per sector (for well-mixed gases, a coarser resolution is used). The underlying integrated assessment model outputs for land use, atmospheric emissions and concentration data were harmonized across models and scenarios to ensure consistency with historical observations while preserving individual scenario trends. For most variables, the RCPs cover a wide range of the existing literature. The RCPs are supplemented with extensions (Extended Concentration Pathways, ECPs), which allow the pathways to be extended beyond 2100.

6,169 citations


Journal ArticleDOI
TL;DR: An overview and a taxonomy of DSM are given, the various types of DSM are analyzed, and an outlook on the latest demonstration projects in this domain is provided.
Abstract: Energy management means to optimize one of the most complex and important technical creations that we know: the energy system. While there is plenty of experience in optimizing energy generation and distribution, it is the demand side that receives increasing attention by research and industry. Demand Side Management (DSM) is a portfolio of measures to improve the energy system at the side of consumption. It ranges from improving energy efficiency by using better materials, over smart energy tariffs with incentives for certain consumption patterns, up to sophisticated real-time control of distributed energy resources. This paper gives an overview and a taxonomy for DSM, analyzes the various types of DSM, and gives an outlook on the latest demonstration projects in this domain.

2,647 citations


Journal ArticleDOI
TL;DR: The International Soil Moisture Network (ISMN) as discussed by the authors is a centralized data hosting facility where globally available in situ soil moisture measurements from operational networks and validation campaigns are collected, harmonized, and made available to users.
Abstract: In situ measurements of soil moisture are invaluable for calibrating and validating land surface models and satellite-based soil moisture retrievals. In addition, long-term time series of in situ soil moisture measurements themselves can reveal trends in the water cycle related to climate or land cover change. Nevertheless, on a worldwide basis the number of meteorological networks and stations measuring soil moisture, in particular on a continuous basis, is still limited and the data they provide lack standardization of technique and protocol. To overcome many of these limitations, the International Soil Moisture Network (ISMN; http://www.ipf.tuwien.ac.at/insitu ) was initiated to serve as a centralized data hosting facility where globally available in situ soil moisture measurements from operational networks and validation campaigns are collected, harmonized, and made available to users. Data collecting networks share their soil moisture datasets with the ISMN on a voluntary and no-cost basis. Incoming soil moisture data are automatically transformed into common volumetric soil moisture units and checked for outliers and implausible values. Apart from soil water measurements from different depths, important metadata and meteorological variables (e.g., precipitation and soil temperature) are stored in the database. These will assist the user in correctly interpreting the soil moisture data. The database is queried through a graphical user interface while output of data selected for download is provided according to common standards for data and metadata. Currently (status May 2011), the ISMN contains data of 19 networks and more than 500 stations located in North America, Europe, Asia, and Australia. The time period spanned by the entire database runs from 1952 until the present, although most datasets have originated during the last decade.
The database is rapidly expanding, which means that both the number of stations and the time period covered by the existing stations are still growing. Hence, it will become an increasingly important resource for validating and improving satellite-derived soil moisture products and studying climate related trends. As the ISMN is animated by the scientific community itself, we invite potential networks to enrich the collection by sharing their in situ soil moisture data.

914 citations


Journal ArticleDOI
TL;DR: The motivation and key concepts behind answer set programming---a promising approach to declarative problem solving.
Abstract: The motivation and key concepts behind answer set programming---a promising approach to declarative problem solving.

911 citations


Proceedings ArticleDOI
20 Jun 2011
TL;DR: This paper proposes a generic and simple framework comprising three steps: constructing a cost volume, fast cost volume filtering, and winner-take-all label selection; it achieves state-of-the-art results, including disparity maps in real time and optical flow fields with very fine structures as well as large displacements.
Abstract: Many computer vision tasks can be formulated as labeling problems. The desired solution is often a spatially smooth labeling where label transitions are aligned with color edges of the input image. We show that such solutions can be efficiently achieved by smoothing the label costs with a very fast edge preserving filter. In this paper we propose a generic and simple framework comprising three steps: (i) constructing a cost volume (ii) fast cost volume filtering and (iii) winner-take-all label selection. Our main contribution is to show that with such a simple framework state-of-the-art results can be achieved for several computer vision applications. In particular, we achieve (i) disparity maps in real-time, whose quality exceeds those of all other fast (local) approaches on the Middlebury stereo benchmark, and (ii) optical flow fields with very fine structures as well as large displacements. To demonstrate robustness, the few parameters of our framework are set to nearly identical values for both applications. Also, competitive results for interactive image segmentation are presented. With this work, we hope to inspire other researchers to leverage this framework to other application areas.
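The three-step recipe described above (cost volume, filtering, winner-take-all) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: a plain box filter stands in for the fast edge-preserving (guided) filter, an absolute-difference photometric cost is assumed, and all function names are illustrative.

```python
import numpy as np

def box_filter(img, r):
    """Mean filter via integral images; a stand-in for the fast
    edge-preserving filter used in the paper."""
    h, w = img.shape
    pad = np.pad(img, r, mode="edge")
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))          # leading row/col of zeros
    k = 2 * r + 1
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / k**2

def cost_volume_stereo(left, right, max_disp, r=2):
    """(i) build a cost volume, (ii) filter each cost slice,
    (iii) winner-take-all label (disparity) selection."""
    h, w = left.shape
    volume = np.empty((max_disp + 1, h, w))
    for d in range(max_disp + 1):
        shifted = np.roll(right, d, axis=1)   # right pixel x-d lands at x
        if d:
            shifted[:, :d] = right[:, :1]     # crude border handling
        cost = np.abs(left - shifted)         # absolute-difference cost
        volume[d] = box_filter(cost, r)       # smooth the cost slice
    return np.argmin(volume, axis=0)          # winner-take-all
```

Swapping `box_filter` for an edge-preserving filter keeps label transitions aligned with image edges, which is the point of the paper.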

898 citations


Journal ArticleDOI
TL;DR: Recent advances in molecular ecology and genomics indicate that the interactions of Trichoderma spp. with animals and plants may have evolved as a result of saprotrophy on fungal biomass (mycotrophy) and various forms of parasitism on other fungi (mycoparasitism), combined with broad environmental opportunism.
Abstract: Trichoderma is a genus of common filamentous fungi that display a remarkable range of lifestyles and interactions with other fungi, animals and plants. Because of their ability to antagonize plant-pathogenic fungi and to stimulate plant growth and defence responses, some Trichoderma strains are used for biological control of plant diseases. In this Review, we discuss recent advances in molecular ecology and genomics which indicate that the interactions of Trichoderma spp. with animals and plants may have evolved as a result of saprotrophy on fungal biomass (mycotrophy) and various forms of parasitism on other fungi (mycoparasitism), combined with broad environmental opportunism.

777 citations


Journal ArticleDOI
TL;DR: The modified Becke-Johnson exchange potential (TB-mBJ) is tested on various types of solids which are difficult to describe theoretically: nonmagnetic semiconducting transition-metal oxides and sulfides, metals (Fe, Co, Ni, and Cu), and (anti)ferromagnetic insulators (e.g., YBa2Cu3O6).
Abstract: The modified Becke-Johnson exchange potential [F. Tran and P. Blaha, Phys. Rev. Lett. 102, 226401 (2009)] (TB-mBJ) is tested on various types of solids which are difficult to describe theoretically: nonmagnetic semiconducting transition-metal oxides and sulfides, metals (Fe, Co, Ni, and Cu), and (anti)ferromagnetic insulators (e.g., YBa2Cu3O6). The results for the band gap and other quantities such as the magnetic moment or electric field gradient are analyzed in detail, in particular to have a better understanding of the mechanism which leads to improved (or sometimes worse) results with the TB-mBJ potential compared to the standard local density and generalized gradient approximations.

712 citations


Proceedings ArticleDOI
01 Jan 2011
TL;DR: The method reconstructs highly slanted surfaces and achieves impressive disparity details with sub-pixel precision and allows for explicit treatment of occlusions and can handle large untextured regions.
Abstract: Common local stereo methods match support windows at integer-valued disparities. The implicit assumption that pixels within the support region have constant disparity does not hold for slanted surfaces and leads to a bias towards reconstructing frontoparallel surfaces. This work overcomes this bias by estimating an individual 3D plane at each pixel onto which the support region is projected. The major challenge of this approach is to find a pixel’s optimal 3D plane among all possible planes whose number is infinite. We show that an ideal algorithm to solve this problem is PatchMatch [1] that we extend to find an approximate nearest neighbor according to a plane. In addition to PatchMatch’s spatial propagation scheme, we propose (1) view propagation where planes are propagated among left and right views of the stereo pair and (2) temporal propagation where planes are propagated from preceding and consecutive frames of a video when doing temporal stereo. Adaptive support weights are used in matching cost aggregation to improve results at disparity borders. We also show that our slanted support windows can be used to compute a cost volume for global stereo methods, which allows for explicit treatment of occlusions and can handle large untextured regions. In the results we demonstrate that our method reconstructs highly slanted surfaces and achieves impressive disparity details with sub-pixel precision. In the Middlebury table, our method is currently the top performer among local methods and takes rank 2 among approximately 110 competitors if sub-pixel precision is considered.
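The slanted-support idea can be illustrated with a toy sketch: every pixel stores a plane (a, b, c) so that its disparity at (x, y) is a·x + b·y + c, and cheaper neighbor planes are adopted PatchMatch-style. This simplified version initializes only fronto-parallel planes at random and performs spatial propagation alone; the paper's slant refinement, view and temporal propagation, and adaptive support weights are all omitted, and the names are illustrative.

```python
import numpy as np

def plane_cost(left, right, x, y, plane, r=3):
    """Aggregate |left - right| matching cost of assigning plane (a, b, c)
    to pixel (x, y): each support pixel gets its own disparity
    d = a*xx + b*yy + c instead of one constant disparity."""
    a, b, c = plane
    h, w = left.shape
    cost = 0.0
    for yy in range(max(0, y - r), min(h, y + r + 1)):
        for xx in range(max(0, x - r), min(w, x + r + 1)):
            d = a * xx + b * yy + c
            xr = int(round(xx - d))
            if 0 <= xr < w:
                cost += abs(left[yy, xx] - right[yy, xr])
            else:
                cost += 1.0              # penalty for leaving the image
    return cost

def patchmatch_stereo(left, right, max_disp, iters=2, seed=0):
    """Random init + PatchMatch-style spatial propagation of per-pixel
    disparity planes (fronto-parallel only in this toy version)."""
    rng = np.random.default_rng(seed)
    h, w = left.shape
    planes = np.zeros((h, w, 3))                     # (a, b, c) per pixel
    planes[..., 2] = rng.uniform(0, max_disp, (h, w))
    for _ in range(iters):
        for y in range(h):
            for x in range(w):
                best = planes[y, x].copy()
                best_cost = plane_cost(left, right, x, y, best)
                for ny, nx in ((y, x - 1), (y - 1, x)):  # spatial propagation
                    if ny >= 0 and nx >= 0:
                        cand = planes[ny, nx]
                        cc = plane_cost(left, right, x, y, cand)
                        if cc < best_cost:
                            best, best_cost = cand.copy(), cc
                planes[y, x] = best
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    return planes[..., 0] * xs + planes[..., 1] * ys + planes[..., 2]
```

Because planes are continuous, the returned disparity map is sub-pixel by construction, which is what distinguishes this family of methods from integer-disparity window matching.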

687 citations


Journal ArticleDOI
01 Sep 2011
TL;DR: A survey of some of the most important lines of hybridization of metaheuristics with other optimization techniques, including, for example, the combination of exact algorithms and metaheuristics.
Abstract: Research in metaheuristics for combinatorial optimization problems has lately experienced a noteworthy shift towards the hybridization of metaheuristics with other techniques for optimization. At the same time, the focus of research has changed from being rather algorithm-oriented to being more problem-oriented. Nowadays the focus is on solving the problem at hand in the best way possible, rather than promoting a certain metaheuristic. This has led to an enormously fruitful cross-fertilization of different areas of optimization. This cross-fertilization is documented by a multitude of powerful hybrid algorithms that were obtained by combining components from several different optimization techniques. Hereby, hybridization is not restricted to the combination of different metaheuristics but includes, for example, the combination of exact algorithms and metaheuristics. In this work we provide a survey of some of the most important lines of hybridization. The literature review is accompanied by the presentation of illustrative examples.

684 citations


Journal ArticleDOI
TL;DR: An incremental approach for behavior-based analysis is proposed, capable of processing the behavior of thousands of malware binaries on a daily basis; it significantly reduces the run-time overhead of current analysis methods while providing accurate discovery and discrimination of novel malware variants.
Abstract: Malicious software - so called malware - poses a major threat to the security of computer systems. The amount and diversity of its variants render classic security defenses ineffective, such that millions of hosts in the Internet are infected with malware in the form of computer viruses, Internet worms and Trojan horses. While obfuscation and polymorphism employed by malware largely impede detection at file level, the dynamic analysis of malware binaries during run-time provides an instrument for characterizing and defending against the threat of malicious software. In this article, we propose a framework for the automatic analysis of malware behavior using machine learning. The framework allows for automatically identifying novel classes of malware with similar behavior (clustering) and assigning unknown malware to these discovered classes (classification). Based on both, clustering and classification, we propose an incremental approach for behavior-based analysis, capable of processing the behavior of thousands of malware binaries on a daily basis. The incremental analysis significantly reduces the run-time overhead of current analysis methods, while providing accurate discovery and discrimination of novel malware variants.

675 citations


Journal ArticleDOI
TL;DR: In this article, the retrieval characteristics of passive (AMSR-E) and active (ASCAT) microwave satellite estimates are combined to produce an improved soil moisture product, as satellite-based passive and active microwave sensors have the potential to offer improved estimates of surface soil moisture at the global scale.
Abstract: Combining information derived from satellite-based passive and active microwave sensors has the potential to offer improved estimates of surface soil moisture at global scale. We develop and evaluate a methodology that takes advantage of the retrieval characteristics of passive (AMSR-E) and active (ASCAT) microwave satellite estimates to produce an improved soil moisture product. First, volumetric soil water content (m3 m−3) from AMSR-E and degree of saturation (%) from ASCAT are rescaled against a reference land surface model data set using a cumulative distribution function matching approach. While this imposes any bias of the reference on the rescaled satellite products, it adjusts them to the same range and preserves the dynamics of original satellite-based products. Comparison with in situ measurements demonstrates that where the correlation coefficient between rescaled AMSR-E and ASCAT is greater than 0.65 ("transitional regions"), merging the different satellite products increases the number of observations while minimally changing the accuracy of soil moisture retrievals. These transitional regions also delineate the boundary between sparsely and moderately vegetated regions where rescaled AMSR-E and ASCAT, respectively, are used for the merged product. Therefore the merged product carries the advantages of better spatial coverage overall and increased number of observations, particularly for the transitional regions. The combination method developed has the potential to be applied to existing microwave satellites as well as to new missions. Accordingly, a long-term global soil moisture dataset can be developed and extended, enhancing basic understanding of the role of soil moisture in the water, energy and carbon cycles.
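The rescaling step described above, cumulative distribution function (CDF) matching, amounts to quantile mapping: source values are mapped so that their empirical CDF matches that of the reference. A minimal sketch under that reading (function name illustrative):

```python
import numpy as np

def cdf_match(src, ref):
    """Piecewise-linear quantile mapping: rescale `src` so that its
    empirical CDF matches that of `ref`."""
    q = np.linspace(0.0, 100.0, 101)
    src_q = np.percentile(src, q)     # empirical quantiles of the source
    ref_q = np.percentile(ref, q)     # ... and of the reference
    return np.interp(src, src_q, ref_q)
```

As the abstract notes, this imposes the reference's bias and range on the rescaled product; because the mapping is monotone, ranks, and hence the temporal dynamics of the original satellite product, are preserved.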

Journal ArticleDOI
TL;DR: A better understanding of mycoparasitism is offered, and the development of improved biocontrol strains for efficient and environmentally friendly protection of plants is enforced.
Abstract: Mycoparasitism, a lifestyle where one fungus is parasitic on another fungus, has special relevance when the prey is a plant pathogen, providing a strategy for biological control of pests for plant protection. Probably, the most studied biocontrol agents are species of the genus Hypocrea/Trichoderma. Here we report an analysis of the genome sequences of the two biocontrol species Trichoderma atroviride (teleomorph Hypocrea atroviridis) and Trichoderma virens (formerly Gliocladium virens, teleomorph Hypocrea virens), and a comparison with Trichoderma reesei (teleomorph Hypocrea jecorina). These three Trichoderma species display a remarkable conservation of gene order (78 to 96%), and a lack of active mobile elements probably due to repeat-induced point mutation. Several gene families are expanded in the two mycoparasitic species relative to T. reesei or other ascomycetes, and are overrepresented in non-syntenic genome regions. A phylogenetic analysis shows that T. reesei and T. virens are derived relative to T. atroviride. The mycoparasitism-specific genes thus arose in a common Trichoderma ancestor but were subsequently lost in T. reesei. The data offer a better understanding of mycoparasitism, and thus enforce the development of improved biocontrol strains for efficient and environmentally friendly protection of plants.

Journal ArticleDOI
TL;DR: In this paper, the centrality dependence of the charged-particle multiplicity density at midrapidity in Pb-Pb collisions at root s(NN) = 2.76 TeV is presented.
Abstract: The centrality dependence of the charged-particle multiplicity density at midrapidity in Pb-Pb collisions at root s(NN) = 2.76 TeV is presented. The charged-particle density normalized per participating nucleon pair increases by about a factor of 2 from peripheral (70%-80%) to central (0%-5%) collisions. The centrality dependence is found to be similar to that observed at lower collision energies. The data are compared with models based on different mechanisms for particle production in nuclear collisions.

Journal ArticleDOI
K. Aamodt, Betty Abelev, A. Abrahantes Quintana, Dagmar Adamová, +972 more authors (84 institutions)
11 Jul 2011
TL;DR: The first measurement of the triangular v3, quadrangular v4, and pentagonal v5 charged particle flow in Pb-Pb collisions is reported, and a double peaked structure in the two-particle azimuthal correlations is observed, which can be naturally explained from the measured anisotropic flow Fourier coefficients.
Abstract: We report on the first measurement of the triangular v3, quadrangular v4, and pentagonal v5 charged particle flow in Pb-Pb collisions at root s(NN) = 2.76 TeV measured with the ALICE detector at the CERN Large Hadron Collider. We show that the triangular flow can be described in terms of the initial spatial anisotropy and its fluctuations, which provides strong constraints on its origin. In the most central events, where the elliptic flow v2 and v3 have similar magnitude, a double peaked structure in the two-particle azimuthal correlations is observed, which is often interpreted as a Mach cone response to fast partons. We show that this structure can be naturally explained from the measured anisotropic flow Fourier coefficients.

Journal ArticleDOI
TL;DR: In this article, a comprehensive assessment of the reliability of soil moisture estimations from the Advanced SCATterometer (ASCAT) and AMSR-E sensors is carried out by using observed and modelled soil moisture data over 17 sites located in 4 countries across Europe (Italy, Spain, France and Luxembourg).

Journal ArticleDOI
TL;DR: Near infrared (NIR) spectra of wood and wood products contain information regarding their chemical composition and molecular structure, both of which influence physical properties and performance.
Abstract: Near infrared (NIR) spectra of wood and wood products contain information regarding their chemical composition and molecular structure. Both influence physical properties and performance, however, ...

Journal ArticleDOI
TL;DR: This Perspective article focuses on patchy systems characterized by spherical neutral particles with patchy surfaces, and describes most of the patchy particle models that have been developed so far and how their basic features are connected to the physical systems they are meant to investigate.
Abstract: Recently, an increasing experimental effort has been devoted to the synthesis of complex colloidal particles with chemically or physically patterned surfaces and possible specific shapes that are far from spherical. These new colloidal particles with anisotropic interactions are commonly named patchy particles. In this Perspective article, we focus on patchy systems characterized by spherical neutral particles with patchy surfaces. We summarize most of the patchy particle models that have been developed so far and describe how their basic features are connected to the physical systems they are meant to investigate. Patchy models consider particles as hard or soft spheres carrying a finite and small number of attractive sites arranged in precise geometries on the particle's surface. The anisotropy of the interaction and the limited valence in bonding are the salient features determining the collective behavior of such systems. By tuning the number, the interaction parameters and the local arrangements of the patches, it is possible to investigate a wide range of physical phenomena, from different self-assembly processes of proteins, polymers and patchy colloids to the dynamical arrest of gel-like structures. We also draw attention to charged patchy systems: colloidal patchy particles as well as proteins are likely charged, hence the description of the presence of heterogeneously distributed charges on the particle surface is a promising perspective for future investigations.

Journal ArticleDOI
10 Feb 2011
TL;DR: An overview of the existing vehicular channel measurements in a variety of important environments, and the observed channel characteristics (such as delay spreads and Doppler spreads) therein, is provided.
Abstract: To make transportation safer, more efficient, and less harmful to the environment, traffic telematics services are currently being intensely investigated and developed. Such services require dependable wireless vehicle-to-infrastructure and vehicle-to-vehicle communications providing robust connectivity at moderate data rates. The development of such dependable vehicular communication systems and standards requires accurate models of the propagation channel in all relevant environments and scenarios. Key characteristics of vehicular channels are shadowing by other vehicles, high Doppler shifts, and inherent nonstationarity. All have major impact on the data packet transmission reliability and latency. This paper provides an overview of the existing vehicular channel measurements in a variety of important environments, and the observed channel characteristics (such as delay spreads and Doppler spreads) therein. We briefly discuss the available vehicular channel models and their respective merits and deficiencies. Finally, we discuss the implications for wireless system design with a strong focus on IEEE 802.11p. On the road towards a dependable vehicular network, room for improvements in coverage, reliability, scalability, and delay are highlighted, calling for evolutionary improvements in the IEEE 802.11p standard. Multiple antennas at the onboard units and roadside units are recommended to exploit spatial diversity for increased diversity and reliability. Evolutionary improvements in the physical (PHY) and medium access control (MAC) layers are required to yield dependable systems. Extensive references are provided.


Book ChapterDOI
23 Oct 2011
TL;DR: This paper presents CQELS (Continuous Query Evaluation over Linked Streams), a native and adaptive query processor for unified query processing over Linked Stream Data and Linked Data, and demonstrates the efficiency of this approach.
Abstract: In this paper we address the problem of scalable, native and adaptive query processing over Linked Stream Data integrated with Linked Data. Linked Stream Data consists of data generated by stream sources, e.g., sensors, enriched with semantic descriptions, following the standards proposed for Linked Data. This enables the integration of stream data with Linked Data collections and facilitates a wide range of novel applications. Currently available systems use a "black box" approach which delegates the processing to other engines such as stream/event processing engines and SPARQL query processors by translating to their provided languages. As the experimental results described in this paper show, the need for query translation and data transformation, as well as the lack of full control over the query execution, pose major drawbacks in terms of efficiency. To remedy these drawbacks, we present CQELS (Continuous Query Evaluation over Linked Streams), a native and adaptive query processor for unified query processing over Linked Stream Data and Linked Data. In contrast to the existing systems, CQELS uses a "white box" approach and implements the required query operators natively to avoid the overhead and limitations of closed system regimes. CQELS provides a flexible query execution framework with the query processor dynamically adapting to the changes in the input data. During query execution, it continuously reorders operators according to some heuristics to achieve improved query execution in terms of delay and complexity. Moreover, external disk access on large Linked Data collections is reduced with the use of data encoding and caching of intermediate query results. To demonstrate the efficiency of our approach, we present extensive experimental performance evaluations in terms of query execution time, under varied query types, dataset sizes, and number of parallel queries. 
These results show that CQELS outperforms related approaches by orders of magnitude.

Journal ArticleDOI
01 Apr 2011-Energy
TL;DR: In this paper, the authors investigate whether the recent increase in bio-energy production has had a significant impact on the development of agricultural commodity (feedstock) prices, and analyze the most important impact factors, such as bio-fuel production, land use, yields, and feedstock and crude oil prices.

Journal ArticleDOI
TL;DR: In this article, the main types of RES-E promotion schemes and their properties are discussed, and several case studies of different European Member States provide an in-depth analysis of the different RES-E promotion schemes.
Abstract: The core objective of this paper is to elaborate on historically implemented promotion strategies for renewable energy sources and the associated deployment within the European electricity market. At first glance, the historic development of renewable energy sources in the electricity (RES-E) sector is addressed at the Member State and sectoral levels and discussed in light of available RES-E potentials and costs. The specific focus of this paper is promotion strategies for RES-E options, as they are the key driver of an efficient and effective RES-E deployment. The paper therefore depicts the main types of promotion schemes and their properties. Additionally, several case studies of different European Member States provide an in-depth analysis of the different RES-E promotion schemes. In this context, special emphasis is put on the question of effective and efficient promotion scheme designs for different RES-E technologies. Overall, the research conducted leads to the conclusion that technology-specific financial support measures for RES-E performed much more effectively and efficiently than others. Hence, it is not all about the common question of feed-in tariffs vs. quota systems based on tradable green certificates, but more about the design criteria of the implemented RES-E support schemes.

Journal ArticleDOI
TL;DR: A compact 20 Hz repetition-rate mid-IR OPCPA system operating at a central wavelength of 3900 nm with the tail-to-tail spectrum extending over 600 nm and delivering 8 mJ pulses that are compressed to 83 fs opens a range of unprecedented opportunities for tabletop ultrafast science.
Abstract: We demonstrate a compact 20 Hz repetition-rate mid-IR OPCPA system operating at a central wavelength of 3900 nm with the tail-to-tail spectrum extending over 600 nm and delivering 8 mJ pulses that are compressed to 83 fs (<7 optical cycles). Because of the long optical period (∼13 fs) and a high peak power, the system opens a range of unprecedented opportunities for tabletop ultrafast science and is particularly attractive as a driver for a highly efficient generation of ultrafast coherent x-ray continua for biomolecular and element specific imaging.

Journal ArticleDOI
TL;DR: It is suggested that the investigated biodegradable magnesium alloy not only achieves enhanced bone response but also excellent interfacial strength and thus fulfils two critical requirements for bone implant applications.

Journal ArticleDOI
TL;DR: In this paper, the authors calibrate the parameters of a conceptual rainfall-runoff model to six consecutive 5 year periods between 1976 and 2006 for 273 catchments in Austria and analyze the temporal change of the calibrated parameters.
Abstract: Climate impact analyses are usually based on driving hydrological models by future climate scenarios, assuming that the model parameters calibrated to past runoff are representative of the future. In this paper we calibrate the parameters of a conceptual rainfall-runoff model to six consecutive 5 year periods between 1976 and 2006 for 273 catchments in Austria and analyze the temporal change of the calibrated parameters. The calibrated parameters representing snow and soil moisture processes show significant trends. For example, the parameter controlling runoff generation doubled, on average, in the 3 decades. Comparisons of different subregions, comparisons with independent data sets, and analyses of the spatial variability of the model parameters indicate that these trends represent hydrological changes rather than calibration artifacts. The trends can be related to changes in the climatic conditions of the catchments such as higher evapotranspiration and drier catchment conditions in the more recent years. The simulations suggest that the impact on simulated runoff of assuming time-invariant parameters can be very significant. For example, if using the parameters calibrated to 1976-1981 for simulating runoff for the period 2001-2006, the biases of median flows are, on average, 15% and the biases of high flows are about 35%. The errors increase as the time lag between the simulation and calibration periods increases. The implications for hydrologic prediction in general and climate impact analyses in particular are discussed.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: The presented shape descriptor shows that the combination of angle, point-distance and area shape functions gives a significant boost in recognition rate against the baseline descriptor and outperforms the state-of-the-art descriptors in the experimental evaluation on a publicly available dataset of real-world objects in table scene contexts with up to 200 categories.
Abstract: This work addresses the problem of real-time 3D shape based object class recognition, its scaling to many categories and the reliable perception of categories. A novel shape descriptor for partial point clouds based on shape functions is presented, capable of training on synthetic data and classifying objects from a depth sensor in a single partial view in a fast and robust manner. The classification task is stated as a 3D retrieval task finding the nearest neighbors from synthetically generated views of CAD-models to the sensed point cloud with a Kinect-style depth sensor. The presented shape descriptor shows that the combination of angle, point-distance and area shape functions gives a significant boost in recognition rate against the baseline descriptor and outperforms the state-of-the-art descriptors in our experimental evaluation on a publicly available dataset of real-world objects in table scene contexts with up to 200 categories.
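The descriptor's ingredients named above, histograms of angle, point-distance, and area shape functions computed over random point samples, can be sketched as follows. This is a simplified illustration: the binning, normalization, and exact function set of the published descriptor differ, and all names are illustrative.

```python
import numpy as np

def shape_function_descriptor(points, n_samples=2000, bins=16, seed=0):
    """Histogram three shape functions over random point triples sampled
    from a (partial) point cloud: pairwise distance, angle, triangle area."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(points), (n_samples, 3))
    a, b, c = points[idx[:, 0]], points[idx[:, 1]], points[idx[:, 2]]
    u, v = b - a, c - a
    dists = np.linalg.norm(u, axis=1)                     # point-distance
    denom = np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1) + 1e-12
    cosang = np.clip((u * v).sum(axis=1) / denom, -1.0, 1.0)
    angles = np.arccos(cosang)                            # angle at vertex a
    areas = 0.5 * np.linalg.norm(np.cross(u, v), axis=1)  # triangle area
    hists = [np.histogram(f, bins=bins, range=(0.0, f.max() + 1e-12))[0]
             for f in (dists, angles, areas)]
    desc = np.concatenate(hists).astype(float)
    return desc / desc.sum()      # normalized, length 3 * bins
```

Classification then reduces to nearest-neighbor search between the descriptor of a sensed view and descriptors of synthetically rendered CAD-model views, as stated in the abstract.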

Proceedings ArticleDOI
20 Jun 2011
TL;DR: This paper proposes a global sampling method that uses all samples available in the image; to handle the computational complexity introduced by the large number of samples, the sampling task is posed as a correspondence problem.
Abstract: Alpha matting refers to the problem of softly extracting the foreground from an image. Given a trimap (specifying known foreground/background and unknown pixels), a straightforward way to compute the alpha value is to sample some known foreground and background colors for each unknown pixel. Existing sampling-based matting methods often collect samples near the unknown pixels only. They fail if good samples cannot be found nearby. In this paper, we propose a global sampling method that uses all samples available in the image. Our global sample set avoids missing good samples. A simple but effective cost function is defined to tackle the ambiguity in the sample selection process. To handle the computational complexity introduced by the large number of samples, we pose the sampling task as a correspondence problem. The correspondence search is efficiently achieved by generalizing a randomized algorithm previously designed for patch matching [3]. A variety of experiments show that our global sampling method produces both visually and quantitatively high-quality matting results.

Journal ArticleDOI
TL;DR: The measurements made across Europe following the releases from the Fukushima NPP reactors have provided a significant amount of new data on the ratio of the gaseous (131)I fraction to total (131)I, both on a spatial scale and its temporal variation.
Abstract: Radioactive emissions into the atmosphere from the damaged reactors of the Fukushima Dai-ichi nuclear power plant (NPP) started on March 12th, 2011. Among the various radionuclides released, iodine-131 ((131)I) and cesium isotopes ((137)Cs and (134)Cs) were transported across the Pacific toward the North American continent and reached Europe despite dispersion and washout along the route of the contaminated air masses. In Europe, the first signs of the releases were detected 7 days later while the first peak of activity level was observed between March 28th and March 30th. Time variations over a 20-day period and spatial variations across more than 150 sampling locations in Europe made it possible to characterize the contaminated air masses. After the Chernobyl accident, only a few measurements of the gaseous (131)I fraction were conducted compared to the number of measurements for the particulate fraction. Several studies had already pointed out the importance of the gaseous (131)I and the large underestimation of the total (131)I airborne activity level, and subsequent calculations of inhalation dose, if neglected. The measurements made across Europe following the releases from the Fukushima NPP reactors have provided a significant amount of new data on the ratio of the gaseous (131)I fraction to total (131)I, both on a spatial scale and its temporal variation. It can be pointed out that during the Fukushima event, the (134)Cs to (137)Cs ratio proved to be different from that observed after the Chernobyl accident. The data set provided in this paper is the most comprehensive survey of the main relevant airborne radionuclides from the Fukushima reactors, measured across Europe. A rough estimate of the total (131)I inventory that has passed over Europe during this period was <1% of the released amount. According to the measurements, airborne activity levels remain of no concern for public health in Europe.
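Because (134)Cs decays much faster than (137)Cs (half-lives of roughly 2.06 and 30.1 years), a measured (134)Cs/(137)Cs activity ratio can be back-corrected to release time before comparing events such as Fukushima and Chernobyl. The sketch below illustrates only the standard decay-correction arithmetic; the half-life values are rounded, and the function name is an assumption.

```python
import math

def cs_ratio_at_release(measured_ratio, days_since_release,
                        t_half_134=754.0, t_half_137=10994.0):
    """Back-correct a measured 134Cs/137Cs activity ratio to the time of
    release, given half-lives in days (~2.06 y and ~30.1 y)."""
    lam134 = math.log(2) / t_half_134
    lam137 = math.log(2) / t_half_137
    # Each isotope's activity decays as exp(-lambda * t), so the ratio
    # at release is the measured ratio multiplied by the inverse factor.
    return measured_ratio * math.exp((lam134 - lam137) * days_since_release)
```

For example, a ratio measured several weeks after the accident changes only slightly under this correction, which is why the near-unity Fukushima ratio could be contrasted directly with the lower ratio observed after Chernobyl.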

Journal ArticleDOI
David L. Hawksworth1, David L. Hawksworth2, Pedro W. Crous3, Scott A. Redhead, Don R. Reynolds4, Robert A. Samson3, Keith A. Seifert, John W. Taylor4, Michael J. Wingfield5, Özlem Abaci6, Catherine Aime7, Ahmet Asan8, Feng-Yan Bai, Z. Wilhelm de Beer5, Dominik Begerow9, Derya Berikten10, Teun Boekhout3, Peter K. Buchanan11, Treena I. Burgess12, Walter Buzina13, Lei Cai, Paul F. Cannon14, J. Leland Crane15, Ulrike Damm3, Heide Marie Daniel16, Anne D. van Diepeningen3, Irina S. Druzhinina17, Paul S. Dyer18, Ursula Eberhardt3, Jack W. Fell19, Jens Christian Frisvad20, David M. Geiser21, József Geml22, Chirlei Glienke23, Tom Gräfenhan24, Johannes Z. Groenewald3, Marizeth Groenewald3, Johannes de Gruyter25, Eveline Guého-Kellermann, Liang-Dong Guo, David S. Hibbett26, Seung-Beom Hong27, G. Sybren de Hoog1, Jos Houbraken3, Sabine M. Huhndorf28, Kevin D. Hyde, Ahmed Ismail3, Peter R. Johnston11, Duygu Göksay Kadaifciler29, Paul M. Kirk30, Urmas Kõljalg31, Cletus P. Kurtzman32, Paul Emile Lagneau, C. André Lévesque, Xingzhong Liu, Lorenzo Lombard3, Wieland Meyer15, Andrew N. Miller33, David W. Minter, Mohammad Javad Najafzadeh34, Lorelei L. Norvell, Svetlana Ozerskaya35, Rasime Ozic10, Shaun R. Pennycook11, Stephen W. Peterson32, Olga Vinnere Pettersson36, W. Quaedvlieg3, Vincent Robert3, Constantino Ruibal1, Johan Schnürer36, Hans Josef Schroers, Roger G. Shivas, Bernard Slippers5, Henk Spierenburg3, Masako Takashima, Evrim Taskin37, Marco Thines38, Ulf Thrane20, Alev Haliki Uztan6, Marcel van Raak25, János Varga39, Aida Vasco40, Gerard J.M. Verkley3, S.I.R. Videira3, Ronald P. de Vries3, Bevan S. Weir11, Neriman Yilmaz3, Andrey Yurkov9, Ning Zhang 
01 Jun 2011
TL;DR: The Amsterdam Declaration on Fungal Nomenclature recognizes the need for an orderly transition to a single-name nomenclatural system for all fungi, and to provide mechanisms to protect names that otherwise then become endangered.
Abstract: The Amsterdam Declaration on Fungal Nomenclature was agreed at an international symposium convened in Amsterdam on 19–20 April 2011 under the auspices of the International Commission on the Taxonomy of Fungi (ICTF). The purpose of the symposium was to address the issue of whether or how the current system of naming pleomorphic fungi should be maintained or changed now that molecular data are routinely available. The issue is urgent as mycologists currently follow different practices, and no consensus was achieved by a Special Committee appointed in 2005 by the International Botanical Congress to advise on the problem. The Declaration recognizes the need for an orderly transition to a single-name nomenclatural system for all fungi, and for mechanisms to protect names that would otherwise become endangered. That is, priority should be given to the first-described name, except where that is a younger name in general use, in which case the choice of the first author to select a name for a pleomorphic monophyletic genus is to be followed; controversial cases should be referred to a body, such as the ICTF, which will report to the Committee for Fungi. If appropriate, the ICTF could be mandated to promote the implementation of the Declaration. In addition, but not forming part of the Declaration, are reports of discussions held during the symposium on the governance of the nomenclature of fungi, and the naming of fungi known only from an environmental nucleic acid sequence in particular. Possible amendments to the Draft BioCode (2011) to allow for the needs of mycologists are suggested for further consideration, and a possible example of how a fungus only known from the environment might be described is presented.

Book ChapterDOI
20 Sep 2011
TL;DR: Novel techniques for detecting malware samples that exhibit semantically different behavior across different analysis sandboxes are proposed, compatible with any monitoring technology that can be used for dynamic analysis, and completely agnostic to the way that malware achieves evasion.
Abstract: The execution of malware in an instrumented sandbox is a widespread approach for the analysis of malicious code, largely because it sidesteps the difficulties involved in the static analysis of obfuscated code. As malware analysis sandboxes increase in popularity, they are faced with the problem of malicious code detecting the instrumented environment to evade analysis. In the absence of an "undetectable", fully transparent analysis sandbox, defense against sandbox evasion is mostly reactive: sandbox developers and operators tweak their systems to thwart individual evasion techniques as they become aware of them, leading to a never-ending arms race. The goal of this work is to automate one step of this fight: screening malware samples for evasive behavior. Thus, we propose novel techniques for detecting malware samples that exhibit semantically different behavior across different analysis sandboxes. These techniques are compatible with any monitoring technology that can be used for dynamic analysis, and are completely agnostic to the way that malware achieves evasion. We implement the proposed techniques in a tool called Disarm, and demonstrate that it can accurately detect evasive malware, leading to the discovery of previously unknown evasion techniques.
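The screening idea of comparing behavior across sandboxes can be illustrated with a minimal sketch: represent each run as a set of observed behavioral events and flag samples whose cross-sandbox divergence exceeds a threshold. This is only an assumed set-based distance for illustration, not Disarm's actual behavioral comparison; the function names and threshold are hypothetical.

```python
def behavior_distance(profile_a, profile_b):
    """Jaccard distance between two sets of observed behavioral events
    (e.g. files written, registry keys set, domains contacted)."""
    a, b = set(profile_a), set(profile_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def looks_evasive(run_sandbox1, run_sandbox2, threshold=0.4):
    """Flag a sample whose behavior diverges across two sandboxes by more
    than an assumed tolerance for normal run-to-run variation."""
    return behavior_distance(run_sandbox1, run_sandbox2) > threshold
```

In practice the threshold would be calibrated against the variation observed between repeated runs within a single sandbox, so that only genuinely environment-dependent behavior is flagged.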