
Showing papers by "Vienna University of Technology" published in 2018


Journal ArticleDOI
TL;DR: In this paper, the interplay between parity-time symmetry and non-Hermitian physics in optics, plasmonics and optomechanics has been explored both theoretically and experimentally.
Abstract: In recent years, notions drawn from non-Hermitian physics and parity–time (PT) symmetry have attracted considerable attention. In particular, the realization that the interplay between gain and loss can lead to entirely new and unexpected features has initiated an intense research effort to explore non-Hermitian systems both theoretically and experimentally. Here we review recent progress in this emerging field, and provide an outlook to future directions and developments. This Review Article outlines the exploration of the interplay between parity–time symmetry and non-Hermitian physics in optics, plasmonics and optomechanics.

1,831 citations


Journal ArticleDOI
TL;DR: The present review is devoted to summarizing the recent advances (2015–2017) in the field of metal-catalysed group-directed C–H functionalisation.
Abstract: The present review is devoted to summarizing the recent advances (2015–2017) in the field of metal-catalysed group-directed C–H functionalisation. In order to clearly showcase the molecular diversity that can now be accessed by means of directed C–H functionalisation, the whole is organized following the directing groups installed on a substrate. Its aim is to be a comprehensive reference work, where a specific directing group can be easily found, together with the transformations which have been carried out with it. Hence, the primary format of this review is schemes accompanied with a concise explanatory text, in which the directing groups are ordered in sections according to their chemical structure. The schemes feature typical substrates used, the products obtained, as well as the required reaction conditions. Importantly, each example is commented on with respect to the most important positive features and drawbacks, on aspects such as selectivity, substrate scope, reaction conditions, directing group removal, and greenness. The targeted readership are both experts in the field of C–H functionalisation chemistry (to provide a comprehensive overview of the progress made in the last years) and, even more so, all organic chemists who want to introduce the C–H functionalisation way of thinking for the design of straightforward, efficient and step-economic synthetic routes towards molecules of interest to them. Accordingly, this review should be of particular interest also for scientists from the industrial R&D sector. Hence, the overall goal of this review is to promote the application of C–H functionalisation reactions outside the research groups dedicated to method development and to establish it as a valuable reaction archetype in contemporary R&D, comparable to the role cross-coupling reactions play to date.

1,057 citations


Journal ArticleDOI
TL;DR: This paper proposes to exploit the concept of Fog Computing in Healthcare IoT systems by forming a geo-distributed intermediary layer of intelligence between sensor nodes and the Cloud, and presents a prototype of a Smart e-Health Gateway called UT-GATE.

867 citations


Journal ArticleDOI
TL;DR: In this article, the Particle and Heavy Ion Transport Code System (PHITS) 3.02 has been released and the accuracy and the applicable energy ranges of the code were improved.
Abstract: We have upgraded many features of the Particle and Heavy Ion Transport Code System (PHITS) and released the new version as PHITS 3.02. The accuracy and the applicable energy ranges of the code were improved.

749 citations


Journal ArticleDOI
TL;DR: BoltzTraP2 is a software package for calculating a smoothed Fourier expression of periodic functions and the Onsager transport coefficients for extended systems using the linearized Boltzmann transport equation within the relaxation time approximation.

624 citations


Journal ArticleDOI
10 Sep 2018
TL;DR: In this paper, the excitonic properties of transition metal dichalcogenide semiconductors are examined in depth, including bright, dark, localized and interlayer excitons.
Abstract: Two-dimensional group-VI transition metal dichalcogenide semiconductors, such as MoS2, WSe2, and others, exhibit strong light-matter coupling and possess direct band gaps in the infrared and visible spectral regimes, making them potentially interesting candidates for various applications in optics and optoelectronics. Here, we review their optical and optoelectronic properties with emphasis on exciton physics and devices. As excitons are tightly bound in these materials and dominate the optical response even at room-temperature, their properties are examined in depth in the first part of this article. We discuss the remarkably versatile excitonic landscape, including bright, dark, localized and interlayer excitons. In the second part, we provide an overview on the progress in optoelectronic device applications, such as electrically driven light emitters, photovoltaic solar cells, photodetectors, and opto-valleytronic devices, again bearing in mind the prominent role of excitonic effects. We conclude with a brief discussion on challenges that remain to be addressed to exploit the full potential of transition metal dichalcogenide semiconductors in possible exciton-based applications.

465 citations


Journal ArticleDOI
TL;DR: This article briefly summarizes the historic evolution of the term bioink within the field of biofabrication, proposes a simple but general definition of bioinks, and clarifies its distinction from biomaterial inks.
Abstract: Biofabrication aims to fabricate biologically functional products through bioprinting or bioassembly (Groll et al 2016 Biofabrication 8 013001). In biofabrication processes, cells are positioned at defined coordinates in three-dimensional space using automated and computer controlled techniques (Moroni et al 2018 Trends Biotechnol. 36 384-402), usually with the aid of biomaterials that are either (i) directly processed with the cells as suspensions/dispersions, (ii) deposited simultaneously in a separate printing process, or (iii) used as a transient support material. Materials that are suited for biofabrication are often referred to as bioinks and have become an important area of research within the field. In view of this special issue on bioinks, we aim herein to briefly summarize the historic evolution of this term within the field of biofabrication. Furthermore, we propose a simple but general definition of bioinks, and clarify its distinction from biomaterial inks.

461 citations


Journal ArticleDOI
26 Feb 2018
TL;DR: In this paper, the challenges and opportunities of blockchain for business process management (BPM) are outlined and a summary of seven research directions for investigating the application of blockchain technology in the context of BPM is presented.
Abstract: Blockchain technology offers a sizable promise to rethink the way interorganizational business processes are managed because of its potential to realize execution without a central party serving as a single point of trust (and failure). To stimulate research on this promise and the limits thereof, in this article, we outline the challenges and opportunities of blockchain for business process management (BPM). We first reflect how blockchains could be used in the context of the established BPM lifecycle and second how they might become relevant beyond. We conclude our discourse with a summary of seven research directions for investigating the application of blockchain technology in the context of BPM.

456 citations


Journal ArticleDOI
01 Jan 2018
TL;DR: This Review focuses on efforts to combine chemo- and biocatalysts, outlining the opportunities achievable by this approach and also efforts to overcome any incompatibilities between these different systems.
Abstract: The past decade has seen a substantial increase in successful examples of the combination of chemo- and biocatalysis for multistep syntheses. This is driven by obvious advantages such as higher yields, decreased costs, environmental benefits and high selectivity. On the downside, efforts must be undertaken to combine the divergent reaction conditions, reagent tolerance and solvent systems of these ‘different worlds of catalysis’. Owing to progress in enzyme discovery and engineering, as well as in the development of milder and more compatible conditions for operating with various chemocatalysts, many historical limitations can already be overcome. This Review highlights the opportunities available in the chemical space of combined syntheses using prominent examples, but also discusses the current challenges and emerging solutions, keeping in mind the fast progress in transition metal-, organo-, photo-, electro-, hetero- and biocatalysis. Chemical and biological catalysts provide distinct advantages and disadvantages to the synthetic chemist. This Review focuses on efforts to combine chemo- and biocatalysts, outlining the opportunities achievable by this approach and also efforts to overcome any incompatibilities between these different systems.

373 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a review of the state-of-the-art methods for strong electronic correlations, starting with the local, eminently important correlations of dynamical mean field theory (DMFT).
Abstract: Strong electronic correlations pose one of the biggest challenges to solid state theory. We review recently developed methods that address this problem by starting with the local, eminently important correlations of dynamical mean field theory (DMFT). On top of this, non-local correlations on all length scales are generated through Feynman diagrams, with a local two-particle vertex instead of the bare Coulomb interaction as a building block. With these diagrammatic extensions of DMFT long-range charge-, magnetic-, and superconducting fluctuations as well as (quantum) criticality can be addressed in strongly correlated electron systems. We provide an overview of the successes and results achieved---hitherto mainly for model Hamiltonians---and outline future prospects for realistic material calculations.

324 citations


Journal ArticleDOI
TL;DR: In this paper, a phonon laser is steered through an exceptional point (EP) in a compound optomechanical system formed by two coupled resonators, and a linewidth broadening of the mechanical lasing mode generated in one of the resonators when the system approaches the EP is observed.
Abstract: Non-Hermitian physical systems have attracted considerable attention lately for their unconventional behaviour around exceptional points (EPs)—spectral singularities at which eigenvalues and eigenvectors coalesce. In particular, many new EP-related concepts such as unidirectional lasing and invisibility, as well as chiral transmission, have been realized. Given the progress in understanding the physics of EPs in various photonic structures, it is surprising that one of the oldest theoretical predictions associated with them, a remarkable broadening of the laser linewidth at an EP, has been probed only indirectly so far. Here, we fill this gap by steering a phonon laser through an EP in a compound optomechanical system formed by two coupled resonators. We observe a pronounced linewidth broadening of the mechanical lasing mode generated in one of the resonators when the system approaches the EP.

Journal ArticleDOI
TL;DR: A new approach intended to refine the currently most important discrete mapping function, the Vienna Mapping Functions 1 (VMF1), which is successively referred to as VMF3, designed in such a way as to eliminate shortcomings in the empirical coefficients b and c and in the tuning for the specific elevation angle of 3°.
Abstract: Incorrect modeling of troposphere delays is one of the major error sources for space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). Over the years, many approaches have been devised which aim at mapping the delay of radio waves from zenith direction down to the observed elevation angle, so-called mapping functions. This paper contains a new approach intended to refine the currently most important discrete mapping function, the Vienna Mapping Functions 1 (VMF1), which is successively referred to as Vienna Mapping Functions 3 (VMF3). It is designed in such a way as to eliminate shortcomings in the empirical coefficients b and c and in the tuning for the specific elevation angle of 3°. Ray-traced delays of the ray-tracer RADIATE serve as the basis for the calculation of new mapping function coefficients. Comparisons of modeled slant delays demonstrate the ability of VMF3 to approximate the underlying ray-traced delays more accurately than VMF1 does, in particular at low elevation angles. In other words, when requiring highest precision, VMF3 is preferable to VMF1. Aside from revising the discrete form of mapping functions, we also present a new empirical model named Global Pressure and Temperature 3 (GPT3) on a 5° × 5° as well as a 1° × 1° global grid, which is generally based on the same data. Its main components are hydrostatic and wet empirical mapping function coefficients derived from special averaging techniques of the respective (discrete) VMF3 data. In addition, GPT3 also contains a set of meteorological quantities which are adopted as they stand from their predecessor, Global Pressure and Temperature 2 wet. Thus, GPT3 represents a very comprehensive troposphere model which can be used for a series of geodetic as well as meteorological and climatological purposes and is fully consistent with VMF3.
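The continued-fraction form shared by such mapping functions is compact enough to sketch. The following is a minimal illustration of the Herring-type three-coefficient mapping factor; the coefficient values are placeholder numbers of a typical magnitude, not actual VMF1/VMF3 coefficients (which are distributed on discrete grids).

```python
import math

def mapping_factor(elev_rad, a, b, c):
    """Continued-fraction (Herring-form) mapping function: maps a zenith
    delay down to elevation angle elev_rad. Coefficients a, b, c are
    placeholders, not real VMF coefficients."""
    s = math.sin(elev_rad)
    num = 1 + a / (1 + b / (1 + c))
    den = s + a / (s + b / (s + c))
    return num / den

# At zenith (sin e = 1) numerator and denominator coincide, so the factor is exactly 1.
print(mapping_factor(math.pi / 2, 1.2e-3, 2.9e-3, 62.6e-3))
```

For these illustrative coefficients the factor grows to roughly ten at 5° elevation, which is why the low-elevation behaviour is where mapping functions are hardest to get right.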

Journal ArticleDOI
12 Feb 2018
TL;DR: This tutorial paper advocates a recently proposed paradigm for scalable multitarget tracking that is based on message passing or, more concretely, the loopy sum–product algorithm, which provides a highly effective, efficient, and scalable solution to the probabilistic data association problem, a major challenge in multitarget tracking.
Abstract: Situation-aware technologies enabled by multitarget tracking will lead to new services and applications in fields such as autonomous driving, indoor localization, robotic networks, and crowd counting. In this tutorial paper, we advocate a recently proposed paradigm for scalable multitarget tracking that is based on message passing or, more concretely, the loopy sum–product algorithm. This approach has advantages regarding estimation accuracy, computational complexity, and implementation flexibility. Most importantly, it provides a highly effective, efficient, and scalable solution to the probabilistic data association problem, a major challenge in multitarget tracking. This fact makes it attractive for emerging applications requiring real-time operation on resource-limited devices. In addition, the message passing approach is intuitively appealing and suited to nonlinear and non-Gaussian models. We present message-passing-based multitarget tracking methods for single-sensor and multiple-sensor scenarios, and for a known and unknown number of targets. The presented methods can cope with clutter, missed detections, and an unknown association between targets and measurements. We also discuss the integration of message-passing-based probabilistic data association into existing multitarget tracking methods. The superior performance, low complexity, and attractive scaling properties of the presented methods are verified numerically. In addition to simulated data, we use measured data captured by two radar stations with overlapping fields-of-view observing a large number of targets simultaneously.
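The sum–product core of probabilistic data association can be sketched in a few lines. The iteration below follows the well-known Williams–Lau-style message-passing scheme; it is an illustrative reconstruction, not the authors' code, and the likelihood matrix is made up.

```python
import numpy as np

def bp_association(L, n_iter=50):
    """Loopy-BP sketch for probabilistic data association.
    L[i, j]: likelihood that target i generated measurement j; an implicit
    unit weight per target accounts for a missed detection."""
    n_t, n_m = L.shape
    v = np.ones((n_t, n_m))                       # measurement -> target messages
    for _ in range(n_iter):
        s = 1.0 + (L * v).sum(axis=1, keepdims=True)   # per-target normalizer incl. miss
        phi = L / (s - L * v)                     # target -> measurement messages
        u = 1.0 + phi.sum(axis=0, keepdims=True)
        v = 1.0 / (u - phi)
    # marginal association weights: column 0 is the miss hypothesis
    w = np.concatenate([np.ones((n_t, 1)), L * v], axis=1)
    return w / w.sum(axis=1, keepdims=True)

# Two targets, two measurements, strongly diagonal likelihoods (invented numbers):
p = bp_association(np.array([[9.0, 1.0], [1.0, 9.0]]))
```

Each row of `p` sums to one and gives the posterior probability that the target missed detection (column 0) or generated a particular measurement, which is exactly the quantity a tracker needs for its update step.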

Book ChapterDOI
01 Jan 2018
TL;DR: This chapter summarises the state-of-the-art techniques for qualitative and quantitative monitoring of CPS behaviours, presents an overview of some of the important applications, and describes the tools supporting CPS monitoring and compares their main features.
Abstract: The term Cyber-Physical Systems (CPS) typically refers to engineered, physical and biological systems monitored and/or controlled by an embedded computational core. The behaviour of a CPS over time is generally characterised by the evolution of physical quantities, and discrete software and hardware states. In general, these can be mathematically modelled by the evolution of continuous state variables for the physical components interleaved with discrete events. Despite large effort and progress in the exhaustive verification of such hybrid systems, the complexity of CPS models limits formal verification of safety of their behaviour only to small instances. An alternative approach, closer to the practice of simulation and testing, is to monitor and to predict CPS behaviours at simulation-time or at runtime. In this chapter, we summarise the state-of-the-art techniques for qualitative and quantitative monitoring of CPS behaviours. We present an overview of some of the important applications and, finally, we describe the tools supporting CPS monitoring and compare their main features.
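A quantitative monitor in the sense sketched here reduces a specification to a real-valued margin over a trace. The toy example below uses the common min/max robustness semantics for "always" and "eventually" over a finite trace; the signal values are invented.

```python
# Quantitative (robustness) monitoring sketch: the robustness of
# "always, the signal stays above a threshold" over a finite trace is the
# worst-case margin; positive means satisfied, negative means violated.
def rho_always_above(trace, threshold):
    return min(x - threshold for x in trace)

# "Eventually above" is the dual: the best-case margin over the trace.
def rho_eventually_above(trace, threshold):
    return max(x - threshold for x in trace)

trace = [2.0, 1.5, 0.8, 1.2]   # made-up samples of a physical quantity
print(rho_always_above(trace, 1.0))      # negative -> "always" is violated
print(rho_eventually_above(trace, 1.0))  # positive -> "eventually" is satisfied
```

The margin, rather than a plain yes/no verdict, is what makes such monitors useful at simulation time: it quantifies how close a CPS behaviour came to violating its specification.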

Book ChapterDOI
16 Apr 2018
TL;DR: The first complete small-step semantics of EVM bytecode is presented, formalized in the F* proof assistant, obtaining executable code that is successfully validated against the official Ethereum test suite.
Abstract: Smart contracts are programs running on cryptocurrency (e.g., Ethereum) blockchains, whose popularity stems from the possibility to perform financial transactions, such as payments and auctions, in a distributed environment without need for any trusted third party. Given their financial nature, bugs or vulnerabilities in these programs may lead to catastrophic consequences, as witnessed by recent attacks. Unfortunately, programming smart contracts is a delicate task that requires strong expertise: Ethereum smart contracts are written in Solidity, a dedicated language resembling JavaScript, and shipped over the blockchain in the EVM bytecode format. In order to rigorously verify the security of smart contracts, it is of paramount importance to formalize their semantics as well as the security properties of interest, in particular at the level of the bytecode being executed.
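To give a flavour of what a small-step semantics looks like, here is a toy stack machine with a few EVM-like opcodes. This is purely an illustration of the small-step style (one configuration transition per call), not the paper's F* formalization of the real EVM instruction set.

```python
# Toy small-step semantics: a configuration is (code, pc, stack); `step`
# performs exactly one transition, or returns None on a halting state.
def step(state):
    code, pc, stack = state
    op = code[pc]
    if op[0] == "PUSH":
        return (code, pc + 1, stack + [op[1] % 2**256])   # 256-bit words, as in EVM
    if op[0] == "ADD":
        a, b = stack[-1], stack[-2]
        return (code, pc + 1, stack[:-2] + [(a + b) % 2**256])
    if op[0] == "MUL":
        a, b = stack[-1], stack[-2]
        return (code, pc + 1, stack[:-2] + [(a * b) % 2**256])
    if op[0] == "STOP":
        return None
    raise ValueError("invalid opcode")

def run(code):
    """Iterate the step relation to a halting configuration; return the final stack."""
    state = (code, 0, [])
    while True:
        nxt = step(state)
        if nxt is None:
            return state[2]
        state = nxt

print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 10), ("MUL",), ("STOP",)]))
```

The point of the small-step presentation is that security properties can be stated as invariants preserved by every single transition, which is what makes mechanized verification tractable.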

Journal ArticleDOI
TL;DR: This paper reviews 2D p–n junction geometries, focusing on vertical (out-of-plane) and lateral (in-plane) 2D junctions and on mixed-dimensional junctions.
Abstract: Recent research in two-dimensional (2D) materials has boosted a renovated interest in the p–n junction, one of the oldest electrical components which can be used in electronics and optoelectronics. 2D materials offer remarkable flexibility to design novel p–n junction device architectures, not possible with conventional bulk semiconductors. In this Review we thoroughly describe the different 2D p–n junction geometries studied so far, focusing on vertical (out-of-plane) and lateral (in-plane) 2D junctions and on mixed-dimensional junctions. We discuss the assembly methods developed to fabricate 2D p–n junctions making a distinction between top-down and bottom-up approaches. We also revise the literature studying the different applications of these atomically thin p–n junctions in electronic and optoelectronic devices. We discuss experiments on 2D p–n junctions used as current rectifiers, photodetectors, solar cells and light emitting devices. The important electronics and optoelectronics parameters of the discussed devices are listed in a table to facilitate their comparison. We conclude the Review with a critical discussion about the future outlook and challenges of this incipient research field.

Journal ArticleDOI
TL;DR: The main objectives of this benchmarking study are to evaluate the potential of applying TLS in characterizing forests, to clarify the strengths and the weaknesses of TLS as a measure of forest digitization, and to reveal the capability of recent algorithms for tree-attribute extraction.
Abstract: The last two decades have witnessed increasing awareness of the potential of terrestrial laser scanning (TLS) in forest applications in both public and commercial sectors, along with tremendous research efforts and progress. It is time to inspect the achievements of and the remaining barriers to TLS-based forest investigations, so further research and application are clearly orientated in operational uses of TLS. In such context, the international TLS benchmarking project was launched in 2014 by the European Spatial Data Research Organization and coordinated by the Finnish Geospatial Research Institute. The main objectives of this benchmarking study are to evaluate the potential of applying TLS in characterizing forests, to clarify the strengths and the weaknesses of TLS as a measure of forest digitization, and to reveal the capability of recent algorithms for tree-attribute extraction. The project is designed to benchmark the TLS algorithms by processing identical TLS datasets for a standardized set of forest attribute criteria and by evaluating the results through a common procedure respecting reliable references. Benchmarking results reflect large variances in estimating accuracies, which were unveiled through the 18 compared algorithms and through the evaluation framework, i.e., forest complexity categories, TLS data acquisition approaches, tree attributes and evaluation procedures. The evaluation framework includes three new criteria proposed in this benchmarking and the algorithm performances are investigated through combining two or more criteria (e.g., the accuracy of the individual tree attributes are inspected in conjunction with plot-level completeness) in order to reveal algorithms’ overall performance. The results also reveal some best available forest attribute estimates at this time, which clarify the status quo of TLS-based forest investigations. 
Some results are well expected, while some are new, e.g., the variances of estimating accuracies between single-/multi-scan, the principle of the algorithm designs and the possibility of a computer outperforming human operation. With single-scan data, i.e., one hemispherical scan per plot, most of the recent algorithms are capable of achieving stem detection with approximately 75% completeness and 90% correctness in the easy forest stands (easy plots: 600 stems/ha, 20 cm mean DBH). The detection rate decreases when the stem density increases and the average DBH decreases, i.e., 60% completeness with 90% correctness (medium plots: 1000 stem/ha, 15 cm mean DBH) and 30% completeness with 90% correctness (difficult plots: 2000 stems/ha, 10 cm mean DBH). The application of the multi-scan approach, i.e., five scans per plot at the center and four quadrant angles, is more effective in complex stands, increasing the completeness to approximately 90% for medium plots and to approximately 70% for difficult plots, with almost 100% correctness. The results of this benchmarking also show that the TLS-based approaches can provide the estimates of the DBH and the stem curve at a 1–2 cm accuracy that are close to what is required in practical applications, e.g., national forest inventories (NFIs). In terms of algorithm development, a high level of automation is a commonly shared standard, but a bottleneck occurs at stem detection and tree height estimation, especially in multilayer and dense forest stands. The greatest challenge is that even with the multi-scan approach, it is still hard to completely and accurately record stems of all trees in a plot due to the occlusion effects of the trees and bushes in forests. Future development must address the redundant yet incomplete point clouds of forest sample plots and recognize trees more accurately and efficiently. 
It is worth noting that TLS currently provides the best quality terrestrial point clouds in comparison with all other technologies, meaning that all the benchmarks labeled in this paper can also serve as a reference for other terrestrial point clouds sources.
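The completeness and correctness criteria used throughout the benchmark are simple to state in code. Below is a sketch with a greedy nearest-neighbour stem matcher and invented coordinates; the project's actual matching procedure and distance gate may well differ.

```python
import math

def completeness_correctness(ref, det, max_dist=0.5):
    """Plot-level stem-detection metrics: completeness = matched reference
    stems / all reference stems; correctness = matched detections / all
    detections. Greedy nearest-neighbour matching with a distance gate
    (assumed, simplified)."""
    matched_ref, matched_det = set(), set()
    pairs = sorted((math.dist(r, d), i, j)
                   for i, r in enumerate(ref) for j, d in enumerate(det))
    for dist, i, j in pairs:
        if dist > max_dist:
            break
        if i not in matched_ref and j not in matched_det:
            matched_ref.add(i)
            matched_det.add(j)
    return len(matched_ref) / len(ref), len(matched_det) / len(det)

# Invented stem positions: 4 reference stems, 3 detections, one false positive.
ref = [(0, 0), (5, 0), (10, 0), (15, 0)]
det = [(0.1, 0), (5.2, 0.1), (40, 40)]
print(completeness_correctness(ref, det))
```

With these made-up positions two detections match, so completeness is 2/4 and correctness 2/3, illustrating how the two numbers trade off independently.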

Journal ArticleDOI
03 Oct 2018-Nature
TL;DR: For many northern ecosystems the benefits of warmer springs on growing-season ecosystem productivity are effectively compensated for by the accumulation of seasonal water deficits, despite the fact that northern ecosystems are thought to be largely temperature- and radiation-limited.
Abstract: Climate change is shifting the phenological cycles of plants1, thereby altering the functioning of ecosystems, which in turn induces feedbacks to the climate system2. In northern (north of 30° N) ecosystems, warmer springs lead generally to an earlier onset of the growing season3,4 and increased ecosystem productivity early in the season5. In situ6 and regional7–9 studies also provide evidence for lagged effects of spring warmth on plant productivity during the subsequent summer and autumn. However, our current understanding of these lagged effects, including their direction (beneficial or adverse) and geographic distribution, is still very limited. Here we analyse satellite, field-based and modelled data for the period 1982–2011 and show that there are widespread and contrasting lagged productivity responses to spring warmth across northern ecosystems. On the basis of the observational data, we find that roughly 15 per cent of the total study area of about 41 million square kilometres exhibits adverse lagged effects and that roughly 5 per cent of the total study area exhibits beneficial lagged effects. By contrast, current-generation terrestrial carbon-cycle models predict much lower areal fractions of adverse lagged effects (ranging from 1 to 14 per cent) and much higher areal fractions of beneficial lagged effects (ranging from 9 to 54 per cent). We find that elevation and seasonal precipitation patterns largely dictate the geographic pattern and direction of the lagged effects. Inadequate consideration in current models of the effects of the seasonal build-up of water stress on seasonal vegetation growth may therefore be able to explain the differences that we found between our observation-constrained estimates and the model-constrained estimates of lagged effects associated with spring warming. 
Overall, our results suggest that for many northern ecosystems the benefits of warmer springs on growing-season ecosystem productivity are effectively compensated for by the accumulation of seasonal water deficits, despite the fact that northern ecosystems are thought to be largely temperature- and radiation-limited10.

Journal ArticleDOI
TL;DR: The Wigner function has been widely used in quantum information processing and quantum physics, as discussed by the authors, as well as in quantum electronics to model electron transport and in quantum chemistry to calculate the static and dynamical properties of many-body quantum systems.
Abstract: The Wigner function was formulated in 1932 by Eugene Paul Wigner, at a time when quantum mechanics was in its infancy. In doing so, he brought phase space representations into quantum mechanics. However, its unique nature also made it very interesting for classical approaches and for identifying the deviations from classical behavior and the entanglement that can occur in quantum systems. What stands out, though, is the feature to experimentally reconstruct the Wigner function, which provides far more information on the system than can be obtained by any other quantum approach. This feature is particularly important for the field of quantum information processing and quantum physics. However, the Wigner function finds wide-ranging use cases in other dominant and highly active fields as well, such as in quantum electronics—to model the electron transport, in quantum chemistry—to calculate the static and dynamical properties of many-body quantum systems, and in signal processing—to investigate waves passing through certain media. What is peculiar in recent years is a strong increase in applying it: Although originally formulated 86 years ago, only today the full potential of the Wigner function—both in ability and diversity—begins to surface. This review, as well as a growing, dedicated Wigner community, is a testament to this development and gives a broad and concise overview of recent advancements in different fields.
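The object at the centre of this review has a simple defining integral, W(x, p) = (1/π) ∫ ψ*(x+y) ψ(x−y) e^{2ipy} dy with ħ = 1, which can be evaluated directly. The sketch below checks it on the harmonic-oscillator ground state; the wavefunction, grid size and truncation are illustrative choices.

```python
import numpy as np

def wigner_point(psi, x, p, y_max=8.0, n=4001):
    """W(x, p) = (1/pi) * Integral psi*(x+y) psi(x-y) exp(2ipy) dy  (hbar = 1),
    evaluated by a plain Riemann sum on a truncated grid (illustrative)."""
    y = np.linspace(-y_max, y_max, n)
    integrand = np.conj(psi(x + y)) * psi(x - y) * np.exp(2j * p * y)
    return float(np.real(integrand).sum() * (y[1] - y[0])) / np.pi

# Harmonic-oscillator ground state: analytically W(x, p) = exp(-x^2 - p^2) / pi,
# a non-negative Gaussian on phase space.
psi0 = lambda x: np.pi ** -0.25 * np.exp(-x ** 2 / 2)
print(wigner_point(psi0, 0.0, 0.0))   # close to 1/pi
```

Negative values of W for non-Gaussian states are precisely the "deviations from classical behavior" the review refers to, which this pointwise evaluator would expose directly.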

Journal ArticleDOI
TL;DR: This opinion article discusses the emergence of a third strategy in tissue engineering (TE) that integrates the advantages of the two traditional approaches while being clearly distinct from them.

Journal ArticleDOI
TL;DR: This review presents the most important developments in single-cell, 2D and 3D microfluidic cell culture systems for studying cell-to-cell interactions published over the last 6 years, with a focus on cancer research and immunotherapy, vascular models and neuroscience.
Abstract: Microfluidic cell cultures are ideally positioned to become the next generation of in vitro diagnostic tools for biomedical research, where key biological processes such as cell signalling and dynamic cell-to-cell interactions can be reliably analysed under reproducible physiological cell culture conditions. In the last decade, a large number of microfluidic cell analysis systems have been developed for a variety of applications including drug target optimization, drug screening and toxicological testing. More recently, advanced in vitro microfluidic cell culture systems have emerged that are capable of replicating the complex three-dimensional architectures of tissues and organs and thus represent valid biological models for investigating the mechanism and function of human tissue structures, as well as studying the onset and progression of diseases such as cancer. In this review, we present the most important developments in single-cell, 2D and 3D microfluidic cell culture systems for studying cell-to-cell interactions published over the last 6 years, with a focus on cancer research and immunotherapy, vascular models and neuroscience. In addition, the current technological development of microdevices with more advanced physiological cell microenvironments that integrate multiple organ models, namely, the so-called body-, human- and multi-organ-on-a-chip, is reviewed.

Journal ArticleDOI
TL;DR: In this paper, the authors present a comprehensive analysis of biomedical image analysis challenges conducted up to now and demonstrate the importance of challenges and show that the lack of quality control has critical consequences.
Abstract: International challenges have become the standard for validation of biomedical image analysis methods. Given their scientific impact, it is surprising that a critical analysis of common practices related to the organization of challenges has not yet been performed. In this paper, we present a comprehensive analysis of biomedical image analysis challenges conducted up to now. We demonstrate the importance of challenges and show that the lack of quality control has critical consequences. First, reproducibility and interpretation of the results are often hampered as only a fraction of relevant information is typically provided. Second, the rank of an algorithm is generally not robust to a number of variables such as the test data used for validation, the ranking scheme applied and the observers that make the reference annotations. To overcome these problems, we recommend best practice guidelines and define open research questions to be addressed in the future.
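The ranking-robustness point lends itself to a small experiment: resample the test cases with replacement and see how often the "winner" changes. The data and the win-rate criterion below are invented for illustration, not the paper's analysis pipeline.

```python
import random

def rank_stability(scores, n_boot=2000, seed=0):
    """Bootstrap sketch of ranking robustness: resample test cases with
    replacement and record how often each algorithm is ranked first.
    scores[alg] = per-test-case metric values (higher is better)."""
    rng = random.Random(seed)
    algs = list(scores)
    n_cases = len(next(iter(scores.values())))
    wins = {a: 0 for a in algs}
    for _ in range(n_boot):
        idx = [rng.randrange(n_cases) for _ in range(n_cases)]
        means = {a: sum(scores[a][i] for i in idx) / n_cases for a in algs}
        wins[max(means, key=means.get)] += 1
    return {a: wins[a] / n_boot for a in algs}

# Two hypothetical algorithms with nearly identical mean performance:
scores = {"A": [0.80, 0.70, 0.90, 0.60], "B": [0.78, 0.72, 0.88, 0.63]}
print(rank_stability(scores))
```

When mean scores are this close, both algorithms win a substantial fraction of bootstrap replicates, which is exactly the non-robustness of a single leaderboard rank that the paper warns about.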

Journal ArticleDOI
TL;DR: The potential of Sentinel-1 VV and VH backscatter and their ratio VH/VV, the cross ratio (CR), to monitor crop conditions is assessed; the study demonstrates the large potential of microwave indices for vegetation monitoring of VWC and phenology.
Abstract: Crop monitoring is of great importance for e.g., yield prediction and increasing water use efficiency. The Copernicus Sentinel-1 mission operated by the European Space Agency provides the opportunity to monitor Earth’s surface using radar at high spatial and temporal resolution. Sentinel-1’s Synthetic Aperture Radar provides co- and cross-polarized backscatter, enabling the calculation of microwave indices. In this study, we assess the potential of Sentinel-1 VV and VH backscatter and their ratio VH/VV, the cross ratio (CR), to monitor crop conditions. A quantitative assessment is provided based on in situ reference data of vegetation variables for different crops under varying meteorological conditions. Vegetation Water Content (VWC), biomass, Leaf Area Index (LAI) and height are measured in situ for oilseed-rape, corn and winter cereals at different fields during two growing seasons. To quantify the sensitivity of backscatter and microwave indices to vegetation dynamics, linear and exponential models and machine learning methods have been applied to the Sentinel-1 data and in situ measurements. Using an exponential model, the CR can account for 87% and 63% of the variability in VWC for corn and winter cereals. In oilseed-rape, the coefficient of determination (R²) is lower (R² = 0.34) due to the large difference in VWC between the two growing seasons and changes in vegetation structure that affect backscatter. Findings from the Random Forest analysis, which uses backscatter, microwave indices and soil moisture as input variables, show that CR is by and large the most important variable to estimate VWC. This study demonstrates, based on a quantitative analysis, the large potential of microwave indices for vegetation monitoring of VWC and phenology.
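Computing the cross ratio itself is straightforward once one recalls that backscatter is usually distributed in dB: a ratio of linear powers becomes a difference in dB. The exponential VWC model fit below uses invented numbers purely to show the linearization step, not the study's data.

```python
import numpy as np

def cross_ratio_db(vh_db, vv_db):
    """Cross ratio CR = VH/VV from backscatter given in dB: a ratio of
    linear powers is a difference in dB. Returns CR in dB."""
    return np.asarray(vh_db) - np.asarray(vv_db)

# Hypothetical fit of an exponential model VWC = a * exp(b * CR), as used in
# such studies: linearize with log(VWC) and fit a least-squares line.
cr = np.array([-14.0, -12.0, -10.0, -8.0, -6.0])   # CR in dB (made-up values)
vwc = np.array([0.4, 0.9, 1.8, 4.1, 8.0])          # kg/m^2 (made-up values)
b, log_a = np.polyfit(cr, np.log(vwc), 1)          # slope, intercept
a = np.exp(log_a)
pred = a * np.exp(b * cr)                          # back-transformed predictions
```

A positive fitted slope `b` encodes the expected behaviour: as vegetation water content grows, volume scattering raises VH relative to VV and the cross ratio increases.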

Journal ArticleDOI
TL;DR: The purpose of this special issue is to analyze the top concerns in IoT technologies that pertain to smart sensors for health care applications; particularly applications targeted at individualized tele-health interventions with the goal of enabling healthier ways of life.

Journal ArticleDOI
13 Nov 2018
TL;DR: In this paper, the authors argue that two counterintuitive dynamics should be taken into account when considering the expansion of reservoirs to cope with droughts and water shortages in many places around the world.
Abstract: The expansion of reservoirs to cope with droughts and water shortages is hotly debated in many places around the world. We argue that there are two counterintuitive dynamics that should be considered ...

Journal ArticleDOI
01 Nov 2018-Nature
TL;DR: The results establish universal scaling dynamics in an isolated quantum many-body system, which is a crucial step towards characterizing time evolution far from equilibrium in terms of universality classes.
Abstract: Understanding the behaviour of isolated quantum systems far from equilibrium and their equilibration is one of the most pressing problems in quantum many-body physics1,2. There is strong theoretical evidence that sufficiently far from equilibrium a wide variety of systems—including the early Universe after inflation3–6, quark–gluon matter generated in heavy-ion collisions7–9, and cold quantum gases4,10–14—exhibit universal scaling in time and space during their evolution, independent of their initial state or microscale properties. However, direct experimental evidence is lacking. Here we demonstrate universal scaling in the time-evolving momentum distribution of an isolated, far-from-equilibrium, one-dimensional Bose gas, which emerges from a three-dimensional ultracold Bose gas by means of a strong cooling quench. Within the scaling regime, the time evolution of the system at low momenta is described by a time-independent, universal function and a single scaling exponent. The non-equilibrium scaling describes the transport of an emergent conserved quantity towards low momenta, which eventually leads to the build-up of a quasi-condensate. Our results establish universal scaling dynamics in an isolated quantum many-body system, which is a crucial step towards characterizing time evolution far from equilibrium in terms of universality classes. Universality would open the possibility of using, for example, cold-atom set-ups at the lowest energies to simulate important aspects of the dynamics of currently inaccessible systems at the highest energies, such as those encountered in the inflationary early Universe.

Book ChapterDOI
01 Jan 2018
TL;DR: The aim of this chapter is to act as a primer for those wanting to learn about Runtime Verification, providing an overview of the main specification languages used for RV and introducing the standard terminology necessary to describe the monitoring problem.
Abstract: The aim of this chapter is to act as a primer for those wanting to learn about Runtime Verification (RV). We start by providing an overview of the main specification languages used for RV. We then introduce the standard terminology necessary to describe the monitoring problem, covering the pragmatic issues of monitoring and instrumentation, and discussing extensively the monitorability problem.
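The monitoring problem the chapter introduces can be made concrete with a minimal sketch (illustrative only, not taken from the chapter): a finite-state monitor observing a trace of events and checking the safety property "no 'write' event may occur after a 'close' event". Following the usual RV convention, a safety violation is a definitive verdict, while an unviolated prefix remains inconclusive.

```python
# Minimal runtime-verification monitor sketch (hypothetical property and
# event names). Verdicts: "?" = inconclusive so far, "bottom" = violated.

class NoWriteAfterClose:
    def __init__(self):
        self.closed = False
        self.verdict = "?"

    def step(self, event):
        """Consume one event of the observed trace and return the verdict."""
        if self.verdict == "bottom":
            return self.verdict        # safety violations are irrevocable
        if event == "close":
            self.closed = True
        elif event == "write" and self.closed:
            self.verdict = "bottom"    # property violated on this prefix
        return self.verdict

monitor = NoWriteAfterClose()
for e in ["open", "write", "close", "write"]:
    v = monitor.step(e)
print(v)  # the final 'write' after 'close' violates the property
```

Instrumentation, in the chapter's terminology, is whatever mechanism feeds the running system's events into `step`; monitorability asks which properties admit a monitor that can ever reach a definitive verdict.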

Journal ArticleDOI
TL;DR: This work aims to identify the best available technologies for material recovery in order to avoid landfill solutions; six case studies are presented and discussed: recycling into lightweight aggregates, glass-ceramics and cement, and recovery of zinc, rare metals and salts.

Journal ArticleDOI
TL;DR: This article presents a series of key research challenges that are essential to advance the development of LBS, setting a research agenda for LBS to ‘positively’ shape the future of the authors' mobile information society.
Abstract: We are now living in a mobile information era, which is fundamentally changing science and society. Location Based Services (LBS), which deliver information depending on the location of the (mobile...

Journal ArticleDOI
TL;DR: In this article, the authors evaluated 18 phosphorus recovery technologies in terms of cumulative energy demand, global warming potential, and acidification potential with the methodology of life cycle analysis, and compared them with other environmental criteria, i.e. recovery potential, heavy metal and organic micropollutant decontamination potential and fertilizer efficiency, to determine their overall environmental performance.
Abstract: Phosphorus mining from phosphate rock is associated with economic as well as environmental concerns. Through phosphorus recovery from municipal wastewater, countries could decrease their dependency on the global phosphate rock market; however, this could conceivably increase the environmental impacts of fertilizer production. In this work, 18 phosphorus recovery technologies are evaluated in terms of cumulative energy demand, global warming potential and acidification potential with the methodology of life cycle analysis (LCA). These indicators are then contrasted with other environmental criteria, i.e. recovery potential, heavy metal and organic micropollutant decontamination potential and fertilizer efficiency, to determine their overall environmental performance. The LCA shows that a broad spectrum of changes in gaseous emissions and energy demand can be expected through the implementation of P recovery from wastewater. Linkage to further environmental performance results exposes certain trade-offs for the different technologies. Recovery from the liquid phase has mostly positive or comparably little impact on emissions and energy demand, but the low recovery potential contradicts the demand for efficient recycling rates. For recovery from sewage sludge, those technologies that already are or are close to being applied full-scale are associated with comparatively high emissions and energy demand. Recovery from sewage sludge ash shows varying results, partly revealing trade-offs between heavy metal decontamination, emissions and energy demand. Nevertheless, recovery from ash is correlated with the highest potential for an efficient recycling of phosphorus. Further research should include implications of local infrastructures and legal frameworks to determine economically and environmentally optimised P recovery and recycling concepts.