
Showing papers by "Vienna University of Technology" published in 2020


Journal ArticleDOI
TL;DR: The WIEN2k program solves the Kohn-Sham equations of density functional theory with the augmented plane wave plus local orbitals (APW+lo) method; the review describes its usage and capabilities, including the available options, properties, and approximations for the exchange-correlation functional.
Abstract: The WIEN2k program is based on the augmented plane wave plus local orbitals (APW+lo) method to solve the Kohn-Sham equations of density functional theory. The APW+lo method, which considers all electrons (core and valence) self-consistently in a full-potential treatment, is implemented very efficiently in WIEN2k, since various types of parallelization are available and many optimized numerical libraries can be used. Many properties can be calculated, ranging from the basic ones, such as the electronic band structure or the optimized atomic structure, to more specialized ones such as the nuclear magnetic resonance shielding tensor or the electric polarization. After a brief presentation of the APW+lo method, we review the usage, capabilities, and features of WIEN2k (version 19) in detail. The various options, properties, and available approximations for the exchange-correlation functional, as well as the external libraries or programs that can be used with WIEN2k, are mentioned. References to relevant applications and some examples are also given.
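As a quick orientation for non-specialists (standard textbook notation, not a formula reproduced from the paper), the Kohn-Sham equations that WIEN2k solves self-consistently read

    \Big[-\tfrac{\hbar^{2}}{2m}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}(\mathbf{r}) + v_{\mathrm{xc}}(\mathbf{r})\Big]\,\psi_i(\mathbf{r}) = \varepsilon_i\,\psi_i(\mathbf{r}),

where v_xc is the exchange-correlation potential supplied by the approximations mentioned above; in the APW+lo method the orbitals ψ_i are expanded in plane waves in the interstitial region, augmented by atomic-like functions inside spheres around the atoms and supplemented by local orbitals.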

1,016 citations


Journal ArticleDOI
04 Mar 2020-Nature
TL;DR: It is demonstrated that an image sensor can itself constitute an artificial neural network (ANN) that simultaneously senses and processes optical images without latency, and the sensor is trained to classify and encode images with high throughput.
Abstract: Machine vision technology has taken huge leaps in recent years, and is now becoming an integral part of various intelligent systems, including autonomous vehicles and robotics. Usually, visual information is captured by a frame-based camera, converted into a digital format and processed afterwards using a machine-learning algorithm such as an artificial neural network (ANN)1. The large amount of (mostly redundant) data passed through the entire signal chain, however, results in low frame rates and high power consumption. Various visual data preprocessing techniques have thus been developed2-7 to increase the efficiency of the subsequent signal processing in an ANN. Here we demonstrate that an image sensor can itself constitute an ANN that can simultaneously sense and process optical images without latency. Our device is based on a reconfigurable two-dimensional (2D) semiconductor8,9 photodiode10-12 array, and the synaptic weights of the network are stored in a continuously tunable photoresponsivity matrix. We demonstrate both supervised and unsupervised learning and train the sensor to classify and encode images that are optically projected onto the chip with a throughput of 20 million bins per second.
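As a conceptual illustration only (not the authors' implementation), the key idea — synaptic weights stored in a continuously tunable photoresponsivity matrix — means that the summed photocurrents of the array realize a matrix-vector product directly in the sensor. A minimal numerical sketch in Python, with array sizes chosen arbitrarily:

    import numpy as np

    # Assumed toy dimensions: N photodiodes (subpixels) and M output neurons.
    N, M = 27, 3
    rng = np.random.default_rng(0)

    P = rng.random(N)                   # optical power falling on each photodiode
    R = rng.normal(size=(M, N))         # programmable photoresponsivities = synaptic weights

    I_out = R @ P                       # summed photocurrents: one linear ANN layer, evaluated "in" the sensor
    prediction = int(np.argmax(I_out))  # classify the projected image by the largest output current
    print(I_out, prediction)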

436 citations


Journal ArticleDOI
TL;DR: In this paper, the authors divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge), and provide insights into this new interdisciplinary field from a broader perspective.
Abstract: Along with the rapid developments in communication technologies and the surge in the use of mobile devices, a brand-new computation paradigm, edge computing, is surging in popularity. Meanwhile, the artificial intelligence (AI) applications are thriving with the breakthroughs in deep learning and the many improvements in hardware architectures. Billions of data bytes, generated at the network edge, put massive demands on data processing and structural optimization. Thus, there exists a strong demand to integrate edge computing and AI, which gives birth to edge intelligence. In this article, we divide edge intelligence into AI for edge (intelligence-enabled edge computing) and AI on edge (artificial intelligence on edge). The former focuses on providing more optimal solutions to key problems in edge computing with the help of popular and effective AI technologies while the latter studies how to carry out the entire process of building AI models, i.e., model training and inference, on the edge. This article provides insights into this new interdisciplinary field from a broader perspective. It discusses the core concepts and the research roadmap, which should provide the necessary background for potential future research initiatives in edge intelligence.

343 citations


Journal ArticleDOI
TL;DR: This viewpoint article argues that the impacts of the novel coronavirus COVID-19 call for transformative e-Tourism research, and presents six pillars to guide scholars in their efforts to transform e- Tourism through their research, including historicity, reflexivity, equity, transparency, plurality, and creativity.
Abstract: This viewpoint article argues that the impacts of the novel coronavirus COVID-19 call for transformative e-Tourism research. We are at a crossroads where one road takes us to e-Tourism as it was before the crisis, whereas the other holds the potential to transform e-Tourism. To realize this potential, e-Tourism research needs to challenge existing paradigms and critically evaluate its ontological and epistemological foundations. In light of the paramount importance to rethink contemporary science, growth, and technology paradigms, we present six pillars to guide scholars in their efforts to transform e-Tourism through their research, including historicity, reflexivity, equity, transparency, plurality, and creativity. We conclude the paper with a call to the e-Tourism research community to embrace transformative research.

308 citations


Journal ArticleDOI
TL;DR: The JEFF-3.3 data library as mentioned in this paper is a joint evaluated fission and fusion nuclear data library 3.3 which includes new fission yields, prompt fission neutron spectra and average number of neutrons per fission.
Abstract: The joint evaluated fission and fusion nuclear data library 3.3 is described. New evaluations for neutron-induced interactions with the major actinides ²³⁵U, ²³⁸U and ²³⁹Pu, on ²⁴¹Am and ²³Na, ⁵⁹Ni, Cr, Cu, Zr, Cd, Hf, W, Au, Pb and Bi are presented. It includes new fission yields, prompt fission neutron spectra and average number of neutrons per fission. In addition, new data for radioactive decay, thermal neutron scattering, gamma-ray emission, neutron activation, delayed neutrons and displacement damage are presented. JEFF-3.3 was complemented by files from the TENDL project. The libraries for photon, proton, deuteron, triton, helion and alpha-particle induced reactions are from TENDL-2017. The demands for uncertainty quantification in modeling led to many new covariance data for the evaluations. A comparison between results from model calculations using the JEFF-3.3 library and those from benchmark experiments for criticality, delayed neutron yields, shielding and decay heat reveals that JEFF-3.3 performs very well for a wide range of nuclear technology applications, in particular nuclear energy.

262 citations


Journal ArticleDOI
TL;DR: This review focuses on an area of ceria defect chemistry which has received comparatively little attention - defect-induced local distortions and short-range associates, which are non-periodic in nature and hence not readily detected by conventional X-ray powder diffraction.
Abstract: Ceria and its solid solutions play a vital role in several industrial processes and devices. These include solar energy-to-fuel conversion, solid oxide fuel and electrolyzer cells, memristors, chemical looping combustion, automotive 3-way catalysts, catalytic surface coatings, supercapacitors and recently, electrostrictive devices. An attractive feature of ceria is the possibility of tuning defect-chemistry to increase the effectiveness of the materials in application areas. Years of study have revealed many features of the long-range, macroscopic characteristics of ceria and its derivatives. In this review we focus on an area of ceria defect chemistry which has received comparatively little attention - defect-induced local distortions and short-range associates. These features are non-periodic in nature and hence not readily detected by conventional X-ray powder diffraction. We compile the relevant literature data obtained by thermodynamic analysis, Raman spectroscopy, and X-ray absorption fine structure (XAFS) spectroscopy. Each of these techniques provides insight into material behavior without reliance on long-range periodic symmetry. From thermodynamic analyses, association of defects is inferred. From XAFS, an element-specific probe, local structure around selected atomic species is obtained, whereas from Raman spectroscopy, local symmetry breaking and vibrational changes in bonding patterns are detected. We note that, for undoped ceria and its solid solutions, the relationship between short range order and cation-oxygen-vacancy coordination remains a subject of active debate. Beyond collating the sometimes contradictory data in the literature, we contribute to this debate by reporting new spectroscopy results and analysis, with the expectation that increasing our fundamental understanding of this relationship will lead to an ability to predict and tailor the defect-chemistry of ceria-based materials for practical applications.
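For readers new to defect chemistry, the oxygen vacancies and reduced cations whose association is debated above arise from the standard reduction reaction of ceria, written here in Kröger-Vink notation (a textbook relation, not a new result of this review):

    \mathrm{O_{O}^{\times} + 2\,Ce_{Ce}^{\times} \;\longrightarrow\; V_{O}^{\bullet\bullet} + 2\,Ce_{Ce}^{\prime} + \tfrac{1}{2}\,O_{2}(g)}

The short-range associates discussed in the review are then clusters such as (Ce_Ce′–V_O••) pairs, whose non-periodic arrangement is precisely what conventional X-ray powder diffraction fails to resolve.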

233 citations


Journal ArticleDOI
TL;DR: The authors review the current state-of-the-art and the future prospects of suitable insulators for 2D technologies and possible solution scenarios like the creation of clean interfaces, production of native oxides from 2D semiconductors and more intensive studies on crystalline insulators.
Abstract: Nanoelectronic devices based on 2D materials are far from delivering their full theoretical performance potential due to the lack of scalable insulators. Amorphous oxides that work well in silicon technology have ill-defined interfaces with 2D materials and numerous defects, while 2D hexagonal boron nitride does not meet required dielectric specifications. The list of suitable alternative insulators is currently very limited. Thus, a radically different mindset with respect to suitable insulators for 2D technologies may be required. We review possible solution scenarios like the creation of clean interfaces, production of native oxides from 2D semiconductors and more intensive studies on crystalline insulators. The lack of scalable, high-quality insulators is a major problem hindering the progress on electronic devices built from 2D materials. Here, the authors review the current state-of-the-art and the future prospects of suitable insulators for 2D technologies.

204 citations


Journal ArticleDOI
28 May 2020
TL;DR: A complete framework of 5G technologies for smart railways is developed, covering spatial modulation, fast channel estimation, cell-free massive multiple-input multiple-output (MIMO), mmWave, efficient beamforming, wireless backhaul, ultrareliable low-latency communications, and enhanced handover strategies.
Abstract: Railway communications has attracted significant attention from both academia and industries due to the booming development of railways, especially high-speed railways (HSRs). To be in line with the vision of future smart rail communications, the rail transport industry needs to develop innovative communication network architectures and key technologies that ensure high-quality transmissions for both passengers and railway operations and control systems under high mobility, with safety, eco-friendliness, comfort, transparency, predictability, and reliability. Fifth-generation (5G) technologies could be a promising solution to dealing with the design challenges of high reliability and high throughput for HSR communications. Based on our in-depth analysis of smart rail traffic services and communication scenarios, we propose a network slicing architecture for a 5G-based HSR system. With a ray tracing-based analysis of radio wave propagation characteristics and channel models for millimeter wave (mmWave) bands in railway scenarios, we draw important conclusions with regard to appropriate operating frequency bands for HSRs. Specifically, we have identified significant 5G-based key technologies for HSRs, such as spatial modulation, fast channel estimation, cell-free massive multiple-input–multiple-output (MIMO), mmWave, efficient beamforming, wireless backhaul, ultrareliable low latency communications, and enhanced handover strategies. Based on these technologies, we have developed a complete framework of 5G technologies for smart railways and pointed out exciting future research directions.

200 citations


Journal ArticleDOI
01 Jan 2020-Talanta
TL;DR: The recent advances and improvements in the electrochemical detection of lung cancer biomarkers are reviewed, showing that electrochemical methods are very attractive and useful for lung cancer detection.

197 citations


Journal ArticleDOI
10 Jul 2020
TL;DR: A conceptual framework for the identification of fungi is provided, encouraging the approach of integrative (polyphasic) taxonomy for species delimitation, i.e. the combination of genealogy (phylogeny), phenotype, and reproductive biology, while phenotype-based approaches remain an important strategy to catalog the global diversity of fungi and establish initial species hypotheses.
Abstract: True fungi (Fungi) and fungus-like organisms (e.g. Mycetozoa, Oomycota) constitute the second largest group of organisms based on global richness estimates, with around 3 million predicted species. Compared to plants and animals, fungi have simple body plans with often morphologically and ecologically obscure structures. This poses challenges for accurate and precise identifications. Here we provide a conceptual framework for the identification of fungi, encouraging the approach of integrative (polyphasic) taxonomy for species delimitation, i.e. the combination of genealogy (phylogeny), phenotype (including autecology), and reproductive biology (when feasible). This allows objective evaluation of diagnostic characters, either phenotypic or molecular or both. Verification of identifications is crucial but often neglected. Because of clade-specific evolutionary histories, there is currently no single tool for the identification of fungi, although DNA barcoding using the internal transcribed spacer (ITS) remains a first diagnosis, particularly in metabarcoding studies. Secondary DNA barcodes are increasingly implemented for groups where ITS does not provide sufficient precision. Issues of pairwise sequence similarity-based identifications and OTU clustering are discussed, and multiple sequence alignment-based phylogenetic approaches with subsequent verification are recommended as more accurate alternatives. In metabarcoding approaches, the trade-off between speed and accuracy and precision of molecular identifications must be carefully considered. Intragenomic variation of the ITS and other barcoding markers should be properly documented, as phylotype diversity is not necessarily a proxy of species richness. Important strategies to improve molecular identification of fungi are: (1) broadly document intraspecific and intragenomic variation of barcoding markers; (2) substantially expand sequence repositories, focusing on undersampled clades and missing taxa; (3) improve curation of sequence labels in primary repositories and substantially increase the number of sequences based on verified material; (4) link sequence data to digital information of voucher specimens including imagery. In parallel, technological improvements to genome sequencing offer promising alternatives to DNA barcoding in the future. Despite the prevalence of DNA-based fungal taxonomy, phenotype-based approaches remain an important strategy to catalog the global diversity of fungi and establish initial species hypotheses.
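To illustrate the pairwise sequence-similarity identification whose limitations are discussed above: in its simplest form, an aligned ITS query is compared with reference sequences by percent identity and a fixed cut-off. A generic sketch (hypothetical sequences; the commonly cited 97% ITS threshold is itself debated):

    def percent_identity(a: str, b: str) -> float:
        """Percent identity between two pre-aligned sequences of equal length (gaps count as mismatches)."""
        assert len(a) == len(b)
        matches = sum(x == y and x != "-" for x, y in zip(a, b))
        return 100.0 * matches / len(a)

    query = "ACCTGCGGA-GGATCATTA"        # hypothetical aligned ITS fragment
    reference = "ACCTGCGGAAGGATCATTA"    # hypothetical reference sequence
    pid = percent_identity(query, reference)
    print(f"{pid:.1f}% identity -> same species? {pid >= 97.0}")  # 97% is a common, but debated, ITS cut-off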

191 citations


Journal ArticleDOI
TL;DR: A yeast species used to produce proteins and chemicals is engineered to grow solely on the greenhouse gas CO2, and may promote sustainability by sequestering this greenhouse gas and by avoiding consumption of an organic feedstock with alternative uses in food production.
Abstract: The methylotrophic yeast Pichia pastoris is widely used in the manufacture of industrial enzymes and pharmaceuticals. Like most biotechnological production hosts, P. pastoris is heterotrophic and grows on organic feedstocks that have competing uses in the production of food and animal feed. In a step toward more sustainable industrial processes, we describe the conversion of P. pastoris into an autotroph that grows on CO2. By addition of eight heterologous genes and deletion of three native genes, we engineer the peroxisomal methanol-assimilation pathway of P. pastoris into a CO2-fixation pathway resembling the Calvin–Benson–Bassham cycle, the predominant natural CO2-fixation pathway. The resulting strain can grow continuously with CO2 as a sole carbon source at a µmax of 0.008 h−1. The specific growth rate was further improved to 0.018 h−1 by adaptive laboratory evolution. This engineered P. pastoris strain may promote sustainability by sequestering the greenhouse gas CO2, and by avoiding consumption of an organic feedstock with alternative uses in food production. A yeast species used to produce proteins and chemicals is engineered to grow solely on the greenhouse gas CO2.
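To put the reported growth rates in perspective (simple arithmetic on the numbers above, not an additional result of the paper), the corresponding doubling times are

    t_d = \frac{\ln 2}{\mu} \approx \frac{0.693}{0.008\ \mathrm{h^{-1}}} \approx 87\ \mathrm{h} \qquad\text{and}\qquad t_d \approx \frac{0.693}{0.018\ \mathrm{h^{-1}}} \approx 39\ \mathrm{h},

so adaptive laboratory evolution roughly halved the doubling time of the engineered autotrophic strain.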

Book ChapterDOI
Matej Kristan1, Ales Leonardis2, Jiří Matas3, Michael Felsberg4, Roman Pflugfelder5, Roman Pflugfelder6, Joni-Kristian Kamarainen, Martin Danelljan7, Luka Čehovin Zajc1, Alan Lukežič1, Ondrej Drbohlav3, Linbo He4, Yushan Zhang4, Yushan Zhang8, Song Yan, Jinyu Yang2, Gustavo Fernandez5, Alexander G. Hauptmann9, Alireza Memarmoghadam10, Alvaro Garcia-Martin11, Andreas Robinson4, Anton Varfolomieiev12, Awet Haileslassie Gebrehiwot11, Bedirhan Uzun13, Bin Yan14, Bing Li15, Chen Qian, Chi-Yi Tsai16, Christian Micheloni17, Dong Wang14, Fei Wang, Fei Xie18, Felix Järemo Lawin4, Fredrik K. Gustafsson19, Gian Luca Foresti17, Goutam Bhat7, Guangqi Chen, Haibin Ling20, Haitao Zhang, Hakan Cevikalp13, Haojie Zhao14, Haoran Bai21, Hari Chandana Kuchibhotla22, Hasan Saribas, Heng Fan20, Hossein Ghanei-Yakhdan23, Houqiang Li24, Houwen Peng25, Huchuan Lu14, Hui Li26, Javad Khaghani27, Jesús Bescós11, Jianhua Li14, Jianlong Fu25, Jiaqian Yu28, Jingtao Xu28, Josef Kittler29, Jun Yin, Junhyun Lee30, Kaicheng Yu31, Kaiwen Liu15, Kang Yang32, Kenan Dai14, Li Cheng27, Li Zhang33, Lijun Wang14, Linyuan Wang, Luc Van Gool7, Luca Bertinetto, Matteo Dunnhofer17, Miao Cheng, Mohana Murali Dasari22, Ning Wang32, Pengyu Zhang14, Philip H. S. Torr33, Qiang Wang, Radu Timofte7, Rama Krishna Sai Subrahmanyam Gorthi22, Seokeon Choi34, Seyed Mojtaba Marvasti-Zadeh27, Shaochuan Zhao26, Shohreh Kasaei35, Shoumeng Qiu15, Shuhao Chen14, Thomas B. Schön19, Tianyang Xu29, Wei Lu, Weiming Hu15, Wengang Zhou24, Xi Qiu, Xiao Ke36, Xiaojun Wu26, Xiaolin Zhang15, Xiaoyun Yang, Xue-Feng Zhu26, Yingjie Jiang26, Yingming Wang14, Yiwei Chen28, Yu Ye36, Yuezhou Li36, Yuncon Yao18, Yunsung Lee30, Yuzhang Gu15, Zezhou Wang14, Zhangyong Tang26, Zhen-Hua Feng29, Zhijun Mai37, Zhipeng Zhang15, Zhirong Wu25, Ziang Ma 
23 Aug 2020
TL;DR: A significant novelty is the introduction of a new VOT short-term tracking evaluation methodology and of segmentation ground truth in the VOT-ST2020 challenge – bounding boxes will no longer be used in the VOT-ST challenges.
Abstract: The Visual Object Tracking challenge VOT2020 is the eighth annual tracker benchmarking activity organized by the VOT initiative. Results of 58 trackers are presented; many are state-of-the-art trackers published at major computer vision conferences or in journals in recent years. The VOT2020 challenge was composed of five sub-challenges focusing on different tracking domains: (i) VOT-ST2020 challenge focused on short-term tracking in RGB, (ii) VOT-RT2020 challenge focused on “real-time” short-term tracking in RGB, (iii) VOT-LT2020 focused on long-term tracking namely coping with target disappearance and reappearance, (iv) VOT-RGBT2020 challenge focused on short-term tracking in RGB and thermal imagery and (v) VOT-RGBD2020 challenge focused on long-term tracking in RGB and depth imagery. Only the VOT-ST2020 datasets were refreshed. A significant novelty is the introduction of a new VOT short-term tracking evaluation methodology, and the introduction of segmentation ground truth in the VOT-ST2020 challenge – bounding boxes will no longer be used in the VOT-ST challenges. A new VOT Python toolkit that implements all these novelties was introduced. The performance of the tested trackers typically far exceeds standard baselines. The source code for most of the trackers is publicly available from the VOT page. The dataset, the evaluation kit and the results are publicly available at the challenge website (http://votchallenge.net).

Journal ArticleDOI
TL;DR: In this paper, thermal desorption-proton transfer reaction-mass spectrometry was used to detect polystyrene (PS) in high-altitude snow.
Abstract: We present a new method for chemical characterization of micro- and nanoplastics based on thermal desorption-proton transfer reaction-mass spectrometry. The detection limit for polystyrene (PS) obtained is <1 ng of the compound present in a sample, which results in 100 times better sensitivity than those of previously reported by other methods. This allows us to use small volumes of samples (1 mL) and to carry out experiments without a preconcentration step. Unique features in the high-resolution mass spectrum of different plastic polymers make this approach suitable for fingerprinting, even when the samples contain mixtures of other organic compounds. Accordingly, we got a positive fingerprint of PS when just 10 ng of the polymer was present within the dissolved organic matter of snow. Multiple types of microplastics (polyethylene terephthalate (PET), polyvinyl chloride, and polypropylene carbonate), were identified in a snowpit from the Austrian Alps; however, only PET was detected in the nanometer range for both snowpit and surface snow samples. This is in accordance with other publications showing that the dominant form of airborne microplastics is PET fibers. The presence of nanoplastics in high-altitude snow indicates airborne transport of plastic pollution with environmental and health consequences yet to be understood.

Journal ArticleDOI
TL;DR: In this article, the authors present a community effort to develop good practice guidelines for the validation of global coarse-scale satellite soil moisture products and provide theoretical background, a review of state-of-the-art methodologies for estimating errors in soil moisture data sets, practical recommendations on data pre-processing and presentation of statistical results, and a recommended validation protocol that is supplemented with an example validation exercise focused on microwave-based surface soil moisture product.

Journal ArticleDOI
TL;DR: Efficient sub-cycle THz pulse generation using two-color mid-infrared femtosecond laser filaments in ambient air, with affordable table-top laser systems, is experimentally demonstrated.
Abstract: Extreme nonlinear interactions of THz electromagnetic fields with matter are the next frontier in nonlinear optics. However, reaching this frontier in free space is limited by the existing lack of appropriate powerful THz sources. Here, we experimentally demonstrate that two-color filamentation of femtosecond mid-infrared laser pulses at 3.9 μm allows one to generate ultrashort sub-cycle THz pulses with sub-millijoule energy and a THz conversion efficiency of 2.36%, resulting in THz field amplitudes above 100 MV cm−1. Our numerical simulations predict that the observed THz yield can be significantly upscaled by further optimizing the experimental setup. Finally, in order to demonstrate the strength of our THz source, we show that the generated THz pulses are powerful enough to induce nonlinear cross-phase modulation in electro-optic crystals. Our work paves the way toward free space extreme nonlinear THz optics using affordable table-top laser systems. Powerful terahertz pulses are generated during the nonlinear propagation of ultrashort laser pulses in gases. Here, the authors demonstrate efficient sub-cycle THz pulse generation by using two-color mid-infrared femtosecond laser filaments in ambient air.
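As a rough order-of-magnitude companion to the quoted field strength (standard plane-wave relation, our arithmetic rather than a number from the paper), a THz field amplitude of 100 MV cm−1 corresponds to a peak intensity of about

    I = \tfrac{1}{2}\,c\,\varepsilon_0\,E^2 \approx \tfrac{1}{2}\,(3\times10^{8}\ \mathrm{m\,s^{-1}})(8.9\times10^{-12}\ \mathrm{F\,m^{-1}})(10^{10}\ \mathrm{V\,m^{-1}})^{2} \approx 1.3\times10^{17}\ \mathrm{W\,m^{-2}} \approx 1.3\times10^{13}\ \mathrm{W\,cm^{-2}},

which is well into the regime where nonlinear effects such as the reported cross-phase modulation become accessible.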

Journal ArticleDOI
TL;DR: In this paper, a unified treatment of conformally soft Goldstone modes which arise when spin-one or spin-two conformal primary wavefunctions become pure gauge for certain integer values of the conformal dimension ∆ is provided.
Abstract: We provide a unified treatment of conformally soft Goldstone modes which arise when spin-one or spin-two conformal primary wavefunctions become pure gauge for certain integer values of the conformal dimension ∆. This effort lands us at the crossroads of two ongoing debates about what the appropriate conformal basis for celestial CFT is and what the asymptotic symmetry group of Einstein gravity at null infinity should be. Finite energy wavefunctions are captured by the principal continuous series ∆ ∈ 1 + iℝ and form a complete basis. We show that conformal primaries with analytically continued conformal dimension can be understood as certain contour integrals on the principal series. This clarifies how conformally soft Goldstone modes fit in but do not augment this basis. Conformally soft gravitons of dimension two and zero which are related by a shadow transform are shown to generate superrotations and non-meromorphic diffeomorphisms of the celestial sphere which we refer to as shadow superrotations. This dovetails the Virasoro and Diff(S2) asymptotic symmetry proposals and puts on equal footing the discussion of their associated soft charges, which correspond to the stress tensor and its shadow in the two-dimensional celestial CFT.

Journal ArticleDOI
TL;DR: This article gathers a group of international experts, members of the NEREUS COST Action ES1403, who for three years have been constructively discussing the efficiency of the best available technologies (BATs) for urban wastewater treatment to abate CECs and ARB&ARGs.

Book ChapterDOI
10 Feb 2020
TL;DR: In this article, the authors examine layer-two protocols, which are built on top of (layer-one) blockchains and avoid disseminating every transaction to the whole network by exchanging authenticated transactions off-chain.
Abstract: Blockchains have the potential to revolutionize markets and services. However, they currently exhibit high latencies and fail to handle transaction loads comparable to those managed by traditional financial systems. Layer-two protocols, built on top of (layer-one) blockchains, avoid disseminating every transaction to the whole network by exchanging authenticated transactions off-chain. Instead, they utilize the expensive and low-rate blockchain only as a recourse for disputes. The promise of layer-two protocols is to complete off-chain transactions in sub-seconds rather than minutes or hours while retaining asset security, reducing fees and allowing blockchains to scale.
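To make the off-chain idea concrete, below is a deliberately simplified, generic payment-channel sketch (illustrative only, not a construction from this chapter; real protocols add digital signatures, timelocks and dispute windows). The two parties exchange authenticated balance updates off-chain, and only the latest state would ever be submitted to the blockchain in case of a dispute.

    from dataclasses import dataclass

    @dataclass
    class ChannelState:
        version: int        # monotonically increasing; the highest version wins a dispute
        balance_a: int      # off-chain balance of party A
        balance_b: int      # off-chain balance of party B
        sig_a: str = ""     # placeholders for both parties' signatures over the state
        sig_b: str = ""

    def pay(state: ChannelState, amount: int, a_pays_b: bool) -> ChannelState:
        """Create the next authenticated off-chain state; nothing touches the chain."""
        da = -amount if a_pays_b else amount
        new = ChannelState(state.version + 1, state.balance_a + da, state.balance_b - da)
        new.sig_a, new.sig_b = "sig(A)", "sig(B)"   # in reality: digital signatures, not strings
        return new

    # Open with 5/5 coins locked on-chain, then transact off-chain at full speed.
    s = ChannelState(0, 5, 5, "sig(A)", "sig(B)")
    s = pay(s, 2, a_pays_b=True)       # A -> B: 2
    s = pay(s, 1, a_pays_b=False)      # B -> A: 1
    print(s)                           # only this latest state matters if a dispute goes on-chain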

Journal ArticleDOI
TL;DR: A specific hierarchical coding scheme for NPIs is developed and a comprehensive structured dataset of government interventions and their respective timelines of implementation is generated; information sources are shared via an open library to improve transparency and motivate a collaborative validation process.
Abstract: In response to the COVID-19 pandemic, governments have implemented a wide range of non-pharmaceutical interventions (NPIs). Monitoring and documenting government strategies during the COVID-19 crisis is crucial to understand the progression of the epidemic. Following a content analysis strategy of existing public information sources, we developed a specific hierarchical coding scheme for NPIs. We generated a comprehensive structured dataset of government interventions and their respective timelines of implementation. To improve transparency and motivate a collaborative validation process, information sources are shared via an open library. We also provide codes that enable users to visualise the dataset. Standardization and structure of the dataset facilitate inter-country comparison and the assessment of the impacts of different NPI categories on the epidemic parameters, population health indicators, the economy, and human rights, among others. This dataset provides an in-depth insight into government strategies and can be a valuable tool for developing relevant preparedness plans for future pandemics. We intend to further develop and update this dataset until the end of December 2020.
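A minimal sketch of how such a structured NPI dataset could be consumed programmatically (file name and column names below are hypothetical placeholders, not the authors' actual schema):

    import pandas as pd

    # Hypothetical file and columns; the real dataset's schema may differ.
    npi = pd.read_csv("npi_dataset.csv", parse_dates=["date"])

    # Example query: count recorded interventions per country and per top-level category.
    summary = (npi.groupby(["country", "category_level_1"])
                  .size()
                  .rename("n_interventions")
                  .reset_index())
    print(summary.head())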

Journal ArticleDOI
10 Jul 2020
TL;DR: A large-scale density-functional theory study on the influence of the exchange-correlation functional in the calculation of electronic band gaps of solids confirms that mBJ, HLE16 and HSE06 are the most accurate functionals for band gap calculations, and reveals several other interesting functionals, chief among which are the local Slater potential approximation, the GGA AK13LDA, and the meta-GGAs HLE17 and TASK.
Abstract: We conducted a large-scale density-functional theory study on the influence of the exchange-correlation functional in the calculation of electronic band gaps of solids. First, we use the large materials data set that we have recently proposed to benchmark 21 different functionals, with a particular focus on approximations of the meta-generalized-gradient family. Combining these data with the results for 12 functionals in our previous work, we can analyze in detail the characteristics of each approximation and identify its strong and/or weak points. Beside confirming that mBJ, HLE16 and HSE06 are the most accurate functionals for band gap calculations, we reveal several other interesting functionals, chief among which are the local Slater potential approximation, the GGA AK13LDA, and the meta-GGAs HLE17 and TASK. We also compare the computational efficiency of these different approximations. Relying on these data, we investigate the potential for improvement of a promising subset of functionals by varying their internal parameters. The identified optimal parameters yield a family of functionals fitted for the calculation of band gaps. Finally, we demonstrate how to train machine learning models for accurate band gap prediction, using as input structural and composition data, as well as approximate band gaps obtained from density-functional theory.
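The final step described above — training a machine-learning model that predicts band gaps from composition/structure features together with a cheap approximate DFT gap — can be sketched generically as follows (synthetic toy data and feature choices are ours, not the authors' pipeline):

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in data: a cheap DFT gap plus placeholder composition/structure descriptors.
    rng = np.random.default_rng(0)
    n = 500
    cheap_gap = rng.uniform(0, 5, n)                            # e.g. semilocal-functional gap in eV
    features = np.column_stack([cheap_gap, rng.random((n, 5))]) # descriptors (placeholders)
    target_gap = cheap_gap * 1.3 + 0.4 + rng.normal(0, 0.2, n)  # synthetic "reference" gaps

    X_tr, X_te, y_tr, y_te = train_test_split(features, target_gap, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("MAE [eV]:", np.mean(np.abs(model.predict(X_te) - y_te)))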

Journal ArticleDOI
22 Jul 2020-Nature
TL;DR: It is shown that the past three decades were among the most flood-rich periods in Europe in the past 500 years, and that this period differs from other flood-rich periods in terms of its extent, air temperatures and flood seasonality.
Abstract: There are concerns that recent climate change is altering the frequency and magnitude of river floods in an unprecedented way1. Historical studies have identified flood-rich periods in the past half millennium in various regions of Europe2. However, because of the low temporal resolution of existing datasets and the relatively low number of series, it has remained unclear whether Europe is currently in a flood-rich period from a long-term perspective. Here we analyse how recent decades compare with the flood history of Europe, using a new database composed of more than 100 high-resolution (sub-annual) historical flood series based on documentary evidence covering all major regions of Europe. We show that the past three decades were among the most flood-rich periods in Europe in the past 500 years, and that this period differs from other flood-rich periods in terms of its extent, air temperatures and flood seasonality. We identified nine flood-rich periods and associated regions. Among the periods richest in floods are 1560–1580 (western and central Europe), 1760–1800 (most of Europe), 1840–1870 (western and southern Europe) and 1990–2016 (western and central Europe). In most parts of Europe, previous flood-rich periods occurred during cooler-than-usual phases, but the current flood-rich period has been much warmer. Flood seasonality is also more pronounced in the recent period. For example, during previous flood and interflood periods, 41 per cent and 42 per cent of central European floods occurred in summer, respectively, compared with 55 per cent of floods in the recent period. The exceptional nature of the present-day flood-rich period calls for process-based tools for flood-risk assessment that capture the physical mechanisms involved, and management strategies that can incorporate the recent changes in risk. Analysis of thousands of historical documents recording floods in Europe shows that flooding characteristics in recent decades are unlike those of previous centuries.


Posted Content
TL;DR: This work proposes a cross-architecture training procedure with a margin-focused loss (Margin-MSE) that adapts knowledge distillation to the varying score output distributions of different BERT and non-BERT ranking architectures, and shows that across evaluated architectures it significantly improves their effectiveness without compromising their efficiency.
Abstract: Retrieval and ranking models are the backbone of many applications such as web search, open domain QA, or text-based recommender systems. The latency of neural ranking models at query time is largely dependent on the architecture and deliberate choices by their designers to trade-off effectiveness for higher efficiency. This focus on low query latency of a rising number of efficient ranking architectures makes them feasible for production deployment. In machine learning, an increasingly common approach to close the effectiveness gap of more efficient models is to apply knowledge distillation from a large teacher model to a smaller student model. We find that different ranking architectures tend to produce output scores in different magnitudes. Based on this finding, we propose a cross-architecture training procedure with a margin-focused loss (Margin-MSE) that adapts knowledge distillation to the varying score output distributions of different BERT and non-BERT passage ranking architectures. We apply the teachable information as additional fine-grained labels to existing training triples of the MSMARCO-Passage collection. We evaluate our procedure of distilling knowledge from state-of-the-art concatenated BERT models to four different efficient architectures (TK, ColBERT, PreTT, and a BERT CLS dot product model). We show that across our evaluated architectures our Margin-MSE knowledge distillation significantly improves re-ranking effectiveness without compromising their efficiency. Additionally, we show our general distillation method to improve nearest neighbor based index retrieval with the BERT dot product model, offering competitive results with specialized and much more costly training methods. To benefit the community, we publish the teacher-score training files in a ready-to-use package.
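The Margin-MSE idea can be stated compactly: for each training triple (query, relevant passage, non-relevant passage), the student model is optimized so that its score margin matches the teacher's score margin. A minimal PyTorch-style sketch of that loss (our paraphrase; tensor names are illustrative):

    import torch
    import torch.nn.functional as F

    def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
        """MSE between the student's and the teacher's score margins over (query, pos, neg) triples."""
        return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)

    # Toy usage: a batch of 4 triples with raw ranking scores.
    s_pos = torch.randn(4, requires_grad=True)
    s_neg = torch.randn(4, requires_grad=True)
    t_pos, t_neg = torch.randn(4), torch.randn(4)   # teacher scores act as precomputed labels
    loss = margin_mse_loss(s_pos, s_neg, t_pos, t_neg)
    loss.backward()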

Journal ArticleDOI
TL;DR: In this paper, a simple model for the time evolution of droplet/aerosol concentration is presented based on a theoretical analysis of the relevant physical processes; it can be used to study a wide variety of scenarios involving breathing, talking, coughing and sneezing, under a range of environmental conditions such as a humid or dry atmosphere and a confined or open environment.
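The paper itself derives its model from the relevant physical processes; as a purely generic illustration of what a simple droplet/aerosol concentration model can look like, a well-mixed-room balance with a constant emission source and first-order removal, dC/dt = S/V − λC, is often used. A short numerical sketch under these assumed (not paper-specific) parameters:

    import numpy as np

    # Assumed, generic parameters (not values from the paper).
    V = 50.0      # room volume [m^3]
    S = 1e4       # aerosol emission rate while talking [particles per hour]
    lam = 3.0     # total first-order removal rate [1/h]: ventilation + settling + inactivation

    t = np.linspace(0, 2, 200)                    # two hours
    C = (S / (V * lam)) * (1 - np.exp(-lam * t))  # analytic solution with C(0) = 0
    print(f"steady-state concentration: {S / (V * lam):.1f} particles/m^3")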

Journal ArticleDOI
06 May 2020-Neuron
TL;DR: The state of the art of tissue-clearing methods and light-sheet microscopy is reviewed and applications of these techniques in profiling cells and circuits in mice are discussed, to provide a systems-level understanding of the physiology and pathology of the central nervous system.

Journal ArticleDOI
TL;DR: The International Celestial Reference Frame (ICRF) 3 as discussed by the authors is based on the work achieved by a working group of the International Astronomical Union (IAU) mandated for this purpose.
Abstract: A new realization of the International Celestial Reference Frame (ICRF) is presented based on the work achieved by a working group of the International Astronomical Union (IAU) mandated for this purpose. This new realization follows the initial realization of the ICRF completed in 1997 and its successor, ICRF2, adopted as a replacement in 2009. The new frame, referred to as ICRF3, is based on nearly 40 years of data acquired by very long baseline interferometry at the standard geodetic and astrometric radio frequencies (8.4 and 2.3 GHz), supplemented with data collected at higher radio frequencies (24 GHz and dual-frequency 32 and 8.4 GHz) over the past 15 years. State-of-the-art astronomical and geophysical modeling has been used to analyze these data and derive source positions. The modeling integrates, for the first time, the effect of the galactocentric acceleration of the solar system (directly estimated from the data) which, if not considered, induces significant deformation of the frame due to the data span. The new frame includes positions at 8.4 GHz for 4536 extragalactic sources. Of these, 303 sources, uniformly distributed on the sky, are identified as “defining sources” and as such serve to define the axes of the frame. Positions at 8.4 GHz are supplemented with positions at 24 GHz for 824 sources and at 32 GHz for 678 sources. In all, ICRF3 comprises 4588 sources, with three-frequency positions available for 600 of these. Source positions have been determined independently at each of the frequencies in order to preserve the underlying astrophysical content behind such positions. They are reported for epoch 2015.0 and must be propagated for observations at other epochs for the most accurate needs, accounting for the acceleration toward the Galactic center, which results in a dipolar proper motion field of amplitude 0.0058 milliarcsecond yr−1 (mas yr−1 ). The frame is aligned onto the International Celestial Reference System to within the accuracy of ICRF2 and shows a median positional uncertainty of about 0.1 mas in right ascension and 0.2 mas in declination, with a noise floor of 0.03 mas in the individual source coordinates. A subset of 500 sources is found to have extremely accurate positions, in the range of 0.03–0.06 mas, at the traditional 8.4 GHz frequency. Comparing ICRF3 with the recently released Gaia Celestial Reference Frame 2 in the optical domain, there is no evidence for deformations larger than 0.03 mas between the two frames, in agreement with the ICRF3 noise level. Significant positional offsets between the three ICRF3 frequencies are detected for about 5% of the sources. Moreover, a notable fraction (22%) of the sources shows optical and radio positions that are significantly offset. There are indications that these positional offsets may be the manifestation of extended source structures. This third realization of the ICRF was adopted by the IAU at its 30th General Assembly in August 2018 and replaced the previous realization, ICRF2, on January 1, 2019.
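For context, the quoted dipole amplitude is the solar-system barycenter's acceleration expressed as a secular aberration drift, via the standard relation μ ≈ a/c (numbers below are rounded by us, not additional results from the paper):

    \mu \approx \frac{a_\odot}{c}, \qquad \mu = 5.8\ \mu\mathrm{as\,yr^{-1}} \approx 2.8\times10^{-11}\ \mathrm{rad\,yr^{-1}} \;\Rightarrow\; a_\odot \approx 2.8\times10^{-11}\times\left(3\times10^{8}\ \mathrm{m\,s^{-1}}\right)\ \mathrm{yr^{-1}} \approx 2.7\times10^{-10}\ \mathrm{m\,s^{-2}},

of the same order as the acceleration expected from the solar orbit about the Galactic center.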

Journal ArticleDOI
TL;DR: In this paper, a new series of long-term vegetation optical depth (VOD) products, VODCA, is presented, which combines VOD retrievals from multiple sensors (SSM/I, TMI, AMSR-E, WindSat, and AMSR2) using the Land Parameter Retrieval Model.
Abstract: . Since the late 1970s, space-borne microwave radiometers have been providing measurements of radiation emitted by the Earth’s surface. From these measurements it is possible to derive vegetation optical depth (VOD), a model-based indicator related to the density, biomass, and water content of vegetation. Because of its high temporal resolution and long availability, VOD can be used to monitor short- to long-term changes in vegetation. However, studying long-term VOD dynamics is generally hampered by the relatively short time span covered by the individual microwave sensors. This can potentially be overcome by merging multiple VOD products into a single climate data record. However, combining multiple sensors into a single product is challenging as systematic differences between input products like biases, different temporal and spatial resolutions, and coverage need to be overcome. Here, we present a new series of long-term VOD products, the VOD Climate Archive (VODCA). VODCA combines VOD retrievals that have been derived from multiple sensors (SSM/I, TMI, AMSR-E, WindSat, and AMSR2) using the Land Parameter Retrieval Model. We produce separate VOD products for microwave observations in different spectral bands, namely the Ku-band (period 1987–2017), X-band (1997–2018), and C-band (2002–2018). In this way, our multi-band VOD products preserve the unique characteristics of each frequency with respect to the structural elements of the canopy. Our merging approach builds on an existing approach that is used to merge satellite products of surface soil moisture: first, the data sets are co-calibrated via cumulative distribution function matching using AMSR-E as the scaling reference. To do so, we apply a new matching technique that scales outliers more robustly than ordinary piecewise linear interpolation. Second, we aggregate the data sets by taking the arithmetic mean between temporally overlapping observations of the scaled data. The characteristics of VODCA are assessed for self-consistency and against other products. Using an autocorrelation analysis, we show that the merging of the multiple data sets successfully reduces the random error compared to the input data sets. Spatio-temporal patterns and anomalies of the merged products show consistency between frequencies and with leaf area index observations from the MODIS instrument as well as with Vegetation Continuous Fields from the AVHRR instruments. Long-term trends in Ku-band VODCA show that since 1987 there has been a decline in VOD in the tropics and in large parts of east-central and north Asia, while a substantial increase is observed in India, large parts of Australia, southern Africa, southeastern China, and central North America. In summary, VODCA shows vast potential for monitoring spatial–temporal ecosystem changes as it is sensitive to vegetation water content and unaffected by cloud cover or high sun zenith angles. As such, it complements existing long-term optical indices of greenness and leaf area. The VODCA products ( Moesinger et al. , 2019 ) are open access and available under Attribution 4.0 International at https://doi.org/10.5281/zenodo.2575599 .

Journal ArticleDOI
TL;DR: In this article, a review of multi-domain approaches to indoor-environmental perception and behaviour is presented, highlighting motivational backgrounds, key methodologies, and major findings on human perception and behaviour in indoor environments.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the "drought paradox" for the European Alps using a 1,212-station database and hyper-resolution ecohydrological simulations to quantify blue (runoff) and green (evapotranspiration) water fluxes.
Abstract: Climate change can reduce surface-water supply by enhancing evapotranspiration in forested mountains, especially during heatwaves. We investigate this ‘drought paradox’ for the European Alps using a 1,212-station database and hyper-resolution ecohydrological simulations to quantify blue (runoff) and green (evapotranspiration) water fluxes. During the 2003 heatwave, evapotranspiration in large areas over the Alps was above average despite low precipitation, amplifying the runoff deficit by 32% in the most runoff-productive areas (1,300–3,000 m above sea level). A 3 °C air temperature increase could enhance annual evapotranspiration by up to 100 mm (45 mm on average), which would reduce annual runoff at a rate similar to a 3% precipitation decrease. This suggests that green-water feedbacks—which are often poorly represented in large-scale model simulations—pose an additional threat to water resources, especially in dry summers. Despite uncertainty in the validation of the hyper-resolution ecohydrological modelling with observations, this approach permits more realistic predictions of mountain region water availability. Mountain forest drought can paradoxically increase evapotranspiration (green water), helping vegetation at the expense of runoff (blue water). This is quantified for the 2003 event in the European Alps, highlighting underappreciated vulnerability of blue-water resources to future warmer summers.

Journal ArticleDOI
TL;DR: This work provides an up-to-date survey of hardware architecture research for DNNs, especially covering prominent works from the last 3 years, including the latest techniques in the fields of dataflow, reconfigurability, variable bit-width, and sparsity.
Abstract: Deep Neural Networks (DNNs) are nowadays common practice in most Artificial Intelligence (AI) applications. Their ability to go beyond human precision has made these networks a milestone in the history of AI. However, while on the one hand they present cutting-edge performance, on the other hand they require enormous computing power. For this reason, numerous optimization techniques at the hardware and software level, and specialized architectures, have been developed to process these models with high performance and power/energy efficiency without affecting their accuracy. In the past, multiple surveys have been reported to provide an overview of different architectures and optimization techniques for efficient execution of Deep Learning (DL) algorithms. This work provides an up-to-date survey, especially covering the prominent works from the last 3 years of hardware architecture research for DNNs. In this paper, the reader will first understand what a hardware accelerator is and what its main components are, followed by the latest techniques in the fields of dataflow, reconfigurability, variable bit-width, and sparsity.
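Among the techniques listed above, variable bit-width ultimately comes down to uniform quantization under a per-layer or per-tensor bit budget. A minimal generic sketch of the accuracy/precision trade-off (not tied to any specific accelerator in the survey):

    import numpy as np

    def quantize(x, bits):
        """Uniform symmetric quantization of a tensor to `bits` bits (returns the de-quantized values)."""
        qmax = 2 ** (bits - 1) - 1
        scale = np.max(np.abs(x)) / qmax
        return np.round(x / scale).clip(-qmax, qmax) * scale

    w = np.random.default_rng(0).normal(size=1000)   # stand-in for a layer's weight tensor
    for b in (8, 4, 2):                              # fewer bits -> cheaper hardware, larger error
        err = np.mean((w - quantize(w, b)) ** 2)
        print(f"{b}-bit weights, MSE = {err:.5f}")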