Institution
Technical University of Berlin
Education • Berlin, Germany
About: Technical University of Berlin is an education organization based in Berlin, Germany. It is known for its research contributions in the topics of Laser & Catalysis. The organization has 27292 authors who have published 59342 publications receiving 1414623 citations. The organization is also known as Technische Universität Berlin and TU Berlin.
Topics: Laser, Catalysis, Quantum dot, Computer science, Context (language use)
Papers published on a yearly basis
Papers
TL;DR: The results revealed that the initial size of the urban heat island had a significant influence on SUHIM, and that cities in cooler climates and cities with higher shares of urban green spaces were more affected by additional heat during heat waves.
323 citations
TL;DR: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica.
Abstract: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica. It is most abundant in urban habitats and along transportation corridors, but can also invade natural habitats. This paper reviews the literature on the morphology, distribution, ecology, habitat requirements, population biology, genetics, physiology, impacts, management and uses of this species.
323 citations
TL;DR: In this paper, the Lazarus theory is used as a basis for developing a model to explain how leadership affects cognitive processes of perceiving the work setting (need for and susceptibility to change), innovative behaviors (generation and testing of ideas, and implementation), and innovation-blocking behaviors (intrapsychic coping and flight).
Abstract: The Lazarus theory, which has been adapted to the context of innovation, is used as a basis for developing a model to explain how leadership affects cognitive processes of perceiving the work setting (need for and susceptibility to change), innovative behaviors (generation and testing of ideas, and implementation), and innovation-blocking behaviors (intrapsychic coping and flight). Leadership is described in terms of selected bases of influence (identification, expert knowledge/information, granting freedom and autonomy, support for innovation, and openness of the decision-making process). The model's explanatory power is tested on a sample of 399 middle managers from different German organizations of various sizes and sectors. Hierarchical regression analyses show that granting freedom and autonomy and using expert knowledge and information have the most positive effect on these cognitive processes and innovative behaviors, and the most negative effect on innovation-blocking behaviors.
323 citations
28 Nov 2017
TL;DR: The authors introduce Probability Density Distillation, a new method for training a parallel feed-forward network from a trained WaveNet with no significant difference in quality. The resulting system generates high-fidelity speech samples more than 20 times faster than real time and is deployed online by Google Assistant.
Abstract: The recently developed WaveNet architecture is the current state of the art in realistic speech synthesis, consistently rated as more natural sounding for many different languages than any previous system. However, because WaveNet relies on sequential generation of one audio sample at a time, it is poorly suited to today's massively parallel computers, and therefore hard to deploy in a real-time production setting. This paper introduces Probability Density Distillation, a new method for training a parallel feed-forward network from a trained WaveNet with no significant difference in quality. The resulting system is capable of generating high-fidelity speech samples more than 20 times faster than real time, and is deployed online by Google Assistant, including serving multiple English and Japanese voices.
323 citations
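The core idea of the distillation above can be illustrated in a toy setting: draw samples from a tractable "student" density and fit it by minimizing the Monte Carlo estimate of KL(student ‖ teacher). The sketch below is not the WaveNet method itself — the teacher is a fixed 1-D Gaussian standing in for the trained autoregressive model, the student a Gaussian whose mean we fit by grid search, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a fixed Gaussian "teacher" (playing the role of
# the trained WaveNet) and a Gaussian "student" whose mean is fitted
# (playing the role of the parallel feed-forward network).
def log_density(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

teacher_mu, teacher_sigma = 2.0, 1.0
student_sigma = 1.0

def kl_estimate(student_mu, n=20000):
    # Monte Carlo estimate of KL(student || teacher): sample from the
    # student and score the samples under both densities.
    x = rng.normal(student_mu, student_sigma, size=n)
    return np.mean(log_density(x, student_mu, student_sigma)
                   - log_density(x, teacher_mu, teacher_sigma))

# Crude "training": pick the student mean that minimizes the estimated KL.
grid = np.linspace(-1.0, 5.0, 61)
best_mu = grid[np.argmin([kl_estimate(m) for m in grid])]
print(round(best_mu, 1))  # close to the teacher mean, 2.0
```

The sampling direction matters: scoring the *student's own* samples under the teacher is what lets a feed-forward sampler be trained against a density-evaluating teacher, which is the property the paper exploits.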
TL;DR: This article shows how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation.
Abstract: Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical $r$-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 5 (2009), p. 706], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, giving what is to our knowledge the first stable generic algorithm for the treatment of optimization tasks in the tensor format. For the examples of linear equations and eigenvalue equations, we derive concrete working equations for the micro-iteration steps; numerical examples confirm the theoretical results concerning the stability of the TT decomposition and of ALS and MALS, but also show that in some cases high TT ranks are required during the iterative approximation of low-rank tensors, indicating some potential for improvement.
323 citations
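The alternating scheme underlying the ALS algorithm in the abstract above can be shown in its simplest setting: fitting a rank-1 factorization A ≈ u vᵀ by repeatedly solving the least-squares problem for one factor with the other held fixed. This is a minimal sketch of the alternating principle only, not the TT-ALS algorithm itself, which applies the same idea core-by-core along a tensor train.

```python
import numpy as np

rng = np.random.default_rng(0)

# An exactly rank-1 target matrix, so ALS can recover it exactly.
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0])
u = rng.normal(size=3)  # random initial factors
v = rng.normal(size=2)

for _ in range(20):
    # With v fixed, the minimizer of ||A - u v^T||_F is u = A v / (v·v).
    u = A @ v / (v @ v)
    # With u fixed, symmetrically: v = A^T u / (u·u).
    v = A.T @ u / (u @ u)

print(np.allclose(np.outer(u, v), A))  # True: the factorization is exact
```

Each half-step is a linear least-squares problem with a closed-form solution, which is why the micro-iterations of the full TT algorithm inherit the structure of the original problem.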
Authors
Showing all 27602 results
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Markus Antonietti | 176 | 1068 | 127235 |
| Jian Li | 133 | 2863 | 87131 |
| Klaus-Robert Müller | 129 | 764 | 79391 |
| Michael Wagner | 124 | 351 | 54251 |
| Shi Xue Dou | 122 | 2028 | 74031 |
| Xinchen Wang | 120 | 349 | 65072 |
| Michael S. Feld | 119 | 552 | 51968 |
| Jian Liu | 117 | 2090 | 73156 |
| Ary A. Hoffmann | 113 | 907 | 55354 |
| Stefan Grimme | 113 | 680 | 105087 |
| David M. Karl | 112 | 461 | 48702 |
| Lester Packer | 112 | 751 | 63116 |
| Andreas Heinz | 108 | 1078 | 45002 |
| Horst Weller | 105 | 451 | 44273 |
| G. Hughes | 103 | 957 | 46632 |