Institution

Technical University of Berlin

Education · Berlin, Germany
About: Technical University of Berlin is an education organization based in Berlin, Germany. It is known for its research contributions in the topics of Laser and Catalysis. The organization has 27,292 authors who have published 59,342 publications, receiving 1,414,623 citations. The organization is also known as Technische Universität Berlin and TU Berlin.


Papers
Journal Article · DOI
TL;DR: The results revealed that the initial size of the urban heat island had a significant influence on SUHIM, and that cities in cooler climates and cities with higher shares of urban green space were more affected by additional heat during heat waves.

323 citations

Journal Article · DOI
TL;DR: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica.
Abstract: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica. It is most abundant in urban habitats and along transportation corridors, but can also invade natural habitats. This paper reviews the literature on the morphology, distribution, ecology, habitat requirements, population biology, genetics, physiology, impacts, management and uses of this species.

323 citations

Journal Article · DOI
TL;DR: In this paper, the Lazarus theory is used as a basis for developing a model to explain how leadership affects cognitive processes of perceiving the work setting (need for and susceptibility to change), innovative behaviors (generation and testing of ideas, and implementation), and innovation-blocking behaviors (intrapsychic coping and flight).
Abstract: The Lazarus theory, which has been adapted to the context of innovation, is used as a basis for developing a model to explain how leadership affects cognitive processes of perceiving the work setting (need for and susceptibility to change), innovative behaviors (generation and testing of ideas, and implementation), and innovation-blocking behaviors (intrapsychic coping and flight). Leadership is described in terms of selected bases of influence (identification, expert knowledge/information, granting freedom and autonomy, support for innovation, and openness of the decision-making process). The model's explanatory power is tested on a sample of 399 middle managers from different German organizations of various sizes and sectors. Hierarchical regression analyses show that granting freedom and autonomy and using expert knowledge and information have the most positive effect on these cognitive processes and innovative behaviors, and the most negative effect on innovation-blocking behaviors.

323 citations
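
As an illustration of the hierarchical (blockwise) regression approach mentioned in the abstract above, the following is a minimal Python sketch using simulated data: a control block is entered first, then a leadership-influence block, and the increment in R² indicates how much the second block adds. The variable names, blocks, and data are illustrative assumptions, not the study's actual items or results.

```python
# Hedged sketch of hierarchical (blockwise) regression: predictor blocks are
# entered in steps and the increase in R^2 shows what each block adds beyond
# the previous ones. Data and variable names are simulated/illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 399  # sample size matching the study's middle-manager sample

# Step-1 block: hypothetical controls (e.g. organization size, sector dummies).
controls = rng.standard_normal((n, 2))
# Step-2 block: hypothetical leadership-influence measures
# (e.g. autonomy granted, expert knowledge, support for innovation).
leadership = rng.standard_normal((n, 3))

# Simulated outcome (e.g. an innovative-behavior score).
y = (0.4 * leadership[:, 0] + 0.3 * leadership[:, 1]
     + 0.1 * controls[:, 0] + rng.standard_normal(n))

def fit_r2(X):
    """Fit an OLS model with intercept and return its R^2."""
    return sm.OLS(y, sm.add_constant(X)).fit().rsquared

r2_step1 = fit_r2(controls)
r2_step2 = fit_r2(np.column_stack([controls, leadership]))

print(f"Step 1 (controls only)       R^2 = {r2_step1:.3f}")
print(f"Step 2 (+ leadership block)  R^2 = {r2_step2:.3f}")
print(f"Delta R^2 from leadership    = {r2_step2 - r2_step1:.3f}")
```

In the study itself, the influence bases would enter as their own block in the same way, with the coefficients in the final step indicating the direction of each effect on the cognitive, innovative, and innovation-blocking outcomes.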

Proceedings Article
28 Nov 2017
TL;DR: The authors introduced Probability Density Distillation, a new method for training a parallel feed-forward network from a trained WaveNet with no significant difference in quality, which is capable of generating high-fidelity speech samples more than 20 times faster than real time and is deployed online by Google Assistant.
Abstract: The recently developed WaveNet architecture is the current state of the art in realistic speech synthesis, consistently rated as more natural sounding for many different languages than any previous system. However, because WaveNet relies on sequential generation of one audio sample at a time, it is poorly suited to today's massively parallel computers, and therefore hard to deploy in a real-time production setting. This paper introduces Probability Density Distillation, a new method for training a parallel feed-forward network from a trained WaveNet with no significant difference in quality. The resulting system is capable of generating high-fidelity speech samples more than 20 times faster than real time, and is deployed online by Google Assistant, where it serves multiple English and Japanese voices.

323 citations
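
As a rough sketch of the Probability Density Distillation objective described above (not the authors' implementation), the student network is trained to minimize a Monte Carlo estimate of KL(student || teacher) using reparameterized samples drawn from its own output distribution. The stand-in Gaussian teacher and student below are assumptions that keep the example self-contained; the real system uses a frozen autoregressive WaveNet teacher and a parallel inverse-autoregressive-flow student.

```python
# Hedged sketch of the Probability Density Distillation objective:
# minimize D_KL(q_student || p_teacher) = E_{x~q}[log q(x) - log p(x)],
# estimated with reparameterized samples so gradients flow to the student.
# Both distributions are toy per-timestep Gaussians, not real WaveNet outputs.
import torch
from torch.distributions import Normal

torch.manual_seed(0)
num_samples = 16_000  # one second of 16 kHz audio, purely illustrative

# Stand-in "teacher": frozen per-timestep output distribution of a trained WaveNet.
teacher = Normal(loc=torch.zeros(num_samples),
                 scale=torch.full((num_samples,), 0.5))

# Stand-in "student": a parallel feed-forward model with learnable loc/scale.
student_loc = torch.zeros(num_samples, requires_grad=True)
student_log_scale = torch.zeros(num_samples, requires_grad=True)
optimizer = torch.optim.Adam([student_loc, student_log_scale], lr=1e-2)

for step in range(200):
    student = Normal(loc=student_loc, scale=student_log_scale.exp())
    # rsample() keeps the draw differentiable w.r.t. the student parameters.
    x = student.rsample()
    # Monte Carlo KL estimate, averaged over timesteps.
    kl = (student.log_prob(x) - teacher.log_prob(x)).mean()
    optimizer.zero_grad()
    kl.backward()
    optimizer.step()

print(f"final KL estimate: {kl.item():.4f}")
```

Because the samples come from the student rather than from teacher-forced ground truth, generation remains fully parallel at inference time, which is the point of the distillation.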

Journal Article · DOI
TL;DR: This article shows how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation.
Abstract: Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical $r$-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 15 (2009), p. 706], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, giving what is, to our knowledge, the first stable generic algorithm for the treatment of optimization tasks in the tensor format. For the examples of linear equations and eigenvalue equations, we derive concrete working equations for the micro-iteration steps; numerical examples confirm the theoretical results concerning the stability of the TT decomposition and of ALS and MALS, but also show that in some cases high TT ranks are required during the iterative approximation of low-rank tensors, indicating some potential for improvement.

323 citations
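
For readers unfamiliar with the representation that ALS and MALS sweep over, here is a minimal numpy sketch of the tensor-train format itself: a chain of 3-way cores whose matrix-slice products give individual tensor entries. Mode sizes and ranks are illustrative assumptions, and the paper's ALS/MALS micro-iterations are not reproduced.

```python
# Hedged sketch of the tensor-train (TT) format: a d-way tensor is stored as
# cores G_k of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1,
# and entry A[i1,...,id] is the product of the corresponding core slices.
import numpy as np

rng = np.random.default_rng(0)
mode_sizes = [4, 5, 6]   # n_1, n_2, n_3 (illustrative)
ranks = [1, 3, 2, 1]     # r_0, r_1, r_2, r_3 (boundary ranks are 1)

# Random TT cores G_k with shape (r_{k-1}, n_k, r_k).
cores = [rng.standard_normal((ranks[k], mode_sizes[k], ranks[k + 1]))
         for k in range(len(mode_sizes))]

def tt_entry(cores, index):
    """Evaluate one tensor entry as a product of 1x r ... r x1 core slices."""
    result = np.eye(1)
    for core, i in zip(cores, index):
        result = result @ core[:, i, :]
    return result.item()

def tt_full(cores):
    """Contract all cores into the full dense tensor (small examples only)."""
    full = cores[0]                                         # (1, n_1, r_1)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([full.ndim - 1], [0]))
    return full.reshape(mode_sizes)                         # drop boundary ranks

A = tt_full(cores)
assert np.isclose(A[1, 2, 3], tt_entry(cores, (1, 2, 3)))
print("dense shape:", A.shape)   # (4, 5, 6)
```

An ALS sweep would fix all cores but one and solve a small linear problem for that core, moving back and forth along the chain; MALS optimizes two neighbouring cores jointly, which is what allows the rank between them to adapt.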


Authors

Showing all 27,602 results

Name | H-index | Papers | Citations
Markus Antonietti | 176 | 1068 | 127235
Jian Li | 133 | 2863 | 87131
Klaus-Robert Müller | 129 | 764 | 79391
Michael Wagner | 124 | 351 | 54251
Shi Xue Dou | 122 | 2028 | 74031
Xinchen Wang | 120 | 349 | 65072
Michael S. Feld | 119 | 552 | 51968
Jian Liu | 117 | 2090 | 73156
Ary A. Hoffmann | 113 | 907 | 55354
Stefan Grimme | 113 | 680 | 105087
David M. Karl | 112 | 461 | 48702
Lester Packer | 112 | 751 | 63116
Andreas Heinz | 108 | 1078 | 45002
Horst Weller | 105 | 451 | 44273
G. Hughes | 103 | 957 | 46632
Network Information
Related Institutions (5)
ETH Zurich: 122.4K papers, 5.1M citations (93% related)
École Polytechnique Fédérale de Lausanne: 98.2K papers, 4.3M citations (93% related)
RWTH Aachen University: 96.2K papers, 2.5M citations (93% related)
Technische Universität München: 123.4K papers, 4M citations (92% related)
École Normale Supérieure: 99.4K papers, 3M citations (92% related)

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 191
2022 | 650
2021 | 3,307
2020 | 3,387
2019 | 3,105
2018 | 2,910