Institution
Technical University of Berlin
Education • Berlin, Germany
About: Technical University of Berlin is an educational institution based in Berlin, Germany. It is known for its research contributions in the topics of Laser and Catalysis. The organization has 27292 authors who have published 59342 publications receiving 1414623 citations. The organization is also known as Technische Universität Berlin and TU Berlin.
Topics: Laser, Catalysis, Quantum dot, Computer science, Context (language use)
Papers published on a yearly basis
Papers
TL;DR: The Scandinavian Approach to the development of computer-based systems is identified in certain common features shared by the different schools of thought, including efforts toward humanization and democratization as overriding design goals, in keeping with the aim of building an egalitarian society.
Abstract: This study set out to delineate the Scandinavian Approach to the development of computer-based systems. We aimed to help derive new ideas for human-oriented technology design in other countries. The study is based on the relevant literature, scientific contacts, and two field trips, and covers work in Denmark, Norway, and Sweden.
The study focuses on methodological questions and their theoretical foundations, on explicit strategies for social implementation, and on innovative design illustrated by reference to concrete projects. Though it makes no claim to present a sociopolitical analysis of Scandinavian technology design, the sociocultural background is given due consideration.
There is no general agreement among Scandinavians as to whether or not there is a well-defined Scandinavian Approach. We have come to identify such an approach in certain common features shared by the different schools of thought. These include efforts toward humanization and democratization as overriding design goals, in keeping with the aim of building an egalitarian society.
272 citations
TL;DR: This paper considers a time-division duplex system where uplink training is required and an active eavesdropper can attack the training phase to cause pilot contamination at the transmitter, and derives an asymptotic achievable secrecy rate when the number of transmit antennas approaches infinity.
Abstract: In this paper, we investigate secure and reliable transmission strategies for multi-cell multi-user massive multiple-input multiple-output systems with a multi-antenna active eavesdropper. We consider a time-division duplex system where uplink training is required and an active eavesdropper can attack the training phase to cause pilot contamination at the transmitter. This forces the precoder used in the subsequent downlink transmission phase to implicitly beamform toward the eavesdropper, thus increasing its received signal power. Assuming matched filter precoding and artificial noise (AN) generation at the transmitter, we derive an asymptotic achievable secrecy rate when the number of transmit antennas approaches infinity. For the case of a single-antenna active eavesdropper, we obtain a closed-form expression for the optimal power allocation policy for the transmit signal and the AN, and find the minimum transmit power required to ensure reliable secure communication. Furthermore, we show that the transmit antenna correlation diversity of the intended users and the eavesdropper can be exploited in order to improve the secrecy rate. In fact, under certain orthogonality conditions of the channel covariance matrices, the secrecy rate loss introduced by the eavesdropper can be completely mitigated.
272 citations
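The mechanism described in the abstract, matched-filter precoding that an active eavesdropper can skew toward itself, combined with artificial noise placed in the null space of the user's channel, can be sketched numerically. The sketch below is illustrative only and not the authors' simulation setup: it assumes perfect channel knowledge (no pilot contamination), i.i.d. Rayleigh fading, and hypothetical values for the antenna count `N` and the power split `phi`.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128          # transmit antennas (mimicking the massive-MIMO regime)
phi = 0.7        # fraction of power on the data signal; 1 - phi goes to AN

# i.i.d. Rayleigh channels to the intended user and the eavesdropper
h_user = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
h_eve  = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Matched-filter precoder built from the (here: perfect) user channel
w = h_user.conj() / np.linalg.norm(h_user)

# Artificial noise in the null space of the user channel: project random
# Gaussian beams onto the orthogonal complement of h_user
Z = rng.standard_normal((N, N - 1)) + 1j * rng.standard_normal((N, N - 1))
P = np.eye(N) - np.outer(h_user.conj(), h_user) / np.linalg.norm(h_user) ** 2
A = P @ Z
A /= np.linalg.norm(A, axis=0)          # unit-power AN beams

sig_user = phi * np.abs(h_user @ w) ** 2
an_user  = (1 - phi) / (N - 1) * np.sum(np.abs(h_user @ A) ** 2)
sig_eve  = phi * np.abs(h_eve @ w) ** 2
an_eve   = (1 - phi) / (N - 1) * np.sum(np.abs(h_eve @ A) ** 2)

# The AN lies in the null space of the user channel, so it barely hurts
# the user while jamming the eavesdropper
print(f"user: signal {sig_user:.2f}, AN leakage {an_user:.2e}")
print(f"eve : signal {sig_eve:.2f}, AN received {an_eve:.2f}")
```

With pilot contamination, the estimate used for `w` would contain a component of `h_eve`, which is exactly what beamforms power toward the eavesdropper in the attack the paper analyzes.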
01 Aug 2008
TL;DR: This method can be used to map a surface mesh to a parameter domain which is flat except for isolated cone singularities, and it is shown how these can be placed automatically in order to reduce the distortion of the parameterization.
Abstract: We present a new algorithm for conformal mesh parameterization. It is based on a precise notion of discrete conformal equivalence for triangle meshes which mimics the notion of conformal equivalence for smooth surfaces. The problem of finding a flat mesh that is discretely conformally equivalent to a given mesh can be solved efficiently by minimizing a convex energy function, whose Hessian turns out to be the well known cot-Laplace operator. This method can also be used to map a surface mesh to a parameter domain which is flat except for isolated cone singularities, and we show how these can be placed automatically in order to reduce the distortion of the parameterization. We present the salient features of the theory and elaborate the algorithms with a number of examples.
272 citations
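The convex energy mentioned in the abstract has the well-known cot-Laplace operator as its Hessian, and assembling that operator is short. A minimal sketch (dense matrix, no boundary or degeneracy handling; the function name and the two-triangle test mesh are made up for illustration):

```python
import numpy as np

def cot_laplacian(V, F):
    """Assemble the dense cotangent Laplacian of a triangle mesh.

    V: (n, 3) array of vertex positions; F: (m, 3) array of triangle
    vertex indices. Each off-diagonal entry is -(cot a + cot b)/2 for
    the angles opposite the edge; rows sum to zero.
    """
    n = len(V)
    L = np.zeros((n, n))
    for tri in F:
        for k in range(3):
            # Edge (i, j) with opposite vertex o inside this triangle
            i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
            u, v = V[i] - V[o], V[j] - V[o]
            cot = (u @ v) / np.linalg.norm(np.cross(u, v))
            L[i, j] -= cot / 2.0
            L[j, i] -= cot / 2.0
    # Set the diagonal so each row sums to zero (discrete Laplacian property)
    L -= np.diag(L.sum(axis=1))
    return L

# Two triangles forming a unit square in the plane
V = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
F = np.array([[0, 1, 2], [0, 2, 3]])
L = cot_laplacian(V, F)
```

The resulting matrix is symmetric positive semidefinite (for meshes without obtuse-angle sign flips), which is what makes minimizing the energy a convex problem.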
TL;DR: The experiments, complemented by microscopic modeling, reveal that the carrier relaxation is significantly slowed down as the photon energy is tuned to values below the optical-phonon frequency; however, owing to the presence of hot carriers, optical-phonon emission is still the predominant relaxation process.
Abstract: We study the carrier dynamics in epitaxially grown graphene in the range of photon energies from 10 to 250 meV. The experiments complemented by microscopic modeling reveal that the carrier relaxation is significantly slowed down as the photon energy is tuned to values below the optical-phonon frequency; however, owing to the presence of hot carriers, optical-phonon emission is still the predominant relaxation process. For photon energies about twice the value of the Fermi energy, a transition from pump-induced transmission to pump-induced absorption occurs due to the interplay of interband and intraband processes.
271 citations
TL;DR: In this article, the authors present a framework for data uncertainty assessment in life cycle inventories (LCI), where data uncertainty is divided into two categories: lack of data (data gaps) and data inaccuracy.
Abstract: Modelling data uncertainty is not common practice in life cycle inventories (LCI), although different techniques are available for estimating and expressing uncertainties, and for propagating the uncertainties to the final model results. To clarify and stimulate the use of data uncertainty assessments in common LCI practice, the SETAC working group ‘Data Availability and Quality’ presents a framework for data uncertainty assessment in LCI. Data uncertainty is divided into two categories: (1) lack of data, further specified as complete lack of data (data gaps) and a lack of representative data, and (2) data inaccuracy. Filling data gaps can be done by input-output modelling, using information for similar products or the main ingredients of a product, and applying the law of mass conservation. Lack of temporal, geographical and further technological correlation between the data used and needed may be accounted for by applying uncertainty factors to the non-representative data. Stochastic modelling, which can be performed by Monte Carlo simulation, is a promising technique to deal with data inaccuracy in LCIs.
271 citations
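The Monte Carlo simulation the abstract recommends for handling data inaccuracy can be illustrated in a few lines. The sketch below propagates lognormal uncertainties, parameterized by geometric mean and geometric standard deviation as is conventional in LCI databases, through a deliberately simple additive model; the exchange names and all numbers are invented for illustration and carry no LCI meaning.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical inventory: exchange scores (e.g. kg CO2-eq per unit) with
# lognormal uncertainty given as (geometric mean, geometric std. dev.)
exchanges = {
    "electricity": (2.0, 1.10),   # illustrative numbers only
    "transport":   (0.5, 1.30),
    "materials":   (1.2, 1.05),
}

# Sample every exchange and sum: the simplest possible inventory model
samples = sum(
    rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=n)
    for gm, gsd in exchanges.values()
)

# Report the propagated uncertainty of the total score
mean = samples.mean()
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"score: mean {mean:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

A real LCI would propagate the samples through the full inventory matrix rather than a plain sum, but the workflow, sample the uncertain inputs and read off percentiles of the output, is the same.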
Authors
Showing all 27602 results
Name | H-index | Papers | Citations |
---|---|---|---|
Markus Antonietti | 176 | 1068 | 127235 |
Jian Li | 133 | 2863 | 87131 |
Klaus-Robert Müller | 129 | 764 | 79391 |
Michael Wagner | 124 | 351 | 54251 |
Shi Xue Dou | 122 | 2028 | 74031 |
Xinchen Wang | 120 | 349 | 65072 |
Michael S. Feld | 119 | 552 | 51968 |
Jian Liu | 117 | 2090 | 73156 |
Ary A. Hoffmann | 113 | 907 | 55354 |
Stefan Grimme | 113 | 680 | 105087 |
David M. Karl | 112 | 461 | 48702 |
Lester Packer | 112 | 751 | 63116 |
Andreas Heinz | 108 | 1078 | 45002 |
Horst Weller | 105 | 451 | 44273 |
G. Hughes | 103 | 957 | 46632 |