Institution

Technical University of Berlin

Education · Berlin, Germany
About: Technical University of Berlin is an education organization based in Berlin, Germany. It is known for its research contributions in the topics of Laser and Catalysis. The organization has 27292 authors who have published 59342 publications receiving 1414623 citations. The organization is also known as: Technische Universität Berlin & TU Berlin.


Papers
Journal ArticleDOI
01 Dec 2014
TL;DR: Presents the overall system architecture design decisions, introduces Stratosphere through example queries, and dives into the internal workings of the system's components relating to extensibility, programming model, optimization, and query execution.
Abstract: We present Stratosphere, an open-source software stack for parallel data analysis. Stratosphere brings together a unique set of features that allow the expressive, easy, and efficient programming of analytical applications at very large scale. Stratosphere's features include "in situ" data processing, a declarative query language, treatment of user-defined functions as first-class citizens, automatic program parallelization and optimization, support for iterative programs, and a scalable and efficient execution engine. Stratosphere covers a variety of "Big Data" use cases, such as data warehousing, information extraction and integration, data cleansing, graph analysis, and statistical analysis applications. In this paper, we present the overall system architecture design decisions, introduce Stratosphere through example queries, and then dive into the internal workings of the system's components that relate to extensibility, programming model, optimization, and query execution. We experimentally compare Stratosphere against popular open-source alternatives, and we conclude with a research outlook for the next years.
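The abstract highlights treating user-defined functions as first-class citizens in a parallel dataflow. The following is a minimal word-count-style sketch of that programming-model idea in plain Python; it is an illustration of the concept only, not Stratosphere's actual (Java/Scala-based) API:

```python
from collections import Counter
from functools import reduce

def mapper(line):
    # UDF: tokenize one record into (word, 1) pairs
    return [(w.lower(), 1) for w in line.split()]

def reducer(counts, pair):
    # UDF: fold one (word, count) pair into the running aggregate
    word, n = pair
    counts[word] += n
    return counts

def run_dataflow(records):
    # A real engine would parallelize the map and reduce phases;
    # here both UDFs are just ordinary functions handed to the "runtime".
    pairs = [p for rec in records for p in mapper(rec)]
    return reduce(reducer, pairs, Counter())

result = run_dataflow(["big data", "big analysis"])
# result["big"] == 2
```

In systems like Stratosphere, such UDFs are additionally subject to automatic parallelization and optimization by the engine.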

491 citations

Book ChapterDOI
02 Nov 2017
TL;DR: This work uses a simple and common pre-processing step (adding a constant shift to the input data) to show that a transformation with no effect on the model can cause numerous methods to attribute incorrectly.
Abstract: Saliency methods aim to explain the predictions of deep neural networks. These methods lack reliability when the explanation is sensitive to factors that do not contribute to the model prediction. We use a simple and common pre-processing step which can be compensated for easily—adding a constant shift to the input data—to show that a transformation with no effect on how the model makes the decision can cause numerous methods to attribute incorrectly. In order to guarantee reliability, we believe that the explanation should not change when we can guarantee that two networks process the images in identical manners. We show, through several examples, that saliency methods that do not satisfy this requirement result in misleading attribution. The approach can be seen as a type of unit test; we construct a narrow ground truth to measure one stated desirable property. As such, we hope the community will embrace the development of additional tests.
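The input-shift test described above can be reproduced in a few lines. This is a minimal sketch of our own construction (not the paper's code): two functionally identical linear models, where the second expects inputs shifted by a constant and compensates through its bias. Plain gradient saliency is invariant to the shift, while gradient × input is not:

```python
import numpy as np

w = np.array([1.0, -2.0, 0.5])
b = 0.3
c = np.array([10.0, 10.0, 10.0])   # constant input shift
b2 = b - w @ c                      # compensating bias: w@(x+c)+b2 == w@x+b

x = np.array([0.2, -0.4, 1.0])
assert np.isclose(w @ x + b, w @ (x + c) + b2)  # identical predictions

grad = w                 # plain gradient saliency: same for both models
gxi_1 = w * x            # gradient*input on the original network
gxi_2 = w * (x + c)      # gradient*input on the shift-compensated network
print(np.allclose(gxi_1, gxi_2))  # False: attribution changed although
                                  # the two models are functionally identical
```

This is exactly the paper's notion of a "unit test" for saliency methods: a narrow ground truth (the two models must receive identical attributions) that gradient × input fails.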

490 citations

Journal ArticleDOI
TL;DR: This review focuses on recent additions to TURBOMOLE’s functionality, including excited-state methods, RPA and Green's function methods, relativistic approaches, high-order molecular properties, solvation effects, and periodic systems.
Abstract: TURBOMOLE is a collaborative, multi-national software development project aiming to provide highly efficient and stable computational tools for quantum chemical simulations of molecules, clusters, periodic systems, and solutions. The TURBOMOLE software suite is optimized for widely available, inexpensive, and resource-efficient hardware such as multi-core workstations and small computer clusters. TURBOMOLE specializes in electronic structure methods with outstanding accuracy-cost ratio, such as density functional theory including local hybrids and the random phase approximation (RPA), GW-Bethe-Salpeter methods, second-order Moller-Plesset theory, and explicitly correlated coupled-cluster methods. TURBOMOLE is based on Gaussian basis sets and has been pivotal for the development of many fast and low-scaling algorithms in the past three decades, such as integral-direct methods, fast multipole methods, the resolution-of-the-identity approximation, imaginary frequency integration, Laplace transform, and pair natural orbital methods. This review focuses on recent additions to TURBOMOLE's functionality, including excited-state methods, RPA and Green's function methods, relativistic approaches, high-order molecular properties, solvation effects, and periodic systems. A variety of illustrative applications along with accuracy and timing data are discussed. Moreover, available interfaces to users as well as other software are summarized. TURBOMOLE's current licensing, distribution, and support model are discussed, and an overview of TURBOMOLE's development workflow is provided. Challenges such as communication and outreach, software infrastructure, and funding are highlighted.
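Among the low-scaling algorithms the abstract credits TURBOMOLE with pioneering, the resolution-of-the-identity (density-fitting) approximation expands four-center two-electron integrals over an auxiliary basis $\{P\}$:

```latex
(ij|kl) \;\approx\; \sum_{P,Q} (ij|P)\,\bigl[\mathbf{V}^{-1}\bigr]_{PQ}\,(Q|kl),
\qquad V_{PQ} = (P|Q),
```

which replaces the steep four-index integral evaluation with cheaper three- and two-index quantities and underlies the efficiency of many of the methods listed.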

489 citations

Journal ArticleDOI
TL;DR: In this article, a deep multi-task artificial neural network is used to predict multiple electronic ground and excited-state properties, such as atomization energy, polarizability, frontier orbital eigenvalues, ionization potential, electron affinity and excitation energies.
Abstract: The combination of modern scientific computing with electronic structure theory can lead to an unprecedented amount of data amenable to intelligent data analysis for the identification of meaningful, novel and predictive structure-property relationships. Such relationships enable high-throughput screening for relevant properties in an exponentially growing pool of virtual compounds that are synthetically accessible. Here, we present a machine learning model, trained on a database of ab initio calculation results for thousands of organic molecules, that simultaneously predicts multiple electronic ground- and excited-state properties. The properties include atomization energy, polarizability, frontier orbital eigenvalues, ionization potential, electron affinity and excitation energies. The machine learning model is based on a deep multi-task artificial neural network, exploiting the underlying correlations between various molecular properties. The input is identical to ab initio methods, i.e. nuclear charges and Cartesian coordinates of all atoms. For small organic molecules, the accuracy of such a "quantum machine" is similar, and sometimes superior, to modern quantum-chemical methods, at negligible computational cost.
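The multi-task architecture described (one network, many property outputs) amounts to a shared representation feeding one output head per property. The sketch below is an illustrative toy with hypothetical layer sizes and untrained random weights, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden = 30, 16   # e.g. a fixed-size molecular descriptor
tasks = ["atomization_energy", "polarizability", "homo", "lumo"]

W1 = rng.normal(size=(n_features, n_hidden))            # shared layer
heads = {t: rng.normal(size=n_hidden) for t in tasks}   # one head per task

def predict(x):
    # All tasks share the hidden representation h, which is what lets the
    # network exploit correlations between molecular properties.
    h = np.tanh(x @ W1)
    return {t: float(h @ w) for t, w in heads.items()}

preds = predict(rng.normal(size=n_features))
# one scalar prediction per property, all derived from the same hidden layer
```

Training would fit `W1` and all heads jointly against the ab initio reference data, so gradients from every property shape the shared representation.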

488 citations

Proceedings ArticleDOI
04 Nov 2009
TL;DR: Observations from monitoring the network activity for more than 20,000 residential DSL customers in an urban area find that HTTP - not peer-to-peer - traffic dominates by a significant margin and that the DSL lines are frequently not the bottleneck in bulk-transfer performance.
Abstract: While residential broadband Internet access is popular in many parts of the world, only a few studies have examined the characteristics of such traffic. In this paper we describe observations from monitoring the network activity for more than 20,000 residential DSL customers in an urban area. To ensure privacy, all data is immediately anonymized. We augment the anonymized packet traces with information about DSL-level sessions, IP (re-)assignments, and DSL link bandwidth. Our analysis reveals a number of surprises in terms of the mental models we developed from the measurement literature. For example, we find that HTTP - not peer-to-peer - traffic dominates by a significant margin; that more often than not the home user's immediate ISP connectivity contributes more to the round-trip times the user experiences than the WAN portion of the path; and that the DSL lines are frequently not the bottleneck in bulk-transfer performance.
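The per-protocol traffic breakdown behind findings like "HTTP dominates" reduces to aggregating byte counts over flow records. A minimal sketch on hypothetical flow data (the protocol names and byte counts below are illustrative, not the study's figures):

```python
from collections import defaultdict

# Hypothetical (protocol, bytes) flow records
flows = [("http", 7_000), ("p2p", 1_500), ("http", 2_000), ("dns", 500)]

def byte_share(flows):
    # Sum bytes per protocol, then normalize to fractions of total traffic.
    totals = defaultdict(int)
    for proto, nbytes in flows:
        totals[proto] += nbytes
    grand = sum(totals.values())
    return {p: b / grand for p, b in totals.items()}

shares = byte_share(flows)
# shares["http"] is about 0.82 on this toy data
```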

485 citations


Authors

Showing all 27602 results

Name | H-index | Papers | Citations
Markus Antonietti | 176 | 1068 | 127235
Jian Li | 133 | 2863 | 87131
Klaus-Robert Müller | 129 | 764 | 79391
Michael Wagner | 124 | 351 | 54251
Shi Xue Dou | 122 | 2028 | 74031
Xinchen Wang | 120 | 349 | 65072
Michael S. Feld | 119 | 552 | 51968
Jian Liu | 117 | 2090 | 73156
Ary A. Hoffmann | 113 | 907 | 55354
Stefan Grimme | 113 | 680 | 105087
David M. Karl | 112 | 461 | 48702
Lester Packer | 112 | 751 | 63116
Andreas Heinz | 108 | 1078 | 45002
Horst Weller | 105 | 451 | 44273
G. Hughes | 103 | 957 | 46632
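The H-index column above is defined as the largest h such that the author has h papers with at least h citations each; a standard computation from a citation list:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:      # the rank-th most-cited paper still has >= rank cites
            h = rank
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])  # -> 4
```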
Network Information
Related Institutions (5)
ETH Zurich
122.4K papers, 5.1M citations

93% related

École Polytechnique Fédérale de Lausanne
98.2K papers, 4.3M citations

93% related

RWTH Aachen University
96.2K papers, 2.5M citations

93% related

Technische Universität München
123.4K papers, 4M citations

92% related

École Normale Supérieure
99.4K papers, 3M citations

92% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 191
2022 | 650
2021 | 3,307
2020 | 3,387
2019 | 3,105
2018 | 2,910