Wojciech Samek
Researcher at Heinrich Hertz Institute
Publications - 176
Citations - 11971
Wojciech Samek is an academic researcher at the Heinrich Hertz Institute whose work focuses on artificial neural networks and deep learning. He has an h-index of 37 and has co-authored 169 publications receiving 6,100 citations. Previous affiliations include the Technical University of Berlin and the Fraunhofer Society.
Papers
Posted Content
Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models
TL;DR: Presents two approaches to explaining the predictions of deep learning models: one computes the sensitivity of the prediction with respect to changes in the input, and the other meaningfully decomposes the decision in terms of the input variables.
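The first of the two approaches, sensitivity analysis, can be sketched on a toy model (a hypothetical two-layer ReLU network with made-up weights, not taken from the paper): the relevance of each input feature is the squared partial derivative of the prediction with respect to that feature.

```python
import numpy as np

# Illustrative weights for a tiny two-layer ReLU network,
# f(x) = w2 . relu(W1 @ x). Values are arbitrary, not from the paper.
W1 = np.array([[0.5, -0.2, 0.1, 0.3],
               [-0.4, 0.6, 0.2, -0.1],
               [0.2, 0.1, -0.5, 0.4]])
w2 = np.array([1.0, -0.5, 0.8])

def predict(x):
    return w2 @ np.maximum(0.0, W1 @ x)

def sensitivity(x):
    """Squared partial derivatives of f w.r.t. each input feature."""
    pre = W1 @ x
    mask = (pre > 0).astype(float)   # ReLU gate: 1 where the unit is active
    grad = (w2 * mask) @ W1          # df/dx by the chain rule
    return grad ** 2                 # one non-negative score per feature

x = np.array([1.0, 2.0, -1.0, 0.5])
print(sensitivity(x))  # higher score = prediction more sensitive to that feature
```

The squared gradient answers "which features would change the prediction most if perturbed", whereas the decomposition approach in the paper instead redistributes the prediction value itself onto the input variables.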
Journal ArticleDOI
Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data
TL;DR: The authors propose sparse ternary compression (STC), a new compression framework designed specifically to meet the requirements of the federated learning environment. STC extends the existing technique of top-k gradient sparsification with a novel mechanism that enables downstream compression, as well as ternarization and optimal Golomb encoding of the weight updates.
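The sparsification and ternarization stages can be sketched as follows (a minimal illustration with our own parameter names; the positional and Golomb coding stages of STC are omitted):

```python
import numpy as np

def stc_sparsify_ternarize(delta_w, sparsity=0.01):
    """Top-k sparsification followed by ternarization of a weight update.

    Keeps only the k = sparsity * n largest-magnitude entries and replaces
    each survivor with sign * mu, where mu is the mean magnitude of the
    kept entries. The result is describable by one float (mu) plus the
    positions and signs of the non-zeros, which is what makes the
    subsequent lossless encoding so compact.
    """
    flat = delta_w.ravel()
    k = max(1, int(sparsity * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of top-k magnitudes
    mu = np.abs(flat[idx]).mean()                  # shared magnitude
    out = np.zeros_like(flat)
    out[idx] = mu * np.sign(flat[idx])             # values in {-mu, 0, +mu}
    return out.reshape(delta_w.shape)
```

For example, compressing a 10-element update at 30% sparsity keeps the three largest entries and snaps them to a common magnitude, zeroing everything else.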
Journal ArticleDOI
Unmasking Clever Hans Predictors and Assessing What Machines Really Learn
Sebastian Lapuschkin, Stephan Wäldchen, Alexander Binder, Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller +7 more
TL;DR: Investigates how these learning machines arrive at their decisions in order to assess the dependability of their decision making, and proposes semi-automated Spectral Relevance Analysis as a practically effective way of characterizing and validating the behavior of nonlinear learning machines.
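The clustering step at the heart of Spectral Relevance Analysis can be illustrated with a heavily simplified sketch (our own toy version, not the authors' implementation): treat each flattened relevance heatmap as a point, build an affinity graph over the dataset, and split it with the sign of the graph Laplacian's Fiedler vector. A "Clever Hans" strategy would show up as one well-separated cluster of heatmaps.

```python
import numpy as np

def spray_bisect(heatmaps):
    """Two-way spectral split of flattened relevance heatmaps (toy sketch).

    Builds an RBF affinity matrix from pairwise squared distances, forms
    the unnormalized graph Laplacian, and partitions the samples by the
    sign of the eigenvector of the second-smallest eigenvalue.
    """
    X = np.asarray(heatmaps, dtype=float)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    sigma2 = np.median(d2[d2 > 0])                       # bandwidth heuristic
    A = np.exp(-d2 / (2 * sigma2))                       # RBF affinities
    np.fill_diagonal(A, 0.0)
    L = np.diag(A.sum(axis=1)) - A                       # graph Laplacian
    _, vecs = np.linalg.eigh(L)                          # ascending eigenvalues
    fiedler = vecs[:, 1]                                 # Fiedler vector
    return (fiedler > 0).astype(int)                     # cluster labels 0/1
```

Heatmaps produced by the same prediction strategy land in the same partition; the analyst then inspects each cluster's heatmaps to judge whether the strategy is valid or an artifact of the training data.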
Posted Content
Robust and Communication-Efficient Federated Learning from Non-IID Data
TL;DR: Proposes sparse ternary compression (STC), a new compression framework designed specifically to meet the requirements of the federated learning environment, and advocates a paradigm shift in federated optimization toward high-frequency, low-bitwidth communication, particularly in bandwidth-constrained learning environments.