Bernhard Nessler

Researcher at Johannes Kepler University of Linz

Publications -  21
Citations -  10623

Bernhard Nessler is an academic researcher from Johannes Kepler University of Linz. The author has contributed to research on topics such as artificial neural networks and Hebbian theory. The author has an h-index of 15 and has co-authored 20 publications receiving 7,406 citations. Previous affiliations of Bernhard Nessler include the Frankfurt Institute for Advanced Studies and Graz University of Technology.

Papers
Posted Content

GANs Trained by a Two Time-Scale Update Rule Converge to a Nash Equilibrium

TL;DR: In this article, a two time-scale update rule (TTUR) is proposed for training GANs with stochastic gradient descent on arbitrary GAN loss functions, using separate learning rates for the discriminator and the generator.
Proceedings Article

GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

TL;DR: In this paper, a two time-scale update rule (TTUR) is proposed for training GANs with stochastic gradient descent on arbitrary GAN loss functions, using separate learning rates for the discriminator and the generator.
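As a rough illustration of the two time-scale idea summarized above (not the authors' reference implementation), the sketch below trains a toy GAN in PyTorch with separate Adam optimizers, giving the discriminator a larger learning rate than the generator. The networks, data, and learning-rate values are placeholders chosen for the example, not values from the paper.

```python
import torch
import torch.nn as nn

# Toy generator and discriminator; architectures and sizes are placeholders.
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

# Two time-scale update rule: separate learning rates for D and G
# (the concrete values here are illustrative only).
opt_D = torch.optim.Adam(D.parameters(), lr=4e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)

bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 2)        # stand-in for a batch of real data
    z = torch.randn(64, 16)          # latent noise

    # Discriminator update on its own (faster) time scale.
    fake = G(z).detach()
    loss_D = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_D.zero_grad()
    loss_D.backward()
    opt_D.step()

    # Generator update on the slower time scale.
    fake = G(torch.randn(64, 16))
    loss_G = bce(D(fake), torch.ones(64, 1))
    opt_G.zero_grad()
    loss_G.backward()
    opt_G.step()
```

The only TTUR-specific ingredient is the pair of distinct learning rates; everything else is a generic GAN training loop.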
Journal Article

Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

TL;DR: A neural network model is proposed, and a rigorous theoretical analysis shows that its neural activity implements MCMC sampling from a given distribution, for both discrete and continuous time.
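For intuition about the sampling claim above, the following is a simplified discrete-time stand-in rather than the paper's spiking neuron model: a Gibbs chain over binary units of a Boltzmann-like distribution, in which each unit "fires" with a sigmoid probability of its membrane potential. The weights, biases, and network size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small Boltzmann-like distribution p(z) ∝ exp(z·Wz/2 + b·z) over binary units;
# parameters below are illustrative only.
n = 5
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2.0          # symmetric coupling
np.fill_diagonal(W, 0.0)     # no self-coupling
b = rng.normal(scale=0.5, size=n)

z = rng.integers(0, 2, size=n).astype(float)
samples = []
for t in range(5000):
    k = t % n                                   # update one unit per time step
    u = W[k] @ z + b[k]                         # "membrane potential" of unit k
    p_fire = 1.0 / (1.0 + np.exp(-u))           # sigmoid firing probability
    z[k] = float(rng.random() < p_fire)         # stochastic spike / no spike
    samples.append(z.copy())

# The chain's empirical marginals approximate those of the target distribution.
print(np.mean(samples, axis=0))
```

Each update is a valid Gibbs step, so the long-run statistics of the network's states converge to the target distribution; the paper's contribution is showing how such sampling can be carried out by the dynamics of recurrent spiking neurons.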
Journal Article

Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity

TL;DR: The results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes.

Speeding up Semantic Segmentation for Autonomous Driving

TL;DR: A novel deep network architecture for image segmentation is proposed that maintains high accuracy while being efficient enough for embedded devices, and it achieves higher segmentation accuracy than other networks tailored to embedded devices.