Eldad Haber

Researcher at University of British Columbia

Publications: 231
Citations: 8696

Eldad Haber is an academic researcher at the University of British Columbia. The author has contributed to research on topics including inverse problems and discretization, has an h-index of 48, and has co-authored 221 publications receiving 7,328 citations. Previous affiliations of Eldad Haber include IBM and Emory University.

Papers
Journal Article (DOI)

Stable Architectures for Deep Neural Networks

TL;DR: New forward propagation techniques inspired by systems of ordinary differential equations (ODEs) are proposed that overcome the instability of forward propagation in very deep networks and lead to well-posed learning problems for arbitrarily deep architectures.
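The ODE reading can be made concrete: a residual layer Y_{j+1} = Y_j + h·σ(K_j Y_j + b_j) is a forward-Euler step of dY/dt = σ(K(t)Y + b(t)), and stability can be promoted by restricting the weights, for instance to antisymmetric matrices. The sketch below only illustrates that reading; the step size, width, depth, and tanh activation are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the ODE view of a residual network (not the paper's code):
# each layer is a forward-Euler step of dY/dt = tanh(K(t) Y + b(t)).
# Using an antisymmetric K = W - W^T (purely imaginary eigenvalues) is one of the
# stabilizing weight restrictions discussed for such architectures.
import numpy as np

def ode_resnet_forward(Y, weights, biases, h=0.1):
    """Propagate features Y (n x batch) through the layers in `weights`/`biases`."""
    for W, b in zip(weights, biases):
        K = W - W.T                     # antisymmetric kernel
        Y = Y + h * np.tanh(K @ Y + b)  # forward-Euler step
    return Y

rng = np.random.default_rng(0)
n, depth = 8, 50
weights = [0.1 * rng.standard_normal((n, n)) for _ in range(depth)]
biases = [np.zeros((n, 1)) for _ in range(depth)]
print(ode_resnet_forward(rng.standard_normal((n, 3)), weights, biases).shape)  # (8, 3)
```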
Journal Article (DOI)

Deep Neural Networks Motivated by Partial Differential Equations

TL;DR: In this article, a new PDE interpretation is established for a class of deep convolutional neural networks (CNNs), which are commonly used to learn from speech, image, and video data.
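In this view, convolution operators act as discretized differential operators, so the forward propagation of the features becomes a time-dependent PDE that can be restricted to well-understood classes. The formulas below are a hedged summary of that interpretation in generic notation, not a verbatim statement of the paper's equations.

```latex
% Forward propagation as a continuous process in artificial time t:
\partial_t Y(t) = f\bigl(\theta(t),\, Y(t)\bigr), \qquad Y(0) = Y_0,
% e.g. a diffusion-like (parabolic) choice built from a convolution K(\theta):
f(\theta, Y) = -K(\theta)^{\top}\,\sigma\bigl(K(\theta)\,Y + b\bigr).
```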
Journal Article (DOI)

On optimization techniques for solving nonlinear inverse problems

TL;DR: In this article, nonlinear inverse problems are formulated as constrained or unconstrained optimization problems; by employing sparse matrix techniques and casting the inversion as a sequential quadratic programming (SQP) problem, the authors show that variants of SQP and the full Newton iteration can be carried out with only a modest additional cost.
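The constrained formulation referred to here keeps the fields and the model as separate unknowns tied together by the discretized forward problem. The symbols below are generic inverse-problem notation, not necessarily the paper's.

```latex
% Constrained (all-at-once) form: fields u and model m are both unknowns
\min_{u,\,m}\ \tfrac{1}{2}\,\| Q u - d \|^{2} + \beta\, R(m)
\quad \text{subject to} \quad A(m)\, u = q,
% where A(m) u = q is the discretized forward problem, Q samples u at the data
% locations, d are the observed data, and R is a regularizer.  Eliminating
% u = A(m)^{-1} q gives the reduced, unconstrained problem; retaining the
% constraint leads to the SQP / full-Newton iterations discussed above.
```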
Journal Article (DOI)

Joint inversion: a structural approach

TL;DR: In this article, a joint inversion of two different data sets is proposed under the assumption that the underlying models share a common structure; the resulting nonlinear problem is solved iteratively using Krylov-subspace techniques.
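One generic way to write such a structural coupling is sketched below; the specific structure operator is defined in the paper, and the form shown here is an illustrative assumption.

```latex
% Two models, two data misfits, coupled only through their structure:
\min_{m_1,\, m_2}\ \phi_{d_1}(m_1) + \phi_{d_2}(m_2)
  + \beta\,\bigl\| S(m_1) - S(m_2) \bigr\|^{2},
% where \phi_{d_i} are the data misfits of the two surveys and S is a structure
% operator (e.g. a normalized measure of spatial variation of the model), so the
% penalty aligns where the models change without forcing their values to agree.
```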
Book Chapter (DOI)

Intensity gradient based registration and fusion of multi-modal images

TL;DR: This work investigates an alternative image distance measure based on normalized gradients, termed Normalized Gradient Fields (NGF), and compares its performance to Mutual Information.
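The NGF idea has a compact form: with n_eps(I, x) = ∇I(x) / sqrt(|∇I(x)|² + eps²), the measure penalizes 1 − ⟨n_eps(R, x), n_eps(T, x)⟩² over the image domain, so only the orientation of intensity gradients matters, which is what makes it usable across modalities. The NumPy sketch below illustrates that formula under these assumptions; the edge parameter eps and the test images are illustrative, and this is not the authors' implementation.

```python
# Minimal sketch of the Normalized Gradient Fields (NGF) idea, assuming the
# common formulation d(R, T) = sum over pixels of 1 - <n_eps(R), n_eps(T)>^2,
# where n_eps(I) = grad I / sqrt(|grad I|^2 + eps^2).  Illustrative only.
import numpy as np

def ngf_distance(R, T, eps=1e-2):
    """NGF dissimilarity between two 2-D images of equal shape (smaller = more aligned)."""
    gRy, gRx = np.gradient(R)
    gTy, gTx = np.gradient(T)
    nR = np.sqrt(gRx**2 + gRy**2 + eps**2)   # edge-strength-regularized norms
    nT = np.sqrt(gTx**2 + gTy**2 + eps**2)
    cos2 = ((gRx * gTx + gRy * gTy) / (nR * nT))**2
    return np.sum(1.0 - cos2)

R = np.fromfunction(lambda i, j: np.sin(0.2 * i) + 0.1 * j, (64, 64))
print(ngf_distance(R, R) < ngf_distance(R, R.T))  # True: aligned gradients score lower
```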