scispace - formally typeset
Author

Uwe Naumann

Bio: Uwe Naumann is an academic researcher at RWTH Aachen University. He has contributed to research on automatic differentiation and the Jacobian matrix and determinant, has an h-index of 24, and has co-authored 130 publications receiving 2,299 citations. His previous affiliations include the University of Hertfordshire and Argonne National Laboratory.


Papers
Book
01 Jan 2002
TL;DR: Automatic Differentiation of Algorithms provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use.
Abstract: Automatic Differentiation (AD) is a maturing computational technology. It has become a mainstream tool used by practicing scientists and computer engineers. The rapid advance of hardware computing power and AD tools has enabled practitioners to generate derivative enhanced versions of their code for a broad range of applications in applied research and development. Automatic Differentiation of Algorithms provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use.

328 citations
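The derivative-enhanced code that AD tools generate can be illustrated with forward-mode AD via operator overloading. Below is a minimal, illustrative dual-number sketch in Python (no particular AD tool's API is implied): each arithmetic operation propagates a tangent alongside the value, so the derivative emerges from ordinary evaluation.

```python
# Minimal sketch of forward-mode automatic differentiation with dual
# numbers; all names here are illustrative, not any AD tool's API.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val  # function value
        self.dot = dot  # tangent (derivative) carried alongside

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule applied per elementary operation
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

x = Dual(2.0, 1.0)  # seed the tangent dx/dx = 1
y = f(x)
print(y.val, y.dot)  # 17.0 14.0
```

Seeding the input's tangent with 1.0 yields the exact derivative f'(2) = 14 alongside the value, with no finite-difference truncation error.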

BookDOI
01 Jan 2002
TL;DR: Automatic Differentiation (AD) is a maturing computational technology, as discussed by the authors, that has become a mainstream tool for practicing scientists and computer engineers in applied research and development.
Abstract: Automatic Differentiation (AD) is a maturing computational technology. It has become a mainstream tool used by practicing scientists and computer engineers. The rapid advance of hardware computing power and AD tools has enabled practitioners to generate derivative enhanced versions of their code for a broad range of applications in applied research and development. Automatic Differentiation of Algorithms provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use.

199 citations

Journal ArticleDOI
TL;DR: The OpenAD/F tool allows the evaluation of derivatives of functions defined by a Fortran program, and supports various code-reversal schemes with hierarchical checkpointing at the subroutine level for the generation of adjoint codes.
Abstract: The OpenAD/F tool allows the evaluation of derivatives of functions defined by a Fortran program. The derivative evaluation is performed by a Fortran code resulting from the analysis and transformation of the original program that defines the function of interest. OpenAD/F has been designed with a particular emphasis on modularity, flexibility, and the use of open source components. While the code transformation follows the basic principles of automatic differentiation, the tool implements new algorithmic approaches at various levels, for example, for basic block preaccumulation and call graph reversal. Unlike most other automatic differentiation tools, OpenAD/F uses components provided by the OpenAD framework, which supports a comparatively easy extension of the code transformations in a language-independent fashion. It uses code analysis results implemented in the OpenAnalysis component. The interface to the language-independent transformation engine is an XML-based format, specified through an XML schema. The implemented transformation algorithms allow efficient derivative computations using locally optimized cross-country sequences of vertex, edge, and face elimination steps. Specifically, for the generation of adjoint codes, OpenAD/F supports various code reversal schemes with hierarchical checkpointing at the subroutine level. As an example from geophysical fluid dynamics, a nonlinear time-dependent scalable, yet simple, barotropic ocean model is considered. OpenAD/F's reverse mode is applied to compute sensitivities of some of the model's transport properties with respect to gridded fields such as bottom topography as independent (control) variables.

170 citations
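Source-transformation tools such as OpenAD/F generate adjoint code that performs a forward sweep, records intermediate information, and then propagates adjoints backward. The toy tape-based sketch below illustrates only that reverse-mode principle; all names are hypothetical and none of OpenAD/F's actual interfaces are used.

```python
# Toy reverse-mode (adjoint) sketch with an explicit tape. Illustrative
# only; names do not correspond to OpenAD/F's generated code.
import math

tape = []  # records (output_var, input_vars, local_partials)

def var(v):
    return {"val": v, "adj": 0.0}

def mul(a, b):
    out = var(a["val"] * b["val"])
    tape.append((out, [a, b], [b["val"], a["val"]]))
    return out

def sin(a):
    out = var(math.sin(a["val"]))
    tape.append((out, [a], [math.cos(a["val"])]))
    return out

# Forward sweep: y = sin(x1 * x2)
x1, x2 = var(2.0), var(3.0)
y = sin(mul(x1, x2))

# Reverse sweep: propagate adjoints from y back to the inputs
y["adj"] = 1.0
for out, ins, partials in reversed(tape):
    for v, p in zip(ins, partials):
        v["adj"] += p * out["adj"]

print(x1["adj"], x2["adj"])  # dy/dx1 = 3*cos(6), dy/dx2 = 2*cos(6)
```

One reverse sweep yields the gradient with respect to all inputs at once, which is why adjoint codes are attractive when there are many control variables (e.g. gridded bottom topography) and few outputs.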

BookDOI
21 Jul 2008
TL;DR: This collection covers advances in automatic differentiation theory and practice and discusses various applications, which provide insight into effective strategies for using automatic differentiation for inverse problems and design optimization.
Abstract: This collection covers advances in automatic differentiation theory and practice. Computer scientists and mathematicians will learn about recent developments in automatic differentiation theory as well as mechanisms for the construction of robust and powerful automatic differentiation tools. Computational scientists and engineers will benefit from the discussion of various applications, which provide insight into effective strategies for using automatic differentiation for inverse problems and design optimization.

96 citations

Journal ArticleDOI
TL;DR: It is shown that the problem of accumulating Jacobian matrices by using a minimal number of floating-point operations is NP-complete by reduction from Ensemble Computation.
Abstract: We show that the problem of accumulating Jacobian matrices by using a minimal number of floating-point operations is NP-complete by reduction from Ensemble Computation. The proof makes use of the fact that, deviating from the state-of-the-art assumption, algebraic dependences can exist between the local partial derivatives. It follows immediately that the same problem for directional derivatives, adjoints, and higher derivatives is NP-complete, too.

78 citations
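The hardness result concerns choosing an accumulation order that minimizes floating-point operations. As a loose illustration of why ordering matters (an analogy in the spirit of matrix-chain multiplication, not the Ensemble Computation reduction itself), accumulating the Jacobian of three chained stages by multiplying local Jacobians costs a different number of multiplications depending on the association order; the dimensions below are made up for the example.

```python
# Cost of accumulating J = C @ B @ A, the product of local Jacobians of
# three chained stages x -> u -> v -> y. Dimensions are illustrative.
def mult_cost(m, k, n):
    # classical multiplication count of an (m x k) @ (k x n) product
    return m * k * n

# Shapes: A = du/dx is 10x5, B = dv/du is 10x10, C = dy/dv is 2x10.
# Order 1: (C @ B) @ A  -- contract toward the inputs first
cost1 = mult_cost(2, 10, 10) + mult_cost(2, 10, 5)
# Order 2: C @ (B @ A)  -- contract toward the outputs first
cost2 = mult_cost(10, 10, 5) + mult_cost(2, 10, 5)

print(cost1, cost2)  # 300 600
```

For general computational graphs with vertex, edge, and face eliminations (and possible algebraic dependences among local partials), the search space is vastly larger, which is what makes the minimal-operation-count problem NP-complete.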


Cited by
Book
18 Nov 2016
TL;DR: Deep learning, as presented in this book, is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts; it is applied in natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames.
Abstract: Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.

38,208 citations
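The "hierarchy of concepts" idea can be made concrete with a toy two-layer feedforward network whose hidden units compute simple logical features that the output layer combines into a more complicated concept (XOR). The weights below are hand-chosen for illustration, not learned.

```python
# Toy two-layer feedforward network: simple hidden features composed
# into a harder concept (XOR). Weights are hand-picked, not trained.
def step(z):
    return 1.0 if z > 0 else 0.0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit: x1 OR x2
    h2 = step(-x1 - x2 + 1.5)   # hidden unit: NOT (x1 AND x2)
    return step(h1 + h2 - 1.5)  # output: h1 AND h2, i.e. XOR

print([xor_net(a, b) for a in (0, 1) for b in (0, 1)])  # [0.0, 1.0, 1.0, 0.0]
```

No single-layer network can represent XOR; stacking one extra layer of simpler concepts suffices, which is the book's motivating picture of depth.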

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are presented, along with neural networks, kernel methods, graphical models, approximate inference, and a discussion of combining models, in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

01 Jan 1990
TL;DR: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.
Abstract: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.

2,933 citations
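The self-organizing map algorithm referred to above can be sketched in a few lines: repeatedly pick an input, find the best-matching unit, and pull it toward the input. This toy one-dimensional version shrinks the neighborhood to the winner only (a full SOM also updates grid neighbors), and all parameters are illustrative.

```python
# Toy 1-D self-organizing map: winner-take-all updates only, so the
# neighborhood function of the full algorithm is omitted for brevity.
import random

random.seed(0)
data = [0.1, 0.15, 0.5, 0.55, 0.9, 0.95]  # 1-D inputs in 3 clusters
weights = [0.2, 0.6, 0.8]                  # 3 map units
lr = 0.3                                   # learning rate

for _ in range(100):
    x = random.choice(data)
    # best-matching unit: the map unit closest to the input
    bmu = min(range(len(weights)), key=lambda i: abs(weights[i] - x))
    weights[bmu] += lr * (x - weights[bmu])

print([round(w, 2) for w in weights])
```

After training, each unit's weight settles near one cluster of the data, which is the map's "ordering" of the input distribution in miniature.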

Book
16 Apr 2009
TL;DR: This unique book provides an introduction to a subject whose use has steadily increased over the past 40 years, and provides broad coverage of the subject as well as the historical perspective of one of the originators of modern interval analysis.
Abstract: This unique book provides an introduction to a subject whose use has steadily increased over the past 40 years. An update of Ramon Moore's previous books on the topic, it provides broad coverage of the subject as well as the historical perspective of one of the originators of modern interval analysis. The authors provide a hands-on introduction to INTLAB, a high-quality, comprehensive MATLAB toolbox for interval computations, making this the first interval analysis book that does with INTLAB what general numerical analysis texts do with MATLAB. Readers will find the following features of interest: elementary motivating examples and notes that help maximize the reader's chance of success in applying the techniques; exercises and hands-on MATLAB-based examples woven into the text; INTLAB-based examples and explanations integrated into the text, along with a comprehensive set of exercises and solutions, and an appendix with INTLAB commands; an extensive bibliography and appendices that will continue to be valuable resources once the reader is familiar with the subject; and a Web page with links to computational tools and other resources of interest. Audience: Introduction to Interval Analysis will be valuable to engineers and scientists interested in scientific computation, especially in reliability, effects of roundoff error, and automatic verification of results. The introductory material is particularly important for experts in global optimization and constraint solution algorithms. This book is suitable for introducing the subject to students in these areas.
Contents: Preface; Chapter 1: Introduction; Chapter 2: The Interval Number System; Chapter 3: First Applications of Interval Arithmetic; Chapter 4: Further Properties of Interval Arithmetic; Chapter 5: Introduction to Interval Functions; Chapter 6: Interval Sequences; Chapter 7: Interval Matrices; Chapter 8: Interval Newton Methods; Chapter 9: Integration of Interval Functions; Chapter 10: Integral and Differential Equations; Chapter 11: Applications; Appendix A: Sets and Functions; Appendix B: Formulary; Appendix C: Hints for Selected Exercises; Appendix D: Internet Resources; Appendix E: INTLAB Commands and Functions; References; Index.

2,070 citations
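The interval arithmetic the book introduces (and INTLAB implements in full, including outward rounding) can be sketched with a toy class whose operations return intervals that enclose every possible pointwise result. This is illustrative only and not INTLAB's API.

```python
# Toy interval arithmetic: each operation returns an interval enclosing
# all results of the pointwise operation over the operand intervals.
# Real implementations such as INTLAB also round outward to guarantee
# enclosure under floating point; that is omitted here.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # sign patterns make min/max over the corner products necessary
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(-1, 2)
y = Interval(3, 4)
print(x + y, x * y)  # [2, 6] [-4, 8]
```

Taking min/max over all four corner products handles intervals that straddle zero, which is exactly the case where naive endpoint multiplication fails.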

01 Mar 1987
TL;DR: The variable-order Adams method (SIVA/DIVA) package is a collection of subroutines for the solution of nonstiff ordinary differential equations.
Abstract: The initial-value ordinary differential equation solver via the variable-order Adams method (SIVA/DIVA) is a collection of subroutines for the solution of nonstiff ordinary differential equations. There are versions for single-precision and double-precision arithmetic. It requires fewer evaluations of derivatives than other variable-order Adams predictor/corrector methods. An option for direct integration of second-order equations makes the integration of trajectory problems significantly more efficient. The package is written in FORTRAN 77.

1,955 citations
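A fixed-order version of the Adams predictor/corrector step that SIVA/DIVA generalizes to variable order can be sketched as follows: a 2-step Adams-Bashforth predictor followed by an Adams-Moulton (trapezoidal) corrector, applied to the test problem y' = -y. This is illustrative only; the actual package varies order and step size adaptively and is written in FORTRAN 77, not Python.

```python
# Fixed-order Adams PECE (predict-evaluate-correct-evaluate) sketch for
# y' = -y, y(0) = 1, exact solution exp(-t). Illustrative only.
import math

def f(t, y):
    return -y  # test problem

h = 0.1
ts = [0.0, h]
ys = [1.0]
# Bootstrap the second starting value with a Heun (improved Euler) step
k = f(ts[0], ys[0])
ys.append(ys[0] + h * 0.5 * (k + f(ts[1], ys[0] + h * k)))
fs = [f(ts[0], ys[0]), f(ts[1], ys[1])]

for _ in range(9):  # march from t = 0.1 to t = 1.0
    tn, yn = ts[-1], ys[-1]
    # 2-step Adams-Bashforth predictor
    y_pred = yn + h * (1.5 * fs[-1] - 0.5 * fs[-2])
    # Adams-Moulton (trapezoidal) corrector, slope taken at the prediction
    y_new = yn + h * 0.5 * (f(tn + h, y_pred) + fs[-1])
    ts.append(tn + h)
    ys.append(y_new)
    fs.append(f(ts[-1], ys[-1]))

print(round(ys[-1], 4))  # should be close to exp(-1) ≈ 0.3679
```

Note the multistep structure: each step reuses previously stored derivative values (`fs`), which is why Adams methods need fewer derivative evaluations per step than one-step methods of comparable order.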