Author

Richard Courant

Bio: Richard Courant is an academic researcher from New York University. The author has contributed to research in topics including differential equations and numerical partial differential equations. The author has an h-index of 35 and co-authored 117 publications receiving 31,604 citations. Previous affiliations of Richard Courant include the University of Göttingen and the Courant Institute of Mathematical Sciences.


Papers
Book
01 Jan 1947
TL;DR: In this book, the authors develop the algebra of linear transformations and quadratic forms and apply variational methods to eigenvalue problems.
Abstract: Partial table of contents: THE ALGEBRA OF LINEAR TRANSFORMATIONS AND QUADRATIC FORMS. Transformation to Principal Axes of Quadratic and Hermitian Forms. Minimum-Maximum Property of Eigenvalues. SERIES EXPANSION OF ARBITRARY FUNCTIONS. Orthogonal Systems of Functions. Measure of Independence and Dimension Number. Fourier Series. Legendre Polynomials. LINEAR INTEGRAL EQUATIONS. The Expansion Theorem and Its Applications. Neumann Series and the Reciprocal Kernel. The Fredholm Formulas. THE CALCULUS OF VARIATIONS. Direct Solutions. The Euler Equations. VIBRATION AND EIGENVALUE PROBLEMS. Systems of a Finite Number of Degrees of Freedom. The Vibrating String. The Vibrating Membrane. Green's Function (Influence Function) and Reduction of Differential Equations to Integral Equations. APPLICATION OF THE CALCULUS OF VARIATIONS TO EIGENVALUE PROBLEMS. Completeness and Expansion Theorems. Nodes of Eigenfunctions. SPECIAL FUNCTIONS DEFINED BY EIGENVALUE PROBLEMS. Bessel Functions. Asymptotic Expansions. Additional Bibliography. Index.

7,426 citations
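
The "Minimum-Maximum Property of Eigenvalues" in the table of contents refers to what is now called the Courant–Fischer characterization. A standard modern statement (paraphrased, not quoted from the book) for a real symmetric n × n matrix A with eigenvalues ordered λ₁ ≤ … ≤ λₙ:

```latex
\lambda_k \;=\; \min_{\substack{S \subseteq \mathbb{R}^n \\ \dim S = k}} \;\; \max_{\substack{x \in S \\ x \neq 0}} \; \frac{x^{\top} A x}{x^{\top} x}, \qquad k = 1, \dots, n.
```

Taking k = 1 recovers the Rayleigh-quotient characterization of the smallest eigenvalue.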

Journal ArticleDOI
TL;DR and abstract: identical to the 1947 book entry above.

4,525 citations

Journal ArticleDOI
TL;DR and abstract: not available; the digitized record contains only the DigiZeitschriften terms-of-use notice (a rights statement from the digitization service) in place of an abstract.

3,675 citations

Book
01 Jan 1948
TL;DR: In this book, the authors treat compressible and supersonic flow and the theory of shock waves.
Abstract: Keywords: compressible flow; supersonic flow; shock waves. Reference record created on 2005-11-18, modified on 2016-08-08.

3,239 citations

Book
04 Sep 2011
TL;DR: In this paper, a discussion of the behavior of the solution as the mesh width tends to zero is presented, and the applicability of the method to more general difference equations and to those with arbitrarily many independent variables is made clear.
Abstract: Problems involving the classical linear partial differential equations of mathematical physics can be reduced to algebraic ones of a very much simpler structure by replacing the differentials by difference quotients on some (say rectilinear) mesh. This paper will undertake an elementary discussion of these algebraic problems, in particular of the behavior of the solution as the mesh width tends to zero. For present purposes we limit ourselves mainly to simple but typical cases, and treat them in such a way that the applicability of the method to more general difference equations and to those with arbitrarily many independent variables is made clear.

2,047 citations
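
The abstract above describes replacing derivatives by difference quotients on a mesh and studying the solution as the mesh width tends to zero; the stability restriction now known as the CFL condition comes from this line of work. Below is a minimal illustrative sketch (mine, not the paper's) of an explicit finite-difference scheme for the 1D wave equation, with the mesh ratio c·Δt/Δx held at or below 1:

```python
# Minimal sketch: explicit finite differences for the 1D wave equation
# u_tt = c^2 u_xx, replacing derivatives by difference quotients on a
# rectilinear mesh. The mesh ratio c*dt/dx <= 1 is the CFL-type restriction.
import numpy as np

def solve_wave(c=1.0, L=1.0, T=1.0, nx=101, cfl=0.9):
    dx = L / (nx - 1)
    dt = cfl * dx / c                 # time step chosen so c*dt/dx = cfl <= 1
    nt = int(T / dt)
    x = np.linspace(0.0, L, nx)
    u_prev = np.sin(np.pi * x)        # initial displacement
    u = u_prev.copy()                 # zero initial velocity (first-order start)
    r2 = (c * dt / dx) ** 2
    for _ in range(nt):
        u_next = np.empty_like(u)
        # leapfrog update: second differences in time and space
        u_next[1:-1] = 2*u[1:-1] - u_prev[1:-1] + r2*(u[2:] - 2*u[1:-1] + u[:-2])
        u_next[0] = u_next[-1] = 0.0  # fixed ends
        u_prev, u = u, u_next
    return x, u
```

Choosing cfl above 1.0 makes the computed solution grow without bound as the mesh is refined, which illustrates why the ratio of time step to mesh width matters.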


Cited by
Journal ArticleDOI
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract: The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

37,861 citations
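
As an illustration of the soft-margin machine with a polynomial input transformation described above, here is a minimal sketch using scikit-learn (my choice of library; the paper of course predates it) on a small OCR-style dataset, echoing the paper's character-recognition benchmark:

```python
# Minimal sketch of a soft-margin support-vector network with a polynomial
# kernel, in the spirit of the paper; not the authors' implementation.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)           # small handwritten-digit dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="poly", degree=3, C=1.0)     # C controls the soft margin
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```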

Book
01 Nov 2008
TL;DR: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.
Abstract: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems. For this new edition the book has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are used widely in practice and the focus of much current research. Because of the emphasis on practical methods, as well as the extensive illustrations and exercises, the book is accessible to a wide audience. It can be used as a graduate text in engineering, operations research, mathematics, computer science, and business. It also serves as a handbook for researchers and practitioners in the field. The authors have strived to produce a text that is pleasant to read, informative, and rigorous - one that reveals both the beautiful nature of the discipline and its practical side.

17,420 citations
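
As a small illustration of the continuous-optimization methods the book covers, here is a sketch using SciPy (my choice of library, not the book's) to minimize the Rosenbrock test function with a quasi-Newton method:

```python
# Minimal sketch: unconstrained continuous optimization with BFGS, one of
# the quasi-Newton methods treated in the book.
from scipy.optimize import minimize, rosen, rosen_der

x0 = [1.3, 0.7, 0.8, 1.9, 1.2]                # arbitrary starting point
res = minimize(rosen, x0, jac=rosen_der, method="BFGS")
print(res.x)                                  # converges to the minimizer (1, ..., 1)
```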

Journal ArticleDOI
TL;DR: There are several arguments which support the observed high accuracy of SVMs, which are reviewed and numerous examples and proofs of most of the key theorems are given.
Abstract: The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-separable data, working through a non-trivial example in detail. We describe a mechanical analogy, and discuss when SVM solutions are unique and when they are global. We describe how support vector training can be practically implemented, and discuss in detail the kernel mapping technique which is used to construct SVM solutions which are nonlinear in the data. We show how Support Vector machines can have very large (even infinite) VC dimension by computing the VC dimension for homogeneous polynomial and Gaussian radial basis function kernels. While very high VC dimension would normally bode ill for generalization performance, and while at present there exists no theory which shows that good generalization performance is guaranteed for SVMs, there are several arguments which support the observed high accuracy of SVMs, which we review. Results of some experiments which were inspired by these arguments are also presented. We give numerous examples and proofs of most of the key theorems. There is new material, and I hope that the reader will find that even old material is cast in a fresh light.

15,696 citations
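
The kernel mapping technique the tutorial describes replaces explicit feature-space coordinates with a Gram matrix of pairwise kernel evaluations. A minimal sketch (mine, not the tutorial's code) of the two kernel families it analyzes, polynomial and Gaussian RBF:

```python
# Minimal sketch of the kernel mapping idea: the Gram matrix
# K[i, j] = k(x_i, x_j) stands in for inner products in feature space.
import numpy as np

def poly_kernel(X, Y, degree=2):
    return (X @ Y.T + 1.0) ** degree          # homogeneous if the +1 is dropped

def rbf_kernel(X, Y, gamma=0.5):
    # ||x - y||^2 expanded as x.x - 2 x.y + y.y for vectorization
    sq = (X**2).sum(1)[:, None] - 2 * X @ Y.T + (Y**2).sum(1)[None, :]
    return np.exp(-gamma * sq)

X = np.random.default_rng(0).normal(size=(5, 3))
print(poly_kernel(X, X).shape, rbf_kernel(X, X).shape)   # both (5, 5)
```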

Proceedings ArticleDOI
01 Jul 1992
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Abstract: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC-dimension are given. Experimental results on optical character recognition problems demonstrate the good generalization obtained when compared with other learning algorithms.

11,211 citations
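
In modern notation (not the paper's own), the maximum-margin training problem it describes is usually written as the dual quadratic program over multipliers α:

```latex
\max_{\alpha \ge 0} \; W(\alpha) \;=\; \sum_{i=1}^{m} \alpha_i \;-\; \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m} \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
\quad \text{subject to} \quad \sum_{i=1}^{m} \alpha_i y_i = 0,
```

with decision function f(x) = sgn(Σᵢ αᵢ yᵢ K(xᵢ, x) + b). The supporting patterns are exactly the training points with αᵢ > 0, which is why the solution is a linear combination of the patterns closest to the decision boundary.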

Journal ArticleDOI
TL;DR: A new method for performing a nonlinear form of principal component analysis by the use of integral operator kernel functions is proposed and experimental results on polynomial feature extraction for pattern recognition are presented.
Abstract: A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.

8,175 citations
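
The method described above computes principal components in feature space by eigendecomposing a centered kernel matrix rather than a covariance matrix. A minimal sketch (my own, under the assumption of a Gaussian kernel; the paper also uses polynomial kernels):

```python
# Minimal sketch of kernel PCA: eigendecompose the double-centered kernel
# matrix in place of the feature-space covariance.
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.5):
    n = X.shape[0]
    sq = (X**2).sum(1)[:, None] - 2 * X @ X.T + (X**2).sum(1)[None, :]
    K = np.exp(-gamma * sq)                     # Gaussian kernel Gram matrix
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # centering in feature space
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]      # largest first
    # normalize expansion coefficients so feature-space eigenvectors are unit
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas                          # projections of training points

X = np.random.default_rng(0).normal(size=(20, 5))
print(kernel_pca(X).shape)                      # (20, 2)
```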