Author

Jean-François Richard

Bio: Jean-François Richard is an academic researcher. The author has contributed to research in the topics of numerical integration and regression analysis, has an h-index of 2, and has co-authored 3 publications receiving 713 citations.

Papers
Book ChapterDOI
06 Jan 2000

784 citations

Book ChapterDOI
06 Jan 2000

2 citations

Book ChapterDOI
06 Jan 2000

1 citation


Cited by
Journal ArticleDOI
08 Nov 2004
TL;DR: Reviews the motivation, development, use, and implications of the unscented transformation (UT), which is more accurate and easier to implement than linearization while using the same order of calculations.
Abstract: The extended Kalman filter (EKF) is probably the most widely used estimation algorithm for nonlinear systems. However, more than 35 years of experience in the estimation community has shown that it is difficult to implement, difficult to tune, and only reliable for systems that are almost linear on the time scale of the updates. Many of these difficulties arise from its use of linearization. To overcome this limitation, the unscented transformation (UT) was developed as a method to propagate mean and covariance information through nonlinear transformations. It is more accurate, easier to implement, and uses the same order of calculations as linearization. This paper reviews the motivation, development, use, and implications of the UT.

6,098 citations
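
A minimal numerical sketch of the unscented transformation described above, assuming a standard sigma-point construction with a scaling parameter kappa; the test function (polar-to-Cartesian conversion) is an illustrative choice, not taken from the paper.

import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    n = mean.size
    # Sigma points: the mean plus/minus the columns of a scaled matrix square root of cov.
    sqrt_cov = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])   # (2n+1, n)
    weights = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    weights[0] = kappa / (n + kappa)
    # Propagate every sigma point through the nonlinearity, then recover mean and covariance.
    y = np.array([f(s) for s in sigma])
    y_mean = weights @ y
    diff = y - y_mean
    y_cov = (weights[:, None] * diff).T @ diff
    return y_mean, y_cov

# Example: polar-to-Cartesian conversion, a common nonlinear test case.
mean = np.array([1.0, np.pi / 4])
cov = np.diag([0.01, 0.01])
to_cartesian = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
print(unscented_transform(mean, cov, to_cartesian))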

Journal ArticleDOI
TL;DR: This work presents a stable control strategy for groups of vehicles to move and reconfigure cooperatively in response to a sensed, distributed environment, focusing on gradient-climbing missions in which the mobile sensor network seeks out local maxima or minima in the environmental field.
Abstract: We present a stable control strategy for groups of vehicles to move and reconfigure cooperatively in response to a sensed, distributed environment. Each vehicle in the group serves as a mobile sensor and the vehicle network as a mobile and reconfigurable sensor array. Our control strategy decouples, in part, the cooperative management of the network formation from the network maneuvers. The underlying coordination framework uses virtual bodies and artificial potentials. We focus on gradient climbing missions in which the mobile sensor network seeks out local maxima or minima in the environmental field. The network can adapt its configuration in response to the sensed environment in order to optimize its gradient climb.

1,291 citations
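
A minimal sketch of the gradient-climbing idea in this abstract, under simplifying assumptions: each vehicle samples a scalar field, the group fits a local gradient to those samples by least squares, and the formation centre steps uphill. The field, formation geometry, and step size are hypothetical, and this is not the paper's virtual-body and artificial-potential controller.

import numpy as np

def field(p):
    # Hypothetical smooth environmental field with a maximum at (2, 3).
    return -((p[0] - 2.0) ** 2 + (p[1] - 3.0) ** 2)

def estimate_gradient(positions, values):
    # Fit values ~ a + g . (position - centre) by least squares; return the slope g.
    centre = positions.mean(axis=0)
    A = np.hstack([np.ones((len(positions), 1)), positions - centre])
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coeffs[1:]

# Four vehicles in a fixed square formation around a moving centre.
offsets = np.array([[0.5, 0.0], [-0.5, 0.0], [0.0, 0.5], [0.0, -0.5]])
centre = np.array([-1.0, -1.0])
step = 0.2
for _ in range(50):
    positions = centre + offsets
    values = np.array([field(p) for p in positions])
    centre = centre + step * estimate_gradient(positions, values)   # move uphill
print(centre)   # approaches the maximum at (2, 3)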

Dissertation
01 Jan 2001
TL;DR: This thesis presents an approximation technique that can perform Bayesian inference faster and more accurately than previously possible, and is found to be convincingly better than rival approximation techniques: Monte Carlo, Laplace's method, and variational Bayes.
Abstract: One of the major obstacles to using Bayesian methods for pattern recognition has been their computational expense. This thesis presents an approximation technique that can perform Bayesian inference faster and more accurately than previously possible. This method, "Expectation Propagation," unifies and generalizes two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. The unification shows how both of these algorithms can be viewed as approximating the true posterior distribution with a simpler distribution, which is close in the sense of KL-divergence. Expectation Propagation exploits the best of both algorithms: the generality of assumed-density filtering and the accuracy of loopy belief propagation. Loopy belief propagation, because it propagates exact belief states, is useful only for limited types of belief networks, such as purely discrete networks. Expectation Propagation approximates the belief states with expectations, such as means and variances, giving it much wider scope. Expectation Propagation also extends belief propagation in the opposite direction, propagating richer belief states which incorporate correlations between variables. This framework is demonstrated in a variety of statistical models using synthetic and real-world data. On Gaussian mixture problems, Expectation Propagation is found, for the same amount of computation, to be convincingly better than rival approximation techniques: Monte Carlo, Laplace's method, and variational Bayes. For pattern recognition, Expectation Propagation provides an algorithm for training Bayes Point Machine classifiers that is faster and more accurate than any previously known. The resulting classifiers outperform Support Vector Machines on several standard datasets, in addition to having a comparable training time. Expectation Propagation can also be used to choose an appropriate feature set for classification, via Bayesian model selection.

1,036 citations
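
A minimal sketch of the moment-matching operation at the heart of assumed-density filtering and Expectation Propagation as described above, assuming a one-dimensional Gaussian approximating family: the intractable distribution is replaced by the Gaussian with the same mean and variance, which is the KL-closest Gaussian. The prior and heavy-tailed likelihood below are hypothetical and are not an example from the thesis.

import numpy as np

def match_gaussian_moments(log_unnorm, lo=-20.0, hi=20.0, n=20001):
    # Numerically compute the mean and variance of an unnormalised density on a grid;
    # these are the parameters of the moment-matched Gaussian approximation.
    x = np.linspace(lo, hi, n)
    w = np.exp(log_unnorm(x))
    w /= w.sum()                       # normalise the grid weights
    mean = np.dot(w, x)
    var = np.dot(w, (x - mean) ** 2)
    return mean, var

# Gaussian prior N(0, 4) times a heavy-tailed (Student-t-like) likelihood centred at 3.
log_prior = lambda x: -0.5 * x ** 2 / 4.0
log_lik = lambda x: -2.0 * np.log(1.0 + (x - 3.0) ** 2)
mean, var = match_gaussian_moments(lambda x: log_prior(x) + log_lik(x))
print(mean, var)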

Journal ArticleDOI
TL;DR: This paper surveys numerical integrators that respect Lie-group structure, highlighting theory, algorithmic issues, and a number of applications.
Abstract: Many differential equations of practical interest evolve on Lie groups or on manifolds acted upon by Lie groups. The retention of Lie-group structure under discretization is often vital in the recovery of qualitatively correct geometry and dynamics and in the minimization of numerical error. Having introduced requisite elements of differential geometry, this paper surveys the novel theory of numerical integrators that respect Lie-group structure, highlighting theory, algorithmic issues and a number of applications.

718 citations
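
A minimal sketch of one of the simplest integrators of this kind, the Lie-Euler method on the rotation group SO(3), assuming the equation dR/dt = R * hat(omega(t)); each step multiplies by the exponential of a skew-symmetric matrix, so the numerical solution stays exactly on the group, unlike the classical Euler method. The angular-velocity function omega is an illustrative choice.

import numpy as np

def hat(w):
    # Map a 3-vector to the corresponding skew-symmetric matrix in so(3).
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def expm_so3(w):
    # Rodrigues' formula for the matrix exponential of hat(w).
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    K = hat(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def lie_euler_so3(omega, R0, h, steps):
    R = R0.copy()
    for k in range(steps):
        R = R @ expm_so3(h * omega(k * h))   # exact group operation at every step
    return R

omega = lambda t: np.array([0.1, np.sin(t), np.cos(t)])
R = lie_euler_so3(omega, np.eye(3), h=0.01, steps=1000)
print(np.linalg.norm(R.T @ R - np.eye(3)))   # ~1e-15: R remains orthogonal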

Journal ArticleDOI
TL;DR: An error bound for multidimensional quadrature is derived that includes the Koksma-Hlawka inequality as a special case; the generalized discrepancy appearing in the bound includes the L_p-star discrepancy and the P_alpha figure of merit from lattice rules as special cases.
Abstract: An error bound for multidimensional quadrature is derived that includes the Koksma-Hlawka inequality as a special case. This error bound takes the form of a product of two terms. One term, which depends only on the integrand, is defined as a generalized variation. The other term, which depends only on the quadrature rule, is defined as a generalized discrepancy. The generalized discrepancy is a figure of merit for quadrature rules and includes as special cases the L_p-star discrepancy and the quantity P_alpha that arises in the study of lattice rules.

693 citations
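
For context, the classical Koksma-Hlawka inequality that this bound generalises bounds the quadrature error by a product of a variation term and a discrepancy term; the standard statement, written here in LaTeX, is not quoted from the paper:

\[
\left| \frac{1}{N} \sum_{i=1}^{N} f(\mathbf{x}_i) - \int_{[0,1]^d} f(\mathbf{x}) \, \mathrm{d}\mathbf{x} \right|
\le V_{\mathrm{HK}}(f) \, D_N^{*}(\mathbf{x}_1, \ldots, \mathbf{x}_N)
\]

Here V_HK(f) is the variation of f in the sense of Hardy and Krause and D_N^* is the star discrepancy of the point set; the paper's bound replaces these two factors by a generalized variation and a generalized discrepancy.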