Journal ArticleDOI

On a Global Complexity Bound of the Levenberg-Marquardt Method

TLDR
It is shown that the global complexity bound of the Levenberg-Marquardt method, i.e., an upper bound on the number of iterations required to obtain an approximate solution, is O(ε⁻²).
Abstract
In this paper, we investigate a global complexity bound of the Levenberg-Marquardt method (LMM) for the nonlinear least squares problem. The global complexity bound for an iterative method solving the unconstrained minimization of φ is an upper bound on the number of iterations required to obtain an approximate solution x such that ‖∇φ(x)‖ ≤ ε. We show that the global complexity bound of the LMM is O(ε⁻²).
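To make the iteration and the stopping rule concrete, here is a minimal, illustrative sketch of a Levenberg-Marquardt loop for φ(x) = ½‖r(x)‖² that stops once ‖∇φ(x)‖ = ‖J(x)ᵀr(x)‖ ≤ ε. The simple halving/doubling update of the damping parameter μ is a common heuristic, not the specific rule analyzed in the paper, and all names below are chosen for illustration only.

```python
import numpy as np

def levenberg_marquardt(r, J, x0, eps=1e-6, mu=1.0, max_iter=1000):
    """Minimize phi(x) = 0.5 * ||r(x)||^2 with a basic Levenberg-Marquardt loop.

    r : callable returning the residual vector r(x)
    J : callable returning the Jacobian of r at x
    Stops when ||grad phi(x)|| = ||J(x)^T r(x)|| <= eps.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        rx, Jx = r(x), J(x)
        grad = Jx.T @ rx                      # gradient of phi
        if np.linalg.norm(grad) <= eps:       # eps-approximate stationary point
            return x, k                       # k iterations were used
        # LM step: solve (J^T J + mu I) d = -J^T r
        d = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -grad)
        if 0.5 * np.sum(r(x + d) ** 2) < 0.5 * (rx @ rx):
            x, mu = x + d, max(0.5 * mu, 1e-12)   # accept step, relax damping
        else:
            mu *= 2.0                             # reject step, increase damping
    return x, max_iter

# Example (illustrative): fit y = a * exp(b * t) to synthetic data.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.3 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t), x[0] * t * np.exp(x[1] * t)])
x_star, iters = levenberg_marquardt(r, J, x0=np.array([1.0, 0.0]))
```

The paper's result can be read as a guarantee that, under its assumptions, the number of iterations such a loop needs before the gradient test triggers grows at most like ε⁻².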


Citations
Journal ArticleDOI

Visual Navigation Using Heterogeneous Landmarks and Unsupervised Geometric Constraints

TL;DR: This work extends the local bundle adjustment-based visual simultaneous localization and mapping (SLAM) framework by explicitly exploiting heterogeneous features and their inner geometric relationships in an unsupervised manner, and reduces the translational error on urban sequences where rectilinear structures dominate the scene.
Journal ArticleDOI

Robust Relative Location Estimation in Wireless Sensor Networks with Inexact Position Problems

TL;DR: A robust min-max optimization method is proposed for the relative location estimation problem by minimizing the worst-case estimation error, and the robustness of the proposed approach is validated by simulations under different WSN topologies.
Book ChapterDOI

Distributed Range-Based Relative Localization of Robot Swarms

TL;DR: This paper compares two distributed localization algorithms with different trade-offs between computational complexity and coordination requirements, and relies on a nonlinear least-squares strategy to allow robots to compute the relative pose of nearby robots (see the sketch below).
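As a toy, non-distributed illustration of the nonlinear least-squares strategy mentioned in the TL;DR above (not the paper's algorithm), one can estimate a position from noisy range measurements with SciPy's Levenberg-Marquardt solver; the anchor layout, noise level, and variable names below are assumptions made purely for this sketch.

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed setup: known anchor positions and noisy range measurements
# to one unknown position (a stand-in for a nearby robot).
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])
true_pos = np.array([1.5, 2.0])
rng = np.random.default_rng(0)
ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.05 * rng.standard_normal(4)

def residuals(p):
    # Predicted minus measured ranges for candidate position p.
    return np.linalg.norm(anchors - p, axis=1) - ranges

# method="lm" selects a Levenberg-Marquardt solver for this
# unconstrained nonlinear least-squares problem.
sol = least_squares(residuals, x0=np.array([2.0, 1.5]), method="lm")
print(sol.x)  # estimate close to true_pos
```

A distributed version, as in the chapter, would additionally restrict each robot to ranges and messages exchanged with its neighbors; the sketch above only shows the local least-squares core.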
Journal ArticleDOI

Second-Order Optimality and Beyond: Characterization and Evaluation Complexity in Convexly Constrained Nonlinear Optimization

TL;DR: A corresponding (expensive) measure of criticality for arbitrary order is proposed and extended to define high-order ε-approximate critical points, providing the first evaluation complexity result for critical points of arbitrary order in nonlinear optimization.
Posted Content

Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization

TL;DR: A new general class of inexact second-order algorithms for unconstrained optimization is considered, including regularization and trust-region variants of Newton's method as well as their linesearch variants; the analysis implies that these methods have optimal worst-case evaluation complexity within a wider class of second-order methods, and that Newton's method is suboptimal.
References
Book

Numerical Optimization

TL;DR: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.
Book

Nonlinear Programming

Book

Trust Region Methods

TL;DR: This chapter discusses trust-region methods for general constrained optimization, as well as for systems of nonlinear equations and nonlinear fitting.
Book

Introductory Lectures on Convex Optimization

TL;DR: A polynomial-time interior-point method for linear optimization is proposed, with an iteration complexity bound that is polynomial in the problem dimension and the required accuracy.