
Showing papers on "Line search published in 2009"


Proceedings ArticleDOI
28 Jun 2009
TL;DR: An adaptive line search scheme is proposed that tunes the step size on the fly while guaranteeing the optimal convergence rate; empirical comparisons demonstrate the efficiency of the proposed Lassplore algorithm for large-scale problems.
Abstract: Logistic regression is a well-known classification method that has been used widely in many applications of data mining, machine learning, computer vision, and bioinformatics. Sparse logistic regression embeds feature selection in the classification framework using the l1-norm regularization, and is attractive in many applications involving high-dimensional data. In this paper, we propose Lassplore for solving large-scale sparse logistic regression. Specifically, we formulate the problem as l1-ball constrained smooth convex optimization, and propose to solve the problem using Nesterov's method, an optimal first-order black-box method for smooth convex optimization. One of the critical issues in the use of Nesterov's method is the estimation of the step size at each optimization iteration. Previous approaches either apply a constant step size, which assumes that the Lipschitz constant of the gradient is known in advance, or require a sequence of decreasing step sizes, which leads to slow convergence in practice. In this paper, we propose an adaptive line search scheme that tunes the step size adaptively while guaranteeing the optimal convergence rate. Empirical comparisons with several state-of-the-art algorithms demonstrate the efficiency of the proposed Lassplore algorithm for large-scale problems.
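
To make the adaptive step-size idea concrete, here is a minimal Python sketch of a backtracking Lipschitz estimate inside a projected Nesterov iteration for an l1-ball constraint. This illustrates the general technique only, not the authors' Lassplore code; all function names are ours.

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection onto the l1-ball (standard sort-based rule)."""
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def nesterov_adaptive(f, grad, x0, radius, L0=1.0, iters=100):
    """Nesterov's accelerated projected gradient with a backtracking
    (adaptive) estimate L of the Lipschitz constant of grad f."""
    x = y = x0
    L, t = L0, 1.0
    for _ in range(iters):
        g = grad(y)
        while True:  # adaptive line search: grow L until the quadratic upper bound holds
            x_new = project_l1_ball(y - g / L, radius)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L *= 0.5  # let the estimate shrink again so later steps stay large
    return x
```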

219 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an approach for solving reliability-based optimization problems involving structural systems under stochastic loading, where the associated reliability problems to be solved during the optimization process are high-dimensional (1000 or more random variables).

91 citations


Journal ArticleDOI
TL;DR: A modified conjugate gradient method for solving unconstrained optimization problems, which inherits an important property of the well-known Polak-Ribiere-Polyak (PRP) method: the tendency to turn towards the steepest descent direction if a small step is generated away from the solution, which prevents a sequence of tiny steps from occurring.
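
The restart property mentioned in the TL;DR is visible directly in the PRP formula: when the last step is small, $g_{k+1} \approx g_k$, so $\beta_k \approx 0$ and the direction reverts to steepest descent. A minimal sketch, with the common PRP+ safeguard added (an assumption on our part, not necessarily the paper's exact modification):

```python
import numpy as np

def prp_direction(g_new, g_old, d_old):
    """Polak-Ribiere-Polyak search direction d = -g_new + beta * d_old.

    When the last step is small, g_new ~ g_old, so beta ~ 0 and the
    direction falls back to steepest descent -- the restart property
    that prevents long runs of tiny steps.
    """
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)
    beta = max(beta, 0.0)  # PRP+ truncation, a common descent safeguard
    return -g_new + beta * d_old
```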

81 citations


Journal ArticleDOI
TL;DR: This paper proposes a new search direction combined with the Wolfe line search technique and a nonmonotone line search technique for solving unconstrained optimization problems, and shows that the given methods possess the sufficient descent property without carrying out any line search rule.
Abstract: It is well known that the search direction plays a central role in line search methods. In this paper, we propose a new search direction together with the Wolfe line search technique and a nonmonotone line search technique for solving unconstrained optimization problems. The given methods possess the sufficient descent property without carrying out any line search rule. Convergence results are established under suitable conditions. Numerical analysis of a probability problem shows that the new methods are more effective, robust, and stable than other similar methods, and numerical results on two statistical problems also show that the presented methods compare favorably with other standard methods.
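
As a concrete reference point for the nonmonotone technique mentioned above, here is a sketch of a Grippo-Lampariello-Lucidi style nonmonotone Armijo backtracking rule; the paper's own rule may differ in the reference value it uses, and the names here are illustrative.

```python
def nonmonotone_armijo(f, x, d, g, f_history, c1=1e-4, shrink=0.5, max_iter=30):
    """Backtracking that accepts a step when f decreases relative to the
    maximum of the last few function values (Grippo-Lampariello-Lucidi),
    rather than the current value as in the monotone Armijo rule.
    x, d, g are numpy arrays; d must be a descent direction (g @ d < 0)."""
    f_ref = max(f_history)          # reference value over a sliding window
    slope = g @ d                   # directional derivative at x along d
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + c1 * alpha * slope:
            return alpha
        alpha *= shrink
    return alpha
```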

62 citations


Journal ArticleDOI
TL;DR: This paper proposes an innovative curvilinear search method for minimizing $p$-harmonic energies over spheres and shows that the method is globally convergent if the step length satisfies the Armijo-Wolfe conditions.
Abstract: The problem of finding $p$-harmonic flows arises in a wide range of applications including color image (chromaticity) denoising, micromagnetics, liquid crystal theory, and directional diffusion. In this paper, we propose an innovative curvilinear search method for minimizing $p$-harmonic energies over spheres. Starting from a flow (map) on the unit sphere, our method searches along a curve that lies on the sphere in a manner similar to that of a standard inexact line search descent method. We show that our method is globally convergent if the step length satisfies the Armijo-Wolfe conditions. Computational tests are presented to demonstrate the efficiency of the proposed method and a variant of it that uses Barzilai-Borwein steps.
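
A hedged sketch of the basic mechanism: searching along a great circle of the unit sphere with an Armijo acceptance test. The paper works with sphere-valued flows and the full Armijo-Wolfe conditions; this single-point version only illustrates how a curvilinear search stays on the constraint set.

```python
import numpy as np

def sphere_curvilinear_search(f, grad, x, c1=1e-4, shrink=0.5, max_iter=30):
    """One Armijo backtracking step along a great circle of the unit sphere,
    starting at x (with ||x|| = 1) in the direction of the projected
    negative gradient. Illustrative only."""
    g = grad(x)
    d = -(g - (g @ x) * x)           # project -grad onto the tangent space at x
    nd = np.linalg.norm(d)
    if nd < 1e-12:
        return x                      # stationary: tangent gradient vanishes
    slope = -nd**2                    # derivative of f along the curve at t = 0
    t = 1.0
    for _ in range(max_iter):
        x_new = np.cos(t * nd) * x + np.sin(t * nd) * (d / nd)  # stays on the sphere
        if f(x_new) <= f(x) + c1 * t * slope:
            return x_new
        t *= shrink
    return x_new
```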

56 citations


Posted Content
TL;DR: This work shows that the consistent matrix completion problem can be solved by searching for a column space that matches the observations, and designs mechanisms to detect barriers and transfer the estimated column space from one side of a barrier to the other.
Abstract: A new algorithm, termed subspace evolution and transfer (SET), is proposed for solving the consistent matrix completion problem. In this setting, one is given a subset of the entries of a low-rank matrix and asked to find a low-rank matrix consistent with the given observations. We show that this problem can be solved by searching for a column space that matches the observations. The corresponding algorithm consists of two parts -- subspace evolution and subspace transfer. In the evolution part, we use a line search procedure to refine the column space. However, line search is not guaranteed to converge, as there may exist barriers along the search path that prevent the algorithm from reaching a global optimum. To address this problem, in the transfer part, we design mechanisms to detect barriers and transfer the estimated column space from one side of a barrier to the other. The SET algorithm exhibits excellent empirical performance for very low-rank matrices.

54 citations


Journal ArticleDOI
TL;DR: An affine-scaling algorithm for box-constrained optimization is developed in which each iterate is a scaled cyclic Barzilai–Borwein (CBB) gradient iterate that lies in the interior of the feasible set.
Abstract: We develop an affine-scaling algorithm for box-constrained optimization which has the property that each iterate is a scaled cyclic Barzilai–Borwein (CBB) gradient iterate that lies in the interior of the feasible set. Global convergence is established for a nonmonotone line search, while there is local R-linear convergence at a nondegenerate local minimizer where the second-order sufficient optimality conditions are satisfied. Numerical experiments show that the convergence speed is insensitive to problem conditioning. The algorithm is particularly well suited for image restoration problems which arise in positron emission tomography where the cost function can be infinite on the boundary of the feasible set.
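
The cyclic BB idea itself is simple: compute a Barzilai-Borwein step length and reuse it for several iterations before refreshing it. A minimal unconstrained sketch follows; the paper's affine scaling and interior-of-the-box safeguards are omitted, and the names are ours.

```python
import numpy as np

def cbb_gradient_descent(grad, x, cycle=4, iters=100, alpha0=1e-3):
    """Cyclic Barzilai-Borwein gradient method: the BB step length
    alpha = (s's)/(s'y) is recomputed only once every `cycle` iterations
    and reused in between."""
    alpha = alpha0
    g = grad(x)
    for k in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if k % cycle == cycle - 1:         # refresh the BB step once per cycle
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 1e-12:
                alpha = (s @ s) / sy       # BB1 step length
        x, g = x_new, g_new
    return x
```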

48 citations


Journal ArticleDOI
TL;DR: Preliminary numerical experiments show that the line search multigrid approach is promising, and global convergence is proved under fairly minimal requirements on the minimization method used at all grid levels.
Abstract: We present a line search multigrid method for solving discretized versions of general unconstrained infinite-dimensional optimization problems. At each iteration on each level, the algorithm computes either a “direct search” direction on the current level or a “recursive search” direction from coarser level models. By introducing a new condition that must be satisfied by a backtracking line search procedure, the “recursive search” direction is guaranteed to be a descent direction. Global convergence is proved under fairly minimal requirements on the minimization method used at all grid levels. Using a limited-memory BFGS quasi-Newton method to produce the “direct search” direction, we obtain preliminary numerical results showing that our line search multigrid approach is promising.

46 citations


Proceedings ArticleDOI
19 Oct 2009
TL;DR: It is shown that well liquid rates are the best control variables for maximizing the net present value using gradient-based optimization, and the optimal solution is applied to ten more history-matched realizations to check its robustness.
Abstract: Finding the best strategy for production optimization is currently an important research task for closed-loop reservoir management. Closed-loop reservoir management consists of two main tasks: history matching and production optimization. A comparative closed-loop reservoir management exercise was performed in connection with the SPE Applied Technology Workshop “Closed-loop reservoir management” held in Bruges in June 2008. The model used in this exercise was a synthetic reservoir with geological features typical of North Sea fields, considerably larger than those used in most previous studies. In a previous work (Lorentzen et al., 2009), a set of history-matched models was obtained using the ensemble Kalman filter. We use these models to investigate the effect of formulation and initial guess on gradient-based optimization methods. Within production optimization, most work has focused on optimizing reservoir performance under waterflooding, and we review the waterflooding optimization studies to date. The mathematical theory of the optimization problem as well as the practical issues regarding reactive and proactive approaches are discussed. The formulation of a waterflooding optimization problem is investigated using three different sets of optimization variables: bottomhole pressures, oil production rates, and liquid production rates. Results show that proper formulation improves the performance of gradient-based methods considerably. It is then verified that manual optimization of the initial guess based on reservoir concepts enhances both the result and the efficiency of the gradient-based optimization; the manual optimization steers the gradient-based methods away from a number of local optima and also decreases the simulation costs substantially. Two line search methods, steepest descent and conjugate gradient, are used and compared in the adjoint-based optimization approach. The conjugate gradient method is slightly faster than the steepest descent method; however, the selection of a proper initial guess is far more important for performance. Finally, the optimal solution is applied to ten more history-matched realizations to check its robustness. It is shown that well liquid rates are the best control variables for maximizing the net present value using gradient-based optimization, whereas previous works had suggested well bottomhole pressures.

42 citations


Journal ArticleDOI
TL;DR: A new primal–dual merit function is proposed by combining the barrier penalty function and the potential function within the framework of the line search strategy, and the global convergence property of the method is shown.
Abstract: In this paper, we are concerned with nonlinear minimization problems with second-order cone constraints. A primal-dual interior point method that uses a commutative class of search directions is considered. We propose a new primal-dual merit function by combining the barrier penalty function and the potential function within the framework of the line search strategy, and show the global convergence property of our method.

40 citations


Journal ArticleDOI
TL;DR: It is shown that the nonmonotone algorithm is globally convergent under the assumption that the solution set of the problem concerned is nonempty, an assumption weaker than those made in most existing algorithms for solving optimization problems over symmetric cones.
Abstract: In this paper, we propose a smoothing algorithm for solving the monotone symmetric cone complementarity problems (SCCP for short) with a nonmonotone line search. We show that the nonmonotone algorithm is globally convergent under an assumption that the solution set of the problem concerned is nonempty. Such an assumption is weaker than those given in most existing algorithms for solving optimization problems over symmetric cones. We also prove that the solution obtained by the algorithm is a maximally complementary solution to the monotone SCCP under some assumptions.

Journal ArticleDOI
TL;DR: In this article, two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed, which are shown to be promising and competitive with the well-known PRP method.
Abstract: Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions always descend. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.

Journal ArticleDOI
TL;DR: A new class of conjugate gradient (CG) methods is proposed, called self-scaled CG methods, which are derived from the principles of the Hestenes-Stiefel, Fletcher-Reeves, Polak-Ribiere, and Perry methods, and can perform better at much lower computational cost and with a higher success rate.

Journal ArticleDOI
TL;DR: In this paper, a new smoothing function for second-order cone programming is given by smoothing the symmetric perturbed Fischer-Burmeister function; based on this new function, a one-step smoothing Newton method is presented for solving the second-order cone programming problem.

Proceedings Article
01 Jan 2009
TL;DR: This is the first paper to establish a convergence rate for these algorithms without introducing unnecessarily restrictive assumptions, and the analysis is general enough to cover the algorithms used in software packages like SVMTorch and (first- or second-order) LibSVM.
Abstract: We consider (a subclass of) convex quadratic optimization problems and analyze decomposition algorithms that perform, at least approximately, steepest-descent exact line search. We show that these algorithms, when implemented properly, are within $\epsilon$ of optimality after $O(\log 1/\epsilon)$ iterations for strictly convex cost functions, and after $O(1/\epsilon)$ iterations in the general case. Our analysis is general enough to cover the algorithms that are used in software packages like SVMTorch and (first- or second-order) LibSVM. To the best of our knowledge, this is the first paper to establish a convergence rate for these algorithms without introducing unnecessarily restrictive assumptions.
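
For a convex quadratic $f(x) = \frac{1}{2}x^\top Qx - b^\top x$, steepest-descent exact line search over a working set has a closed-form step, which is what such decomposition algorithms exploit. A minimal sketch under those assumptions; the variable names are ours.

```python
import numpy as np

def exact_line_search_step(Q, b, x, working_set):
    """One decomposition step for f(x) = 0.5 x'Qx - b'x: descend along the
    negative gradient restricted to the working set, with the exact
    minimizing step computed in closed form."""
    g = Q @ x - b                       # full gradient
    d = np.zeros_like(x)
    d[working_set] = -g[working_set]    # direction restricted to the working set
    curvature = d @ (Q @ d)
    if curvature <= 0:                  # cannot happen for strictly convex Q
        return x
    alpha = -(g @ d) / curvature        # exact minimizer of f(x + alpha * d)
    return x + alpha * d
```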

Posted Content
TL;DR: The convergence rate is analyzed for the sparse reconstruction by separable approximation (SpaRSA) algorithm for minimizing a sum $f+\psi$, where $f$ is smooth and $\psi$ is convex but possibly nonsmooth; the error in the objective function at iteration $k$ is bounded by $a/(b+k)$.
Abstract: The convergence rate is analyzed for the SpaRSA algorithm (Sparse Reconstruction by Separable Approximation) for minimizing a sum $f(x) + \psi(x)$, where $f$ is smooth and $\psi$ is convex, but possibly nonsmooth. It is shown that if $f$ is convex, then the error in the objective function at iteration $k$, for $k$ sufficiently large, is bounded by $a/(b+k)$ for suitable choices of $a$ and $b$. Moreover, if the objective function is strongly convex, then the convergence is $R$-linear. An improved version of the algorithm based on a cyclic version of the BB iteration and an adaptive line search is given. The performance of the algorithm is investigated using applications in the areas of signal processing and image reconstruction.
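
For the common special case $\psi(x) = \lambda\|x\|_1$, each SpaRSA subproblem is solved in closed form by soft-thresholding. Below is a sketch of one iteration with a BB-style curvature estimate; the acceptance line search analyzed in the paper is omitted, and the names are ours.

```python
import numpy as np

def soft_threshold(v, t):
    """Closed-form proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparsa_step(grad_f, x, x_prev, g_prev, lam):
    """One SpaRSA-style iteration for f(x) + lam * ||x||_1: a Barzilai-Borwein
    estimate of the curvature sets the prox step, then soft-thresholding
    solves the separable subproblem. Returns the new point and the gradient
    at x (to be passed back in as g_prev on the next call)."""
    g = grad_f(x)
    s, y = x - x_prev, g - g_prev
    mu = (s @ y) / (s @ s) if s @ s > 0 else 1.0   # BB curvature estimate
    mu = max(mu, 1e-8)                              # keep the step well defined
    return soft_threshold(x - g / mu, lam / mu), g
```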

Journal ArticleDOI
TL;DR: In this paper, a derivative-free nonmonotone line search for solving large-scale nonlinear systems of equations is proposed, and the spectral residual method with this line search is shown to be globally convergent.
Abstract: In this paper we propose a derivative-free nonmonotone line search for solving large-scale nonlinear systems of equations. Under appropriate conditions, we show that the spectral residual method with this line search is globally convergent. We also present some numerical experiments. The results show that the spectral residual method with the new nonmonotone line search is promising.

Journal ArticleDOI
TL;DR: A detailed convergence analysis reveals that global convergence properties of line-search and trust-region methods still hold when the methods are accelerated, and sheds new light on the behavior of several existing algorithms.
Abstract: In numerical optimization, line-search and trust-region methods are two important classes of descent schemes, with well-understood global convergence properties. We say that these methods are “accelerated” when the conventional iterate is replaced by any point that produces at least as much of a decrease in the cost function as a fixed fraction of the decrease produced by the conventional iterate. A detailed convergence analysis reveals that global convergence properties of line-search and trust-region methods still hold when the methods are accelerated. The analysis is performed in the general context of optimization on manifolds, of which optimization in $\mathbb{R}^n$ is a particular case. This general convergence analysis sheds new light on the behavior of several existing algorithms.

Journal ArticleDOI
TL;DR: An adaptive simulated annealing algorithm using a fuzzy logic controller (FLC) is proposed; the FLC controls the temperature and the local search repetition of simulated annealing, thereby making the search process more efficient.
Abstract: The simulated annealing method has been successfully applied to various combinatorial optimization problems. In conventional simulated annealing, the temperature and the local search repetition are determined by simple schedules with a higher transition probability at the beginning of the search and a lower probability toward the end, but these simple schemes can make the search process inefficient. To overcome this defect, this paper provides an adaptive simulated annealing algorithm using a fuzzy logic controller (FLC). The FLC controls the temperature and the local search repetition of simulated annealing, thereby making the search process more efficient. The performance of the proposed method is evaluated and compares favorably with conventional simulated annealing on the traveling salesman problem and the equal piles problem.
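
The paper's fuzzy rule base is not reproduced here; as a rough illustration of the control loop only, the sketch below adapts the cooling rate and the repetition count from the observed acceptance ratio using simple threshold rules as a stand-in for the FLC. All names and thresholds are illustrative.

```python
import math
import random

def adaptive_sa(cost, neighbor, x0, temp=1.0, iters=200):
    """Simulated annealing whose cooling rate and inner-loop repetitions are
    adapted from the observed acceptance ratio -- a crude threshold-rule
    stand-in for the paper's fuzzy logic controller."""
    x, fx = x0, cost(x0)
    reps = 10
    for _ in range(iters):
        accepted = 0
        for _ in range(reps):
            y = neighbor(x)
            fy = cost(y)
            if fy < fx or random.random() < math.exp(-(fy - fx) / temp):
                x, fx, accepted = y, fy, accepted + 1
        ratio = accepted / reps
        # controller: high acceptance -> cool faster, fewer repeats;
        # low acceptance -> cool slower, search each temperature longer
        if ratio > 0.6:
            temp *= 0.80
            reps = max(5, reps - 2)
        elif ratio < 0.2:
            temp *= 0.98
            reps = min(50, reps + 2)
        else:
            temp *= 0.90
    return x, fx
```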

Journal ArticleDOI
TL;DR: Numerical results illustrate the efficiency of the Chambolle gradient projection method and indicate that such a nonmonotone method is more suitable to solve some large-scale inverse problems.
Abstract: The main aim of this paper is to accelerate the Chambolle gradient projection method for total variation image restoration. In the proposed method, we use the well-known Barzilai-Borwein step size instead of the constant step size in Chambolle's method. Further, we adopt the adaptive nonmonotone line search scheme proposed by Dai and Fletcher to guarantee the global convergence of the proposed method. Numerical results illustrate the efficiency of this method and indicate that such a nonmonotone method is more suitable for solving some large-scale inverse problems.

Journal ArticleDOI
TL;DR: In this article, an efficient computational fluid dynamics (CFD)-based optimization method capable of finding optimal engine operating conditions with respect to emissions and fuel consumption is developed.

Journal ArticleDOI
TL;DR: A smoothing Newton algorithm for solving the nonlinear complementarity problem with a new nonmonotone line search is proposed based on the smoothed Kanzow–Kleinmichel NCP function and shown to be globally and locally superlinearly convergent.
Abstract: The smoothing-type algorithm has been a powerful tool for solving various optimization problems. In order to improve the numerical results of the algorithm, the nonmonotone line search technique has been used when the algorithm is implemented. However, the theoretical analysis is based on the algorithm with some monotone line search. In this paper, based on the smoothed Kanzow-Kleinmichel NCP function, we propose a smoothing Newton algorithm for solving the nonlinear complementarity problem with a new nonmonotone line search. We show that the nonmonotone algorithm is globally and locally superlinearly convergent under suitable assumptions. The preliminary numerical results are also reported.
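
The Kanzow-Kleinmichel family generalizes the Fischer-Burmeister NCP function, so as a concrete illustration of the smoothing idea (not the paper's exact function), here is the standard smoothed Fischer-Burmeister function and the residual map that a smoothing Newton method drives to zero while reducing the smoothing parameter.

```python
import numpy as np

def smoothed_fb(a, b, mu):
    """Smoothed Fischer-Burmeister function
    phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2 mu^2), applied componentwise.
    As mu -> 0 it recovers phi(a, b) = 0 iff a >= 0, b >= 0, a * b = 0,
    i.e. the complementarity conditions."""
    return a + b - np.sqrt(a * a + b * b + 2.0 * mu * mu)

def ncp_residual(F, x, mu):
    """Smoothing reformulation of the NCP 'x >= 0, F(x) >= 0, x'F(x) = 0':
    solutions of the NCP are roots of this residual in the limit mu -> 0,
    so a (smoothing) Newton method can be applied to it directly."""
    return smoothed_fb(x, F(x), mu)
```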

Journal ArticleDOI
TL;DR: This paper considers an equivalent optimization reformulation of GNEP using a regularized Nikaido–Isoda function and proposes a derivative-free descent type method with inexact line search to solve the equivalent optimization problem and proves that the algorithm is globally convergent.
Abstract: This paper deals with the generalized Nash equilibrium problem (GNEP), i.e. a noncooperative game in which the strategy set of each player, as well as his payoff function, depends on the strategies of all players. We consider an equivalent optimization reformulation of GNEP using a regularized Nikaido–Isoda function so that solutions of GNEP coincide with global minima of the optimization problem. We then propose a derivative-free descent type method with inexact line search to solve the equivalent optimization problem and we prove that our algorithm is globally convergent. The convergence analysis is not based on conditions guaranteeing that every stationary point of the optimization problem is a solution of GNEP. Finally, we present the performance of our algorithm on some examples.

Journal ArticleDOI
TL;DR: A new smoothing function of the well-known Fischer-Burmeister function is given and a smoothing Newton-type method is proposed for solving second-order cone programming.

Journal ArticleDOI
TL;DR: Under suitable conditions, global convergence results are established for the Polak-Ribiere-Polyak (PRP) conjugate gradient method with a modified Wolfe line search in place of the standard Wolfe conditions.

Proceedings ArticleDOI
Bing Zhao, Shengyuan Chen
31 May 2009
TL;DR: A variation of the simplex-downhill algorithm is proposed, specifically customized for optimizing parameters in a statistical machine translation (SMT) decoder for better end-user automatic evaluation metric scores, such as versions of BLEU, TER, and mixtures of them.
Abstract: We propose a variation of the simplex-downhill algorithm specifically customized for optimizing parameters in a statistical machine translation (SMT) decoder for better end-user automatic evaluation metric scores, such as versions of BLEU, TER, and mixtures of them. The traditional simplex-downhill method has the advantage of derivative-free computation of objective functions, yet still gives satisfactory search directions in most scenarios. This suits the optimization of translation metrics, which are not differentiable in nature. On the other hand, the Armijo algorithm usually performs line search efficiently given a search direction. A subtle but important fact is that an efficient line search method changes the simplex iterations, and hence the search trajectories. We propose to embed the Armijo inexact line search within the simplex-downhill algorithm. We show in our experiments that the proposed algorithm improves over the widely applied Minimum Error Rate Training algorithm for optimizing machine translation parameters.
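
One way to read the embedding: after the simplex proposes a reflection direction through the centroid, backtrack along it with an Armijo-flavoured sufficient-decrease test. The sketch below follows that reading only; all names are illustrative, and since BLEU/TER are not differentiable, the simplex's own spread stands in for the usual gradient slope term.

```python
import numpy as np

def armijo_reflection_step(f, simplex, fvals, c=1e-4, shrink=0.5, max_iter=8):
    """Backtracking search along the simplex reflection direction.

    `simplex` is a list of parameter vectors, `fvals` their objective
    values (lower is better). An Armijo-flavoured acceptance test uses
    the simplex's spread in place of a gradient slope term."""
    order = np.argsort(fvals)
    simplex = [simplex[i] for i in order]
    fvals = [fvals[i] for i in order]
    centroid = np.mean(simplex[:-1], axis=0)   # centroid of all but the worst vertex
    d = centroid - simplex[-1]                 # reflection direction through the centroid
    spread = fvals[-1] - fvals[0]              # stand-in for a slope term
    alpha = 2.0                                # alpha = 2 is the classic reflection point
    for _ in range(max_iter):
        y = simplex[-1] + alpha * d
        if f(y) <= fvals[-1] - c * alpha * spread:
            return y                           # accept: candidate replaces the worst vertex
        alpha *= shrink
    return simplex[-1]                         # no acceptable step found
```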

Journal ArticleDOI
TL;DR: A new self-adaptive trust region method with a line search technique for solving unconstrained optimization problems by using a simple subproblem model, which requires less memory and lower computational complexity.

Journal ArticleDOI
TL;DR: A new filter line search SQP method in which the violations of equality and inequality constraints are considered separately; the filter in this algorithm is composed of three components: the objective function value, the equality constraint violation, and the inequality constraint violation.

Journal ArticleDOI
TL;DR: A new one-step smoothing Newton method is proposed for solving the nonlinear complementarity problem with a $P_0$-function, based on a new smoothing NCP-function; it is shown that any accumulation point of the iteration sequence generated by the algorithm is a solution of the $P_0$-NCP.

Journal ArticleDOI
TL;DR: In this article, a quasi-Newton method for the solution of systems of non-linear equations, based on the nested application of adjoint Broyden updates, is introduced; however, the convergence of the iteration under the same requirements on $F$ as Newton's method is not considered.