Book

Linear complementarity, linear and nonlinear programming

01 Jan 1988
About: This book was published on 1988-01-01 and is currently open access. It has received 1012 citations to date. It focuses on the topics: Mixed complementarity problem & Complementarity theory.
Citations
Journal ArticleDOI
TL;DR: In this article, a theory of least-change secant update methods for this class of processes is developed and sufficient conditions are established for local convergence and for convergence at an ideal linear or superlinear rate.
Abstract: This article studies iterative methods defined by $x_{k+1} = \Phi(x_k, E_k)$, where $x_k \in \mathbb{R}^n$ and $E_k$ lies in a space of parameters. Sufficient conditions are established for local convergence and for convergence at an ideal linear or superlinear rate. A theory of least-change secant update methods for this class of processes is developed. Several examples are given showing a wide range of applications of the new theory.
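To make the class of methods concrete, here is a minimal sketch of one least-change secant update, Broyden's update, applied to a small nonlinear system. The test function, starting point, and tolerances are illustrative and not taken from the article.

```python
import numpy as np

def broyden(F, x0, B0, tol=1e-10, max_iter=50):
    """Quasi-Newton iteration x_{k+1} = x_k - B_k^{-1} F(x_k) with
    Broyden's least-change secant update of the Jacobian approximation B_k."""
    x, B = np.asarray(x0, float), np.asarray(B0, float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k
        s = np.linalg.solve(B, -Fx)                 # step: B_k s = -F(x_k)
        x_new = x + s
        y = F(x_new) - Fx                           # secant condition: B_{k+1} s = y
        B = B + np.outer(y - B @ s, s) / (s @ s)    # least-change (Frobenius-norm) update
        x = x_new
    return x, max_iter

# Illustrative example (not from the cited article): roots of F at (1, 1) and (-1, -1).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 2.0, v[0] - v[1]])
x0 = np.array([1.2, 1.0])
B0 = np.array([[2.4, 2.0],            # exact Jacobian of F at x0, used as the initial B
               [1.0, -1.0]])
root, iters = broyden(F, x0, B0)
print(root, iters)                    # converges to approximately [1, 1]
```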

13 citations

Journal ArticleDOI
TL;DR: A two-stage method constructs a firm grip that can tolerate small slips of the fingertips, yielding a robust, reliable, and stable controller for rigid bodies handled by a robot gripper.
Abstract: This paper presents a two-stage method for constructing a firm grip that can tolerate small slips of the fingertips. The fingers are assumed to be of frictionless contact type. The first stage formulates the interaction in the gripper–object system as a linear complementarity problem (LCP), which is then solved using a special neural network to find minimal finger forces. The second stage uses the results of the first stage as a static mapping in training another neural network. The second network's training emulates slips by adding random noise in the form of changes in the positions of the contact points relative to the reference coordinate system; this noisy training increases robustness against unexpected changes in finger positions. Genetic algorithms were used as global optimization techniques in training the second neural network. The resulting neural network is a robust, reliable, and stable controller for rigid bodies that can be handled by a robot gripper. © 2001 John Wiley & Sons, Inc.
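The paper solves its LCP with a special neural network; purely as a generic illustration of the first-stage formulation, the sketch below solves a small LCP (M, q) with a standard projected Gauss-Seidel iteration. The matrix and vector are illustrative, not the gripper–object model from the paper.

```python
import numpy as np

def lcp_projected_gauss_seidel(M, q, iters=200):
    """Approximately solve the LCP: find z >= 0 with w = M z + q >= 0 and z^T w = 0.
    Reliable when M is symmetric positive definite; illustrative only."""
    n = len(q)
    z = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            r = q[i] + M[i] @ z - M[i, i] * z[i]   # residual excluding the diagonal term
            z[i] = max(0.0, -r / M[i, i])          # projection enforces z_i >= 0
    w = M @ z + q
    return z, w

# Illustrative 3x3 instance (not the gripper model):
M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
q = np.array([-1.0, 2.0, -3.0])
z, w = lcp_projected_gauss_seidel(M, q)
print(z, w, z @ w)   # z >= 0, w >= 0, and the complementarity product z.w is near 0
```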

13 citations

Journal ArticleDOI
TL;DR: This paper presents a framework, based on the Karush-Kuhn-Tucker conditions, for inverse learning of objective functions in constrained optimal control problems; it discusses theoretical properties of the learning methods and presents simulation results that highlight the advantages of the maximum likelihood formulation for learning objective functions.
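As a minimal illustration of the KKT-based idea only (not the paper's optimal-control setting or its maximum likelihood formulation), the sketch below recovers the weights of a toy parametric objective from a single observed optimizer by solving the KKT stationarity conditions in least squares. All data, weights, and names are illustrative.

```python
import numpy as np

# Observed optimizer of  min  theta1*x1^2 + theta2*x2^2  s.t.  x1 + x2 >= 1
# (illustrative toy problem; the point below was generated with theta = (1, 3)).
x_star = np.array([0.75, 0.25])

# KKT stationarity with the constraint active (multiplier lam >= 0):
#   2*theta1*x1 - lam = 0
#   2*theta2*x2 - lam = 0
# Objective weights are only identifiable up to scale, so fix theta1 = 1
# and solve the remaining linear system for (theta2, lam) in least squares.
A = np.array([[0.0,          -1.0],     # unknowns: [theta2, lam]
              [2 * x_star[1], -1.0]])
b = np.array([-2 * 1.0 * x_star[0],     # theta1 term moved to the right-hand side
              0.0])
(theta2, lam), *_ = np.linalg.lstsq(A, b, rcond=None)
print(theta2, lam)   # recovers theta2 = 3, lam = 1.5
```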

13 citations


Additional excerpts

  • ...Algorithm 1 summarizes the procedure to solve (7), which is based on systematically enumerating candidate solutions and is conceptually similar to active-set methods, cf., Murty and Yu (1988)....


Journal ArticleDOI
TL;DR: This work solves an open problem of Morris by showing that Murty's least-index pivot rule leads to a quadratic number of iterations on Morris's highly cyclic P-LCP examples, and shows that on K-matrix LCP instances, all pivot rules require only a linear number of iterations.
Abstract: We study the behavior of simple principal pivoting methods for the P-matrix linear complementarity problem (P-LCP). We solve an open problem of Morris by showing that Murty’s least-index pivot rule (under any fixed index order) leads to a quadratic number of iterations on Morris’s highly cyclic P-LCP examples. We then show that on K-matrix LCP instances, all pivot rules require only a linear number of iterations. As the main tool, we employ unique-sink orientations of cubes, a useful combinatorial abstraction of the P-LCP.
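Below is a minimal sketch of a simple principal pivoting method with Murty's least-index rule, run on an illustrative 2x2 P-matrix instance. It does not reproduce the paper's iteration-count analysis or Morris's cyclic examples; the instance and tolerances are assumptions for the demo.

```python
import numpy as np

def murty_least_index(M, q, max_pivots=1000):
    """Least-index principal pivoting for the LCP (q, M):
    find z >= 0, w = M z + q >= 0, z^T w = 0.  Terminates for P-matrices;
    max_pivots is only a safeguard for this illustrative code."""
    n = len(q)
    basic_z = np.zeros(n, dtype=bool)          # basic_z[i]: z_i is basic and w_i = 0
    for _ in range(max_pivots):
        z = np.zeros(n)
        a = np.where(basic_z)[0]
        if a.size:                             # solve 0 = M[a,a] z_a + q_a for the basic z's
            z[a] = np.linalg.solve(M[np.ix_(a, a)], -q[a])
        w = M @ z + q
        values = np.where(basic_z, z, w)       # current value of each basic variable
        neg = np.where(values < -1e-12)[0]
        if neg.size == 0:
            return z, w
        i = neg[0]                             # least-index rule: flip the smallest bad index
        basic_z[i] = not basic_z[i]
    raise RuntimeError("pivot limit reached")

# Illustrative P-matrix instance (not one of Morris's examples):
M = np.array([[2.0, 1.0],
              [0.0, 2.0]])
q = np.array([-1.0, -1.0])
z, w = murty_least_index(M, q)
print(z, w)   # z = [0.25, 0.5], w = [0, 0]
```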

13 citations


Cites background from "Linear complementarity, linear and ..."

  • ...For some P-LCPs, however, this algorithm cycles and never terminates [27]....


01 Jan 2008
TL;DR: This work considers the relation between path-tracing for SVMs and generalized mean-variance portfolio optimization from a PQP view, and shows the power of efficiently generating all models for one-class support vector classification and the typical support vector machine.
Abstract: The performance of SVM models often depends on the proper choice of their regularization parameter(s). Some recent research has focused on efficiently building all SVM models against the regularization parameter, thus defining the task of tracing the regularized piecewise-linear solution path for SVMs. It is widely known from an optimization view that many SVM types can be formulated as quadratic programs, but it has seldom been pointed out that path-tracing for SVMs can be attacked by parametric quadratic programming (PQP). From a PQP view, we consider the relation between path-tracing for SVMs and generalized mean-variance portfolio optimization. This interdisciplinary relation allows us to handle the path-tracing task by tailoring the critical line algorithm (CLA), originally developed for mean-variance portfolio optimization. Although the CLA shares much in common with existing path-tracing approaches, it utilizes the equality and bounding constraints in the PQP formulation more systematically, and suggests a robust one-per-iteration approach based on Karush-Kuhn-Tucker conditions. Moreover, under the unified treatment powered by PQP and CLA, we can further derive ways to overcome pathological cases caused by empty or singular linear systems. In experiments we show the power of efficiently generating all models for one-class support vector classification and the typical support vector machine.
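The cited work tailors the critical line algorithm; the sketch below only illustrates the piecewise-linear-in-C structure that such path-tracing exploits, on a two-point toy SVM whose dual can be solved by hand. It is not the CLA itself, and the data are illustrative.

```python
import numpy as np

# Two training points with labels +1 and -1 (illustrative toy data).
x1, x2 = np.array([1.0, 0.0]), np.array([-1.0, 0.0])
d2 = float(np.dot(x1 - x2, x1 - x2))           # squared distance between the points (= 4)

def svm_dual_path(C):
    """Dual solution alpha(C) of the 2-point soft-margin SVM.
    The equality constraint forces alpha_1 = alpha_2 = alpha, so the dual
    reduces to max_{0 <= alpha <= C} 2*alpha - 0.5*alpha**2*d2, and the
    solution path is piecewise linear in C with one breakpoint at C = 2/d2."""
    return min(C, 2.0 / d2)

breakpoint_C = 2.0 / d2                        # active set changes here (= 0.5)
for C in [0.1, 0.3, breakpoint_C, 1.0, 2.0]:
    print(C, svm_dual_path(C))                 # alpha = C below the breakpoint, then constant
```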

13 citations