scispace - formally typeset
Author

Vaithilingam Jeyakumar

Bio: Vaithilingam Jeyakumar is an academic researcher from the University of New South Wales. The author has contributed to research in the topics of Convex analysis and Convex optimization. The author has an h-index of 39, has co-authored 176 publications, and has received 4,671 citations. Previous affiliations of Vaithilingam Jeyakumar include the University of Kentucky and the University of Mannheim.


Papers
Journal ArticleDOI
TL;DR: In this paper, a relaxation of the sufficient optimality conditions and duality results for the generalised convex programming problem is proposed. The relaxation allows certain nonlinear multi-objective fractional programming problems to be treated as special cases.
Abstract: Sufficient optimality conditions and duality results have recently been given for the following generalised convex programming problem: minimise f(x) subject to g(x) ≤ 0, for some η: X₀ × X₀ → Rⁿ. It is shown here that a relaxation defining the above generalised convexity leads to a new class of multi-objective problems which preserves the sufficient optimality and duality results of the scalar case, and avoids the major difficulty of verifying that the inequality holds for the same function η(·,·). Further, this relaxation allows one to treat certain nonlinear multi-objective fractional programming problems and some other classes of nonlinear (composite) problems as special cases. 1. Introduction. Consider the constrained multi-objective optimisation problem (VP): V-minimise (f₁(x), …, f_p(x)) subject to g(x) ∈ R and g: X₀ → R
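For context, the invexity notion being relaxed here is the standard one (a sketch in conventional notation, not quoted from the paper): a differentiable function f is invex with respect to a kernel η when

```latex
% f : X_0 \to \mathbb{R} differentiable, \eta : X_0 \times X_0 \to \mathbb{R}^n
f(x) - f(u) \;\ge\; \nabla f(u)^{\top}\, \eta(x, u)
\qquad \text{for all } x, u \in X_0.
% Taking \eta(x, u) = x - u recovers the usual gradient inequality
% characterizing differentiable convex functions.
```

The "major difficulty" the abstract refers to is that, in a program with several objective and constraint functions, all of them must satisfy this inequality with respect to the same η.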

243 citations

Journal ArticleDOI
TL;DR: A new sequential Lagrange multiplier condition characterizing optimality without a constraint qualification for an abstract nonsmooth convex program is presented in terms of the subdifferentials and the $\epsilon$-subdifferentials.
Abstract: In this paper a new sequential Lagrange multiplier condition characterizing optimality without a constraint qualification for an abstract nonsmooth convex program is presented in terms of the subdifferentials and the $\epsilon$-subdifferentials. A sequential condition involving only the subdifferentials, but at nearby points to the minimizer for constraints, is also derived. For a smooth convex program, the sequential condition yields a limiting Kuhn--Tucker condition at nearby points without a constraint qualification. It is shown how the sequential conditions are related to the standard Lagrange multiplier condition. Applications to semidefinite programs, semi-infinite programs, and semiconvex programs are given. Several numerical examples are discussed to illustrate the significance of the sequential conditions.
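The $\epsilon$-subdifferential appearing in the abstract is the standard convex-analysis object, sketched here for completeness (not quoted from the paper):

```latex
% For a convex function f and \epsilon \ge 0, the \epsilon-subdifferential
% at \bar{x} is
\partial_{\epsilon} f(\bar{x}) =
\{\, v : f(x) \ge f(\bar{x}) + \langle v,\, x - \bar{x} \rangle - \epsilon
\ \ \text{for all } x \,\}.
% For \epsilon = 0 this reduces to the ordinary subdifferential
% \partial f(\bar{x}); the sequential conditions let \epsilon \to 0
% along a sequence instead of requiring an exact multiplier.
```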

126 citations

Journal ArticleDOI
TL;DR: A duality theory for convex programming problems in the face of data uncertainty via robust optimization is presented and strong duality between the robust counterpart of an uncertain convex program and the optimistic counterpart of its uncertain Lagrangian dual is characterized.
Abstract: Duality theory has played a key role in convex programming in the absence of data uncertainty. In this paper, we present a duality theory for convex programming problems in the face of data uncertainty via robust optimization. We characterize strong duality between the robust counterpart of an uncertain convex program and the optimistic counterpart of its uncertain Lagrangian dual. We provide a new robust characteristic cone constraint qualification which is necessary and sufficient for strong duality in the sense that the constraint qualification holds if and only if strong duality holds for every convex objective function of the program. We further show that this strong duality always holds for uncertain polyhedral convex programming problems by verifying our constraint qualification, where the uncertainty set is a polytope. We derive these results by way of first establishing a robust theorem of the alternative for parameterized convex inequality systems using conjugate analysis. We also give a convex characteristic cone constraint qualification that is necessary and sufficient for strong duality between the deterministic dual pair: the robust counterpart and its Lagrangian dual. Through simple numerical examples we also provide an insightful account of the development of our duality theory.
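Schematically, in standard robust-optimization notation (a sketch, not quoted from the paper), the two counterparts contrasted above are, for an uncertain convex program with uncertainty set U:

```latex
% Uncertain program: \min_x f(x) \ \text{s.t.} \ g(x, u) \le 0,\ u \in \mathcal{U}.
% Robust counterpart (worst case over the data):
\min_{x} \ f(x) \quad \text{s.t.} \quad g(x, u) \le 0
\ \ \text{for all } u \in \mathcal{U}.
% Optimistic counterpart of the Lagrangian dual (best case over the data):
\max_{u \in \mathcal{U},\ \lambda \ge 0} \ \inf_{x}
\ \big( f(x) + \lambda^{\top} g(x, u) \big).
% Strong duality: the optimal values of the two problems coincide.
```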

122 citations

Journal ArticleDOI
TL;DR: In this article, a class of functions called pre-invex is defined, and optimality conditions and duality theorems are presented for scalar-valued and vector-valued programs involving pre-invex functions.
Abstract: A class of functions called pre-invex is defined. These functions are more general than convex functions, and than invex functions when differentiable. Optimality conditions and duality theorems are given for scalar-valued and vector-valued programs involving pre-invex functions.
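The pre-invexity property can be sketched as follows (the standard definition, assuming a kernel η; not quoted from the paper):

```latex
% f is pre-invex with respect to \eta : X_0 \times X_0 \to \mathbb{R}^n when
f\big( u + \lambda\, \eta(x, u) \big)
\;\le\; \lambda f(x) + (1 - \lambda) f(u)
\qquad \text{for all } x, u \in X_0,\ \lambda \in [0, 1].
% Taking \eta(x, u) = x - u recovers ordinary convexity; unlike invexity,
% no differentiability of f is required.
```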

121 citations

Journal ArticleDOI
TL;DR: In this article, the notion of approximate Jacobian matrices is introduced for a continuous vector-valued map, based on the idea of convexificators of real-valued functions.
Abstract: The notion of approximate Jacobian matrices is introduced for a continuous vector-valued map. It is shown, for instance, that the Clarke generalized Jacobian is an approximate Jacobian for a locally Lipschitz map. The approach is based on the idea of convexificators of real-valued functions. Mean value conditions for continuous vector-valued maps and Taylor's expansions for continuously Gâteaux differentiable functions (i.e., C1-functions) are presented in terms of approximate Jacobians and approximate Hessians, respectively. Second-order necessary and sufficient conditions for optimality and convexity of C1-functions are also given.
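The mean value condition referred to above takes the following schematic form, with ∂*F denoting an approximate Jacobian (a sketch in standard notation, not quoted from the paper):

```latex
% For continuous F : \mathbb{R}^n \to \mathbb{R}^m and points x, y,
F(y) - F(x) \;\in\;
\overline{\mathrm{co}} \, \big( \partial^{*} F([x, y]) \, (y - x) \big),
% where [x, y] is the line segment joining x and y and
% \overline{\mathrm{co}} denotes the closed convex hull.
```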

110 citations


Cited by
Journal ArticleDOI
TL;DR: It is shown that the full set of hydromagnetic equations admits five more integrals, besides the energy integral, when dissipative processes are absent; a related integral had earlier made it possible to formulate a variational principle for force-free magnetic fields.
Abstract: where A represents the magnetic vector potential, is an integral of the hydromagnetic equations. This integral made it possible to formulate a variational principle for the force-free magnetic fields. The integral expresses the fact that motions cannot transform a given field into an entirely arbitrary different field if the conductivity of the medium is considered infinite. In this paper we shall show that the full set of hydromagnetic equations admits five more integrals, besides the energy integral, if dissipative processes are absent. These integrals, as we shall presently verify, are I₂ = ∫ H·v dV, (2)

1,858 citations

Book
21 Feb 1970

986 citations

Book ChapterDOI
01 Jan 2003
TL;DR: This work derives necessary and sufficient optimality conditions, a minimal point theorem, a vector-valued variational principle of Ekeland’s type, Lagrangean multiplier rules and duality statements, and discusses a general scalarization procedure.
Abstract: We introduce several solution concepts for multicriteria optimization problems, give a characterization of approximately efficient elements and discuss a general scalarization procedure. Furthermore, we derive necessary and sufficient optimality conditions, a minimal point theorem, a vector-valued variational principle of Ekeland’s type, Lagrangean multiplier rules and duality statements. An overview on vector variational inequalities and vector equilibria is given. Moreover, we discuss the results for special classes of vector optimization problems (vector-valued location and approximation problems, multicriteria fractional programming and optimal control problems).
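A common instance of the scalarization procedure mentioned above is the weighted-sum method (a standard sketch, not quoted from the chapter): the vector problem of minimising (f₁(x), …, f_p(x)) over a feasible set S is replaced by

```latex
\min_{x \in S} \ \sum_{i=1}^{p} w_i f_i(x),
\qquad w_i \ge 0, \quad \sum_{i=1}^{p} w_i = 1.
% With all w_i > 0, any minimizer of the scalarized problem is an
% efficient (Pareto-optimal) point of the original multicriteria problem.
```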

938 citations

Book ChapterDOI
01 Jan 2016
TL;DR: This chapter simplifies the Lagrangian support vector machine approach using process diagrams and data flow diagrams to help readers understand the theory and implement it successfully.
Abstract: The support vector machine is one of the classical machine learning techniques that can still help solve big data classification problems. In particular, it can support multidomain applications in a big data environment. However, the support vector machine is mathematically complex and computationally expensive. The main objective of this chapter is to simplify this approach using process diagrams and data flow diagrams to help readers understand the theory and implement it successfully. To achieve this objective, the chapter is divided into three parts: (1) modeling of a linear support vector machine; (2) modeling of a nonlinear support vector machine; and (3) the Lagrangian support vector machine algorithm and its implementations. The Lagrangian support vector machine is also implemented, with simple examples, using the R programming platform on Hadoop and non-Hadoop systems.
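The chapter's implementation uses R on Hadoop; as a language-neutral illustration of its first part (modeling a linear support vector machine), here is a minimal sketch in Python that trains a linear soft-margin SVM by subgradient descent on the hinge loss. The function name, parameters, and toy data are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Linear soft-margin SVM via subgradient descent on the hinge loss.

    Minimises (lam/2)*||w||^2 + (1/n) * sum(max(0, 1 - y_i*(w.x_i + b))).
    A sketch only, not the chapter's Lagrangian SVM implementation.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                      # margin-violating points
        grad_w = lam * w - (y[mask] @ X[mask]) / n
        grad_b = -y[mask].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Tiny linearly separable toy problem (illustrative data).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

On separable data like this, the learned hyperplane classifies every training point correctly; the nonlinear and Lagrangian variants the chapter covers build on the same hinge-loss objective.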

938 citations