Book
Nonlinear Programming and Variational Inequality Problems
About:
This work was published on 1999-01-01 and has received 139 citations to date. It focuses on the topics of variational inequalities and nonlinear programming.
Citations
Book
Density Ratio Estimation in Machine Learning
TL;DR: This paper gives a comprehensive introduction to density ratio estimators, including methods based on density estimation, moment matching, probabilistic classification, density fitting, and density ratio fitting, and describes how they can be applied in machine learning.
Journal Article
Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
TL;DR: This paper analyzes a variant of Tseng's modified forward-backward splitting method and an extension of Korpelevich's method for solving hemivariational inequalities with Lipschitz continuous operators, treating both as special cases of the hybrid proximal extragradient method introduced by Solodov and Svaiter.
Journal Article
Application of the Proximal Point Method to Nonmonotone Equilibrium Problems
TL;DR: In this paper, the authors consider a general equilibrium problem defined on a convex set whose cost bifunction need not be monotone, and show that the problem can be solved by the inexact proximal point method whenever the dual problem has a solution.
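The proximal point method referenced in this abstract regularizes each subproblem with a quadratic term centered at the current iterate. As a minimal illustration (not the paper's equilibrium-problem formulation, and with an assumed quadratic objective so each subproblem has a closed form), the exact proximal point iteration for minimizing a convex quadratic can be sketched as:

```python
import numpy as np

def proximal_point(A, b, x0, lam=1.0, iters=50):
    """Proximal point iteration for f(x) = 0.5 x^T A x - b^T x.

    Each step solves x_{k+1} = argmin_x f(x) + (1/(2*lam)) ||x - x_k||^2,
    which for this quadratic f reduces to the linear system
    (A + I/lam) x_{k+1} = b + x_k / lam.
    """
    x = np.asarray(x0, dtype=float)
    n = len(b)
    M = A + np.eye(n) / lam  # regularized system matrix, fixed across steps
    for _ in range(iters):
        x = np.linalg.solve(M, b + x / lam)
    return x

# Toy example: A symmetric positive definite, so the minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = proximal_point(A, b, np.zeros(2))
```

Each proximal subproblem is strongly convex even when `f` itself is only convex, which is what gives the method its stability; the "inexact" variants discussed in the cited paper allow these subproblems to be solved only approximately.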
Journal Article
Distributed Computation of Equilibria in Monotone Nash Games via Iterative Regularization Techniques
Aswin Kannan, Uday V. Shanbhag +1 more
TL;DR: This work develops single-timescale schemes for the distributed computation of equilibria of Nash games in which each player solves a convex program, a class of games that leads to monotone variational inequalities.
Journal Article
Sufficient dimension reduction via squared-loss mutual information estimation
Taiji Suzuki, Masashi Sugiyama +1 more
TL;DR: A novel sufficient dimension-reduction method that uses a squared-loss variant of mutual information as a dependency measure; it is formulated as a minimum contrast estimator on parametric or nonparametric models, with a natural gradient algorithm on the Grassmann manifold for the sufficient-subspace search.