Journal ArticleDOI

A strong convergence algorithm for a fixed point constrained split null point problem

01 Apr 2021-Vol. 70, Iss: 1, pp 389-408
TL;DR: In this article, a self-adaptive step-size method for finding a common solution of a split feasibility problem and a fixed point problem in real Hilbert spaces is proposed; the self-adaptive step size removes the need to compute the operator norm in the proposed method.
Abstract: In this paper, we introduce a new algorithm with a self-adaptive step size for finding a common solution of a split feasibility problem and a fixed point problem in real Hilbert spaces. We incorporate the self-adaptive step size to overcome the difficulty of having to compute the operator norm in the proposed method. Under standard and mild assumptions on the control sequences, we establish the strong convergence of the algorithm and obtain a common element of the solution set of a split feasibility problem for the sum of two monotone operators and the fixed point set of a demimetric mapping. Numerical examples are presented to illustrate the performance and behavior of our method. Our result extends, improves and unifies other results in the literature.
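As a rough illustration of the kind of self-adaptive step-size rule the abstract describes (the paper's exact algorithm, its resolvents and its control sequences are not reproduced here), the sketch below runs a CQ-type iteration for a split feasibility problem in which the step size is computed from the current residual only, so the operator norm ||A|| is never needed. The names and the box constraint are illustrative assumptions.

```python
import numpy as np

def project_box(y, lo, hi):
    """Projection onto the box [lo, hi]^m, standing in for the set Q."""
    return np.clip(y, lo, hi)

def split_feasibility_step(x, A, lo, hi, rho=1.0):
    """One CQ-type iteration with a self-adaptive step size.

    The step is rho * f(x) / ||grad f(x)||^2 with
    f(x) = 0.5 * ||(I - P_Q) A x||^2, so only the current residual is
    used and ||A|| is never computed (this mirrors the idea in the
    abstract; the paper's exact rule may differ).
    """
    Ax = A @ x
    residual = Ax - project_box(Ax, lo, hi)      # (I - P_Q) A x
    grad = A.T @ residual                        # A^T (I - P_Q) A x
    grad_norm_sq = float(grad @ grad)
    if grad_norm_sq == 0.0:                      # x already solves the problem
        return x
    step = rho * 0.5 * float(residual @ residual) / grad_norm_sq
    return x - step * grad                       # here C = R^n, so P_C = identity

# tiny usage example with random data
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
x = rng.standard_normal(3)
for _ in range(200):
    x = split_feasibility_step(x, A, lo=-1.0, hi=1.0)
print(np.linalg.norm(A @ x - project_box(A @ x, -1.0, 1.0)))  # residual decreases toward 0
```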
Citations
More filters
Journal ArticleDOI
TL;DR: In this paper, an inertial extrapolation method is proposed for solving generalized split feasibility problems over the solution set of monotone variational inclusion problems in real Hilbert spaces.
Abstract: In this paper, we propose a new inertial extrapolation method for solving the generalized split feasibility problems over the solution set of monotone variational inclusion problems in real Hilbert spaces...

47 citations
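The inertial extrapolation device mentioned in this abstract has a simple generic form: the algorithm evaluates its resolvent or projection step at an extrapolated point built from the two most recent iterates. A minimal sketch (the extrapolation parameter and the update it feeds into are illustrative assumptions, not the cited paper's exact scheme):

```python
import numpy as np

def inertial_point(x_curr, x_prev, theta=0.3):
    """Inertial extrapolation w_n = x_n + theta * (x_n - x_{n-1}).

    The extrapolated point w_n, rather than x_n itself, is passed to
    the next resolvent/projection step; this heavy-ball-type term is
    what accelerates the iteration.
    """
    return x_curr + theta * (x_curr - x_prev)

# usage inside a generic iteration: x_next = step(inertial_point(x, x_old))
```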

Journal ArticleDOI
TL;DR: In this paper, an iterative scheme is proposed which combines the inertial subgradient extragradient method with the viscosity technique and a self-adaptive stepsize.
Abstract: In this paper, we study a classical monotone and Lipschitz continuous variational inequality and fixed point problems defined on a level set of a convex function in the framework of Hilbert spaces. First, we introduce a new iterative scheme which combines the inertial subgradient extragradient method with the viscosity technique and a self-adaptive stepsize. Unlike in many existing subgradient extragradient techniques in the literature, the two projections of our proposed algorithm are made onto some half-spaces. Furthermore, we prove a strong convergence theorem for approximating a common solution of the variational inequality and fixed point of an infinite family of nonexpansive mappings under some mild conditions. The main advantages of our method are: the self-adaptive stepsize which avoids the need to know a priori the Lipschitz constant of the associated monotone operator, the two projections made onto some half-spaces, the strong convergence, and the inertial technique employed, which accelerates the convergence rate of the algorithm. Second, we apply our theorem to solve generalised mixed equilibrium problems, zero point problems and convex minimization problems. Finally, we present some numerical examples to demonstrate the efficiency of our algorithm in comparison with other existing methods in the literature. Our results improve and extend several existing works in the current literature in this direction.

42 citations
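A key point in this abstract is that both projections are taken onto half-spaces rather than onto the original feasible set, because a half-space projection has a cheap closed form. A minimal sketch of that closed form (the half-space data a, b are illustrative; in the actual method they come from subgradient inequalities):

```python
import numpy as np

def project_halfspace(x, a, b):
    """Projection of x onto the half-space H = {z : <a, z> <= b}.

    P_H(x) = x if <a, x> <= b, and otherwise
    P_H(x) = x - ((<a, x> - b) / ||a||^2) * a.
    This closed form is why replacing projections onto a general
    convex level set by projections onto half-spaces keeps each
    subgradient extragradient iteration inexpensive.
    """
    gap = float(a @ x - b)
    if gap <= 0.0:
        return x
    return x - (gap / float(a @ a)) * a
```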

Journal ArticleDOI
TL;DR: In this article, a modified self-adaptive inertial subgradient extragradient algorithm is presented in which the two projections are made onto half-spaces; under mild conditions, the generated sequence converges strongly to a common solution of a variational inequality problem and a common fixed point of a finite family of demicontractive mappings in a real Hilbert space.
Abstract: In this paper, we present a new modified self-adaptive inertial subgradient extragradient algorithm in which the two projections are made onto some half-spaces. Moreover, under mild conditions, we obtain strong convergence of the sequence generated by our proposed algorithm for approximating a common solution of a variational inequality problem and a common fixed point of a finite family of demicontractive mappings in a real Hilbert space. The main advantages of our algorithm are: a strong convergence result obtained without prior knowledge of the Lipschitz constant of the related monotone operator, the two projections made onto some half-spaces, and the inertial technique which speeds up the rate of convergence. Finally, we present an application and a numerical example to illustrate the usefulness and applicability of our algorithm.

34 citations
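For the fixed point part of the problem above, a k-demicontractive mapping T is usually handled through a relaxed (Krasnosel'skii-Mann type) step, since T itself need not be nonexpansive. A minimal sketch under that standard assumption (the names are illustrative and this is not the cited paper's full algorithm):

```python
def demicontractive_step(x, T, lam):
    """Relaxed fixed-point step x_{n+1} = (1 - lam) * x_n + lam * T(x_n).

    For a k-demicontractive mapping T, choosing lam in (0, 1 - k)
    makes the relaxed operator quasi-nonexpansive, which is the
    standard device for approximating fixed points of T.
    """
    return (1.0 - lam) * x + lam * T(x)
```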

Journal ArticleDOI
TL;DR: Two new inertial-type algorithms for solving variational inequality problems (VIPs) with monotone and Lipschitz continuous mappings in real Hilbert spaces with strong convergence results are introduced.
Abstract: In this work, we introduce two new inertial-type algorithms for solving variational inequality problems (VIPs) with monotone and Lipschitz continuous mappings in real Hilbert spaces. The first algorithm requires the computation of only one projection onto the feasible set per iteration while the second algorithm needs the computation of only one projection onto a half-space, and prior knowledge of the Lipschitz constant of the monotone mapping is not required in proving the strong convergence theorems for the two algorithms. Under some mild assumptions, we prove strong convergence results for the proposed algorithms to a solution of a VIP. Finally, we provide some numerical experiments to illustrate the efficiency and advantages of the proposed algorithms.

28 citations
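A generic way to get one projection per iteration without knowing the Lipschitz constant, in the spirit of the abstract above, is a Tseng-type forward-backward-forward step with an adaptively updated step size. The sketch below is a standard instance of that idea (F, project_C, mu and the update rule are illustrative assumptions, not the cited algorithms themselves):

```python
import numpy as np

def tseng_step(x, F, project_C, lam, mu=0.5):
    """One Tseng-type step for the variational inequality VI(F, C).

    y = P_C(x - lam * F(x));  x_next = y - lam * (F(y) - F(x)).
    Only one projection onto C is used per iteration, and lam is
    shrunk from ||x - y|| / ||F(x) - F(y)|| so that the Lipschitz
    constant of F never has to be known in advance.
    """
    Fx = F(x)
    y = project_C(x - lam * Fx)
    Fy = F(y)
    x_next = y - lam * (Fy - Fx)
    diff = np.linalg.norm(Fx - Fy)
    lam_next = min(lam, mu * np.linalg.norm(x - y) / diff) if diff > 0 else lam
    return x_next, lam_next
```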


Additional excerpts

  • [40,41] Let δ ∈ (0, 1). For x, y ∈ H, we have the following statements: (1) |⟨x, y⟩| ≤ ∥x∥∥y∥; (2) ∥x + y∥² ≤ ∥x∥² + 2⟨y, x + y⟩; (3) ∥x + y∥² = ∥x∥² + 2⟨x, y⟩ + ∥y∥²; (4) ∥δx + (1 − δ)y∥² = δ∥x∥² + (1 − δ)∥y∥² − δ(1 − δ)∥x − y∥².

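The four statements in the excerpt above are standard Hilbert space identities and inequalities; identity (4) in particular can be checked numerically in a finite-dimensional space. A quick verification sketch (purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.standard_normal(4), rng.standard_normal(4)
delta = 0.3

# identity (4): ||d*x + (1-d)*y||^2 = d*||x||^2 + (1-d)*||y||^2 - d*(1-d)*||x - y||^2
lhs = np.linalg.norm(delta * x + (1 - delta) * y) ** 2
rhs = (delta * np.linalg.norm(x) ** 2
       + (1 - delta) * np.linalg.norm(y) ** 2
       - delta * (1 - delta) * np.linalg.norm(x - y) ** 2)
print(np.isclose(lhs, rhs))  # True
```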

References
Journal ArticleDOI
TL;DR: This paper shows, by means of an operator called a splitting operator, that the Douglas–Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm, which allows the unification and generalization of a variety of convex programming algorithms.
Abstract: This paper shows, by means of an operator called a splitting operator, that the Douglas–Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm. Therefore, applications of Douglas–Rachford splitting, such as the alternating direction method of multipliers for convex programming decomposition, are also special cases of the proximal point algorithm. This observation allows the unification and generalization of a variety of convex programming algorithms. By introducing a modified version of the proximal point algorithm, we derive a new, generalized alternating direction method of multipliers for convex programming. Advances of this sort illustrate the power and generality gained by adopting monotone operator theory as a conceptual framework.

2,913 citations
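For reference, the Douglas–Rachford splitting scheme this citation is about can be written as a short fixed-point iteration on the resolvents of the two operators. A generic sketch, assuming the resolvents J_A = (I + gamma*A)^{-1} and J_B = (I + gamma*B)^{-1} are available as callables:

```python
def douglas_rachford(z0, JA, JB, n_iters=100):
    """Generic Douglas-Rachford iteration for 0 in A(x) + B(x).

    z_{k+1} = z_k + J_B(2*J_A(z_k) - z_k) - J_A(z_k); the limit z*
    yields a solution x* = J_A(z*).  The iteration map is firmly
    nonexpansive, which is what lets it be viewed as a proximal
    point iteration in the cited paper (illustrative sketch only).
    """
    z = z0
    for _ in range(n_iters):
        x = JA(z)
        y = JB(2 * x - z)
        z = z + y - x
    return JA(z)
```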

Journal ArticleDOI
TL;DR: In this paper, a modification of Rockafellar's proximal point algorithm is obtained and proved to be always strongly convergent, and the ideas of these algorithms are applied to solve a quadratic minimization problem.
Abstract: Iterative algorithms for nonexpansive mappings and maximal monotone operators are investigated. Strong convergence theorems are proved for nonexpansive mappings, including an improvement of a result of Lions. A modification of Rockafellar’s proximal point algorithm is obtained and proved to be always strongly convergent. The ideas of these algorithms are applied to solve a quadratic minimization problem.

1,560 citations
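The strongly convergent modification of the proximal point algorithm mentioned in this abstract is typically obtained by anchoring each resolvent step toward a fixed point u. A generic Halpern-type sketch under that assumption (the exact regularization used in the cited paper may differ):

```python
def anchored_proximal_point(u, x0, resolvent, alphas):
    """Anchored (Halpern-type) proximal point iteration.

    x_{n+1} = alpha_n * u + (1 - alpha_n) * J(x_n), where J is the
    resolvent of a maximal monotone operator and alpha_n -> 0 with a
    divergent sum.  The anchor term toward u is the standard device
    that upgrades the weak convergence of the plain proximal point
    algorithm to strong convergence.
    """
    x = x0
    for a in alphas:
        x = a * u + (1 - a) * resolvent(x)
    return x
```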

Journal ArticleDOI
TL;DR: Using an extension of Pierra's product space formalism, it is shown here that a multiprojection algorithm converges and is fully simultaneous, i.e., it uses in each iterative step all sets of the convex feasibility problem.
Abstract: Generalized distances give rise to generalized projections into convex sets. An important question is whether or not one can use within the same projection algorithm different types of such generalized projections. This question has practical consequences in the area of signal detection and image recovery in situations that can be formulated mathematically as a convex feasibility problem. Using an extension of Pierra's product space formalism, we show here that a multiprojection algorithm converges. Our algorithm is fully simultaneous, i.e., it uses in each iterative step all sets of the convex feasibility problem. Different multiprojection algorithms can be derived from our algorithmic scheme by a judicious choice of the Bregman functions which govern the process. As a by-product of our investigation we also obtain block-iterative schemes for certain kinds of linearly constrained optimization problems.

1,085 citations
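The "fully simultaneous" character described in this abstract means every set of the convex feasibility problem is used in each sweep, with the next iterate formed as a weighted combination of the individual projections. A minimal Euclidean sketch of one such sweep (the Bregman-projection generality of the cited scheme is not reproduced):

```python
def simultaneous_projection_step(x, projections, weights=None):
    """One fully simultaneous sweep for a convex feasibility problem.

    x_next = sum_i w_i * P_{C_i}(x), where every projection P_{C_i}
    is applied to the current point in the same sweep.  With
    Euclidean projections this is the Cimmino-type special case of
    the multiprojection scheme described in the abstract.
    """
    if weights is None:
        weights = [1.0 / len(projections)] * len(projections)
    return sum(w * P(x) for w, P in zip(weights, projections))
```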

Journal ArticleDOI
TL;DR: In this paper, the existence of fixed points of non-compact mappings of a subset C of a Hilbert space H into H is studied by means of topological arguments.

1,029 citations