
Showing papers on "Steffensen's method published in 2012"


Book
05 Jun 2012
TL;DR: This book studies the semilocal and local convergence of Newton's method under Fréchet differentiability, together with a broad family of related iterative methods, including the Secant, Steffensen's, Gauss-Newton and Halley's methods, interior point techniques, and Newton-type methods with outer inverses.
Abstract: INTRODUCTION.
NEWTON'S METHOD: Convergence under Fréchet differentiability. Convergence under twice Fréchet differentiability. Newton's method on unbounded domains. Continuous analog of Newton's method. Interior point techniques. Regular smoothness. omega-convergence. Semilocal convergence and convex majorants. Local convergence and convex majorants. Majorizing sequences.
SECANT METHOD: Convergence. Least squares problems. Nondiscrete induction and the Secant method. Nondiscrete induction and a double-step Secant method. Directional Secant methods. Efficient three-step Secant methods.
STEFFENSEN'S METHOD: Convergence.
GAUSS-NEWTON METHOD: Convergence. Average-Lipschitz conditions.
NEWTON-TYPE METHODS: Convergence with outer inverses. Convergence of a Moser-type method. Convergence with slantly differentiable operator. An intermediate Newton method.
INEXACT METHODS: Residual control conditions. Average Lipschitz conditions. Two-step methods. Zabrejko-Zincenko-type conditions.
WERNER'S METHOD: Convergence analysis.
HALLEY'S METHOD: Local convergence.
METHODS FOR VARIATIONAL INEQUALITIES: Subquadratic convergent method. Convergence under slant condition. Newton-Josephy method.
FAST TWO-STEP METHODS: Semilocal convergence.
FIXED POINT METHODS: Successive substitutions methods.
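The iteration at the heart of the book's opening chapters is the classical Newton method, x_{k+1} = x_k - f(x_k)/f'(x_k). A minimal scalar sketch (illustrative only, not taken from the book):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)
    return x

# Example: root of x^2 - 2 starting from 1.5
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
```

Under the Fréchet-differentiability hypotheses studied in the book, the same formula carries over to operators on Banach spaces, with f'(x_k) a linear operator to be inverted at each step.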

97 citations


Journal ArticleDOI
TL;DR: By approximating the derivatives in the well-known fourth-order Ostrowski's method and in a sixth-order improved Ostrowski's method by central-difference quotients, new derivative-free modifications of these methods are obtained.
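Ostrowski's fourth-order method takes a Newton predictor y_n = x_n - f(x_n)/f'(x_n) followed by the corrector x_{n+1} = y_n - [f(x_n)/(f(x_n) - 2f(y_n))] f(y_n)/f'(x_n). One way to remove the derivative in the spirit of this paper is a central-difference quotient with step h = f(x_n); this is a sketch of the general idea, and the exact difference quotients used in the paper may differ:

```python
def ostrowski_derivative_free(f, x0, tol=1e-12, max_iter=50):
    """Ostrowski-type iteration with f'(x) replaced by a
    central-difference quotient using step h = f(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        # central-difference approximation to f'(x), step h = f(x)
        d = (f(x + fx) - f(x - fx)) / (2.0 * fx)
        y = x - fx / d                          # Newton-like predictor
        fy = f(y)
        x = y - (fx / (fx - 2.0 * fy)) * fy / d  # Ostrowski corrector
    return x

# Example: root of x^2 - 2 starting from 1.5
root = ostrowski_derivative_free(lambda x: x * x - 2.0, 1.5)
```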

64 citations


Journal ArticleDOI
TL;DR: In this article, the inexact Newton method with a fixed relative residual error tolerance is shown, under semi-local assumptions, to converge Q-linearly to a zero of the nonlinear operator under consideration.

19 citations


Journal Article
TL;DR: Numerical examples show that the new cubically convergent Newton's method based on the contraharmonic mean can compete with the classical Newton's method.
Abstract: In recent years, variants of Newton's method with cubic convergence have become popular iterative methods for finding approximate solutions to the roots of non-linear equations. These methods enjoy cubic convergence at simple roots and do not require the evaluation of second-order derivatives. In this paper, we present a new Newton's method based on the contraharmonic mean, which is cubically convergent. Numerical examples show that the new method can compete with the classical Newton's method. Keywords—Third-order convergence, Non-linear equations, Root-finding, Iterative method.
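Mean-based Newton variants of this kind replace the single derivative in Newton's correction with a mean of the slopes at x_n and at the Newton predictor y_n; the contraharmonic mean of a and b is (a² + b²)/(a + b). A plausible sketch of such an iteration (an assumed form, not necessarily the authors' exact scheme):

```python
def newton_contraharmonic(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton variant using the contraharmonic mean of the slopes
    at x and at the Newton predictor y."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        a = fprime(x)
        y = x - fx / a                  # ordinary Newton predictor
        b = fprime(y)
        # divide by the contraharmonic mean (a^2 + b^2)/(a + b)
        x = x - fx * (a + b) / (a * a + b * b)
    return x

# Example: real root of x^3 - x - 1 starting from 1.5
root = newton_contraharmonic(lambda x: x**3 - x - 1.0,
                             lambda x: 3.0 * x * x - 1.0, 1.5)
```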

15 citations


Journal ArticleDOI
TL;DR: Using a self-correcting parameter, calculated by means of Newton's interpolating polynomial of second degree, the R-order of convergence is increased from 2 to 3, which gives the proposed method a very high computational efficiency.

13 citations


01 Jan 2012
TL;DR: The spirit of Newton's method is continued to develop an alternative approximation of the Newton step via diagonal updating, in order to reduce the computational complexity of the classical Newton's method for solving large-scale systems of nonlinear equations.
Abstract: The prominent method for solving nonlinear equations is the classical Newton's method. Nevertheless, the method is computationally expensive, especially when handling large-scale systems: it requires storage of the Jacobian matrix as well as the solution of a Newton system at each iteration. In this paper, we continue the spirit of Newton's method by developing an alternative approximation of the Newton step via diagonal updating. The idea behind our approach is to reduce the computational complexity of the classical Newton's method for solving large-scale systems of nonlinear equations. The convergence of the proposed method has been proven under standard assumptions. A numerical investigation into the effectiveness and consistency of the proposed scheme is given by numerical evaluation of some well-known benchmark nonlinear systems against some variants of Newton's method.
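The appeal of diagonal updating is that the Newton system becomes componentwise division, so no matrix is stored or factored. A simple sketch of the general idea, using a componentwise secant-style update for the diagonal (the paper's actual update rule may differ):

```python
import numpy as np

def diagonal_newton(F, x0, tol=1e-10, max_iter=200):
    """Newton-like iteration with a diagonal Jacobian approximation D,
    updated from a componentwise secant condition D_i * s_i ~ y_i.
    Illustrative sketch only, not the paper's exact scheme."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    D = np.ones_like(x)                 # initial diagonal approximation
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            return x
        x_new = x - Fx / D              # "solve" D s = -F(x) componentwise
        F_new = F(x_new)
        s = x_new - x                   # step
        y = F_new - Fx                  # change in residual
        mask = np.abs(s) > 1e-14        # guard against tiny steps
        D[mask] = y[mask] / s[mask]     # componentwise secant update
        x, Fx = x_new, F_new
    return x

# Example: decoupled benchmark system x^2 = 4, y^2 = 9 (illustrative)
x_sol = diagonal_newton(lambda x: np.array([x[0]**2 - 4.0, x[1]**2 - 9.0]),
                        [1.5, 2.5])
```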

13 citations


01 Jan 2012
TL;DR: In this paper, the authors apply Newton's method to stochastic functional differential equations and formulate a Gronwall-type inequality that plays an important role in the proof of the convergence theorem for the Newton method.
Abstract: In this article, we apply Newton's method to stochastic functional differential equations. The first part concerns first-order convergence. We formulate a Gronwall-type inequality which plays an important role in the proof of the convergence theorem for the Newton method. In the second part, probabilistic second-order convergence is studied.

7 citations


Journal ArticleDOI
TL;DR: In this paper, a family of derivative-free methods of cubic convergence for solving nonlinear equations is suggested. In the proposed methods, several linear combinations of divided differences are used to obtain a good estimate of the derivative of the given function at the different steps of the iteration. The efficiency indices of the members of this family are equal to 1.442. The convergence and error analysis are given, and numerical examples are used to show the performance of the presented methods and to compare them with other derivative-free methods.
Abstract: In this paper, a family of derivative-free methods of cubic convergence for solving nonlinear equations is suggested. In the proposed methods, several linear combinations of divided differences are used in order to get a good estimate of the derivative of the given function at the different steps of the iteration. The efficiency indices of the members of this family are equal to 1.442. The convergence and error analysis are given. Also, numerical examples are used to show the performance of the presented methods and to compare with other derivative-free methods. These methods were also applied to smooth and nonsmooth equations. Newton's method is a well-known basic method that converges quadratically in the neighborhood of a simple root r, but it is not applicable when the derivative of the function is not defined on some interval. Newton's method was therefore modified by Steffensen, who replaced the first derivative f'(x) in Newton's method by a forward-difference approximation.
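Steffensen's modification described at the end of the abstract replaces f'(x) in Newton's iteration by the forward-difference quotient g(x) = (f(x + f(x)) - f(x))/f(x), keeping quadratic convergence while using only function values:

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Steffensen's method: Newton's iteration with f'(x) replaced by
    the forward-difference quotient (f(x + f(x)) - f(x)) / f(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        g = (f(x + fx) - fx) / fx   # derivative-free slope estimate
        x -= fx / g
    return x

# Example: root of x^2 - 2 starting from 1.5
root = steffensen(lambda x: x * x - 2.0, 1.5)
```

The step size here is f(x) itself, so the quotient tightens automatically as the iterates approach the root.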

6 citations


Journal ArticleDOI
TL;DR: A new generalized univariate Newton method for solving nonlinear equations, motivated by Bregman distances and proximal regularization of optimization problems is devised, a special instance of which is the classical Newton method.
Abstract: We devise a new generalized univariate Newton method for solving nonlinear equations, motivated by Bregman distances and proximal regularization of optimization problems. We prove quadratic convergence of the new method, a special instance of which is the classical Newton method. We illustrate the possible benefits of the new method over the classical Newton method by means of test problems involving the Lambert W function, Kullback–Leibler distance, and a polynomial. These test problems provide insight as to which instance of the generalized method could be chosen for a given nonlinear equation. Finally, we derive a closed-form expression for the asymptotic error constant of the generalized method and make further comparisons involving this constant.

5 citations


01 Jan 2012
TL;DR: In this paper, a new modified Newton method for solving a nonlinear equation was proposed using the well-known Mid-point Newton's method, and numerical results show that the new iteration method converges more quickly than Newton's method.
Abstract: In this paper, we present a new modified Newton method for solving a nonlinear equation by using the well-known Mid-point Newton's method. We propose a new iteration method and show by numerical results that it converges more quickly than Newton's method, the hybrid iteration method, the new hybrid iteration method and the Mid-point Newton's method, and hence the present iteration needs a smaller number of functional evaluations.
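The Mid-point Newton's method used as the baseline here is commonly written with the derivative evaluated at the midpoint between x_n and its Newton predictor; a sketch under that assumption (the paper's own modification on top of it is not reproduced):

```python
def midpoint_newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Mid-point Newton iteration: evaluate f' at the midpoint of x
    and its Newton predictor z = x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        z = x - fx / fprime(x)               # Newton predictor
        x = x - fx / fprime((x + z) / 2.0)   # derivative at the midpoint
    return x

# Example: real root of x^3 - x - 1 starting from 1.5
root = midpoint_newton(lambda x: x**3 - x - 1.0,
                       lambda x: 3.0 * x * x - 1.0, 1.5)
```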

5 citations


Journal ArticleDOI
TL;DR: In this article, a new semilocal convergence theorem for the inexact Newton method is presented under the hypothesis that the first derivative satisfies some kind of weak Lipschitz condition.
Abstract: Under the hypothesis that the first derivative satisfies some kind of weak Lipschitz condition, a new semilocal convergence theorem for the inexact Newton method is presented. Unified convergence criteria ensuring the convergence of the inexact Newton method are also established. Applications to some special cases such as the Kantorovich-type conditions and γ-conditions are provided, and some well-known convergence theorems for Newton's method are obtained as corollaries.

01 Jan 2012
TL;DR: In this paper, a general Steffensen type inequality with respect to the derivatives of Riemann-Liouville, Canavati, Caputo, Hadamard and Erdelyi-Kober types is discussed.
Abstract: In this paper, we state, prove and discuss a new general Steffensen-type inequality. As a special case of that general result, we obtain fractional inequalities involving fractional integrals and derivatives of Riemann-Liouville, Canavati, Caputo, Hadamard and Erdelyi-Kober types, as well as fractional integrals of a function with respect to another function. Furthermore, we show that our main result covers much more general situations by applying it to multidimensional settings. Finally, we give mean value theorems for linear functionals related to the obtained Steffensen-type inequalities.

Journal Article
TL;DR: In this article, a modified technique for solving nonlinear equations is proposed which is fully free from derivative calculation per full cycle and consumes only four function evaluations to reach local convergence of order eight.
Abstract: This paper proposes a modified technique for solving nonlinear equations. The technique is fully free from derivative calculation per full cycle and consumes only four function evaluations to reach local convergence of order eight. This shows that our technique is optimal according to the conjecture of Kung and Traub. The contributed class is built by using a weight-function approach. In the sequel, theoretical results are given, and finally numerical examples are employed to evaluate and illustrate the accuracy of the novel methods derived from the modified technique.

01 Jan 2012
TL;DR: This paper introduces a new procedure that reduces a well-known shortcoming of Newton's method; it was implemented on some benchmark nonlinear systems, and the results show that the proposed method is very encouraging.
Abstract: There is a great deal of interest in reducing the overall computational budget of the classical Newton's method for solving nonlinear systems of equations. An appealing approach is based on the chord Newton's method, but that method typically requires a high number of iterations as the dimension of the system increases. In this paper, we introduce a new procedure that reduces this well-known shortcoming of Newton's method. Our approach was implemented on some benchmark nonlinear systems, and the results show that the proposed method is very encouraging.
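The chord Newton's method mentioned as the appealing baseline freezes the Jacobian at the initial point, so it is factored once and reused at every step, trading quadratic for linear convergence. A minimal sketch (illustrative; the paper's own procedure differs):

```python
import numpy as np

def chord_newton(F, J, x0, tol=1e-10, max_iter=500):
    """Chord (frozen-Jacobian) Newton: evaluate J(x0) once and
    reuse it in every iteration."""
    x = np.asarray(x0, dtype=float)
    J0 = J(x)                          # Jacobian evaluated only once
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        x = x - np.linalg.solve(J0, Fx)
    return x

# Example: decoupled system x^2 = 2, y^2 = 3 (illustrative)
x_sol = chord_newton(
    lambda x: np.array([x[0]**2 - 2.0, x[1]**2 - 3.0]),
    lambda x: np.array([[2.0 * x[0], 0.0], [0.0, 2.0 * x[1]]]),
    [1.5, 1.8])
```

In practice one would factor J0 once (e.g. an LU decomposition) rather than call `solve` repeatedly; the repeated `solve` here keeps the sketch short.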

Proceedings ArticleDOI
23 Jun 2012
TL;DR: The regularized Newton method for multiobjective optimization generates a sequence that converges to the optimal points from any starting point and does not require a strong convexity property on the entire space.
Abstract: In this paper, we introduce the regularized Newton method for multiobjective optimization. The method does not scalarize the original multiobjective optimization problem. For any vector convex function with a compact level set, the regularized Newton method generates a sequence that converges to the optimal points from any starting point. Moreover, the regularized Newton method does not require a strong convexity property on the entire space.

Proceedings ArticleDOI
03 Jun 2012
TL;DR: In this article, the authors establish the Newton-Kantorovich convergence theorem for a deformed Newton method in Banach space by using a third-order majorizing function, which is used to solve a nonlinear operator equation.
Abstract: We establish the Newton-Kantorovich convergence theorem for a deformed Newton method in Banach space by using a third-order majorizing function, which is used to solve the nonlinear operator equation. We also present the error estimate. Finally, some examples are provided to show the application of our theorem.

Proceedings ArticleDOI
23 Aug 2012
TL;DR: In this paper, the authors establish the Newton-Kantorovich convergence theorem, with convergence order three, for a deformed Newton method in Banach space by using a second-order majorizing function, which is used to solve a nonlinear operator equation.
Abstract: In this study, we establish the Newton-Kantorovich convergence theorem, with convergence order three, for a deformed Newton method in Banach space by using a second-order majorizing function, which is used to solve the nonlinear operator equation. We also present the error estimate. Finally, examples are provided to show the application of our theorem.

01 Jan 2012
TL;DR: In this article, the authors proposed two new iterative methods for solving nonlinear equations, which do not use derivatives and only two evaluations of the function are needed per iteration; furthermore, the new proposed methods have the same performance as Newton's method with the advantage of being derivative free.
Abstract: This paper proposes two new iterative methods for solving nonlinear equations. In comparison to the classical Newton's method, the new methods do not use derivatives, and only two evaluations of the function are needed per iteration. When the starting value is selected close to the root, the order of convergence is 2. The construction of the methods recovers classical schemes such as the secant and Steffensen's methods as special cases. The numerical examples show that the proposed methods have the same performance as Newton's method, with the advantage of being derivative-free. In comparison to other derivative-free methods, these methods are more efficient.
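The secant method, one of the classical schemes the abstract mentions, replaces the derivative with a divided difference through the last two iterates; the classical iteration (for reference, not the paper's new methods):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant iteration: f'(x) is replaced by the divided difference
    (f(x1) - f(x0)) / (x1 - x0) through the last two iterates."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1

# Example: root of x^2 - 2 from the bracket [1, 2]
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)
```

Like the paper's methods, it needs no derivative; unlike them, it requires only one new function evaluation per iteration and converges with order about 1.618 rather than 2.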

Journal ArticleDOI
15 May 2012
TL;DR: In this paper, the authors presented new results for the local convergence of Newton's method to a unique solution of an equation in a Banach space setting, which can compare favorably to other ones using Newton-Kantorovich and Lipschitz conditions.
Abstract: We present new results for the local convergence of Newton's method to a unique solution of an equation in a Banach space setting. Under a flexible gamma-type condition [12], [13], we extend the applicability of Newton's method by enlarging the radius and decreasing the ratio of convergence. The results can compare favorably to other ones using Newton-Kantorovich and Lipschitz conditions [3]-[7], [9]-[13]. Numerical examples are also provided.

Book ChapterDOI
05 Jun 2012

Proceedings ArticleDOI
29 May 2012
TL;DR: Two variants of Newton's method with seventh-order convergence are presented, and numerical results show that these methods have definite performance.
Abstract: In this paper, we present two variants of Newton's method with seventh-order convergence. Numerical results show that these methods have definite performance.

Journal ArticleDOI
TL;DR: In this article, a modified seventh-order convergent Newton-type method for solving nonlinear equations is presented, which is free from second derivatives, and requires three evaluations of the functions and two evaluations of derivatives at each step.
Abstract: In this paper, we present a modified seventh-order convergent Newton-type method for solving nonlinear equations. It is free from second derivatives, and requires three evaluations of the functions and two evaluations of derivatives at each step. Therefore the efficiency index of the presented method is 1.47577, which is better than that of the classical Newton's method, 1.41421. Some numerical results demonstrate the efficiency and performance of the presented method.

Journal ArticleDOI
TL;DR: In this paper, the authors present and analyze a sixth-order convergent method for solving nonlinear equations, which is free from second derivatives and permits f'(x) = 0 at some points.
Abstract: In this paper, we present and analyze a sixth-order convergent method for solving nonlinear equations. The method is free from second derivatives and permits f'(x) = 0 at some points. It requires three evaluations of the given function and two evaluations of its derivative in each step. Some numerical examples illustrate that the presented method is more efficient and performs better than the classical Newton's method.

Journal ArticleDOI
TL;DR: In this paper, a variant of Newton method with order of convergence eight for solving nonlinear equations is presented, which requires three evaluations of the functions and two evaluations of derivatives in each step, therefore the efficiency index of the presented method is 1.5157 which is better than that of classical Newton's method 1.4142.
Abstract: In this paper, we present a variant of Newton method with order of convergence eight for solving nonlinear equations. The method is free from second derivatives. It requires three evaluations of the functions and two evaluations of derivatives in each step. Therefore the efficiency index of the presented method is 1.5157 which is better than that of classical Newton’s method 1.4142. Some numerical experiments illustrate that the proposed method is more efficient and performs better than classical Newton's method.

Journal ArticleDOI
TL;DR: In this article, a new modified Newton's method with third-order convergence was presented and compared with the Jarratt method, which is of fourth-order, and a family of Newton-type methods, which converge cubically.
Abstract: We present a new modified Newton's method with third-order convergence and compare it with the Jarratt method, which is of fourth-order. Based on this new method, we obtain a family of Newton-type methods, which converge cubically. Numerical examples show that the presented method can compete with Newton's method and other known third-order modifications of Newton's method.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a fifth-order iterative method for the solution of nonlinear equations based on Noor's third-order method, which is a modified Householder method without second derivatives.
Abstract: We present a fifth-order iterative method for the solution of nonlinear equations. The new method is based on Noor's third-order method, which is a modified Householder method without second derivatives. Its efficiency index is 1.4953, which is better than that of Newton's method and Noor's method. Numerical results show the efficiency of the proposed method.

01 Jan 2012
TL;DR: In this paper, a new method for solving a non-linear equation f(x) = 0 is presented which is quadratically convergent; it converges faster than the classical Newton-Raphson method, and the Newton-Raphson method appears as the limiting case of the presented method.
Abstract: We present a new method for solving a non-linear equation f(x) = 0. The presented method is quadratically convergent; it converges faster than the classical Newton-Raphson method, and the Newton-Raphson method appears as the limiting case of the presented method.

Journal ArticleDOI
TL;DR: In this paper, a new iterative method for solving nonlinear equations is proposed, which is free from second derivatives and requires three evaluations of the function and two evaluations of the derivative in each iteration.
Abstract: In this paper, we present and analyze a new iterative method for solving nonlinear equations. It is proved that the method is sixth-order convergent. The algorithm is free from second derivatives, and it requires three evaluations of the function and two evaluations of the derivative in each iteration. The efficiency index of the presented method is 1.431, which is better than that of the classical Newton's method, 1.414. Some numerical experiments illustrate that the proposed method is more efficient and performs better than the classical Newton's method and some other methods.

Proceedings ArticleDOI
13 Aug 2012
TL;DR: An infinite-dimensional framework using the concept of generalized (Newton) differentiation is developed; regularization is used to approximate the problem by an unconstrained minimization problem and to make the pointwise maximum function Newton differentiable.
Abstract: In this paper, we treat a gradient-constrained minimization problem, a particular case of which is the elasto-plastic torsion problem. In order to obtain a numerical approximation to the solution, we have developed an algorithm in an infinite-dimensional space framework using the concept of generalized (Newton) differentiation. Regularization was done in order to approximate the problem by an unconstrained minimization problem and to make the pointwise maximum function Newton differentiable. Using the semismooth Newton method, a continuation method was developed in function space. For the numerical implementation, the variational equations at the Newton steps are discretized using the finite element method.

Proceedings ArticleDOI
04 Jul 2012
TL;DR: A new iterative learning control scheme based on the damped Newton method for nonlinear systems is proposed, which has the advantage of fast convergence speed; the structure of the algorithm is fully investigated.
Abstract: A new iterative learning control scheme based on the damped Newton method for nonlinear systems is proposed. Originating from the Newton method, this new algorithm has the advantage of fast convergence speed. A new parameter with a clear structure is employed. Furthermore, sufficient conditions for the convergence of this new Newton-type method are given, and the proposed algorithm ensures that the tracking error converges to zero and that the sequence of inputs converges with order 2. Finally, the structure of the algorithm is fully investigated.