MODIFIED HALLEY'S METHOD FOR SOLVING NONLINEAR FUNCTIONS WITH CONVERGENCE OF ORDER SIX AND EFFICIENCY INDEX 1.8171

In this paper, we describe and analyze a modified Halley's method for solving nonlinear functions. The modified Halley's method has convergence of order six and efficiency index 1.8171, and it converges faster than Newton's method and Halley's method. The comparison tables demonstrate the faster convergence of the modified Halley's method. AMS Subject Classification: 65H05, 65D32


Introduction
The problem, to recall, is solving equations in one variable. We are given a function f and would like to find at least one solution of the equation f(x) = 0. Note that we place no restrictions on the function f; we only need to be able to evaluate it, since otherwise we cannot even check that a given x = ξ is a solution, that is, that f(ξ) = 0. In reality, the mere ability to evaluate the function does not suffice. We need to assume some kind of "good behavior". The more we assume, the more potential we have, on the one hand, to develop a fast iteration scheme for finding the root. At the same time, the more we assume, the fewer functions are going to satisfy our assumptions! This is a fundamental paradigm in numerical analysis.
We know that one of the fundamental algorithms for solving nonlinear equations is the so-called fixed-point iteration method [10].
In the fixed-point iteration method for solving the nonlinear equation f(x) = 0, the equation is usually rewritten as

x = g(x). (1.1)

Considering the iteration scheme

x_{n+1} = g(x_n), n = 0, 1, 2, ..., (1.2)

and starting with a suitable initial approximation x_0, we build up a sequence of approximations {x_n} for the solution of the nonlinear equation, say ξ. The scheme converges to ξ provided that |g'(x)| < 1 in a neighborhood of ξ. It is well known that the fixed-point method has first-order convergence. By using a Taylor expansion of f(x) about the point x_k,

f(x) ≈ f(x_k) + (x − x_k) f'(x_k).

If f'(x_k) ≠ 0 and we choose x_{k+1} to be the root of this linearization, then we have

x_{k+1} = x_k − f(x_k)/f'(x_k). (1.3)

This is the so-called Newton's method (NM) [4,14,16] for root-finding of nonlinear functions, which converges quadratically. From (1.3) one can derive

x_{k+1} = x_k − 2 f(x_k) f'(x_k) / (2 [f'(x_k)]^2 − f(x_k) f''(x_k)).

This is the so-called Halley's method (HM) [5,7,8] for root-finding of nonlinear functions, which converges cubically. During the last century, numerical techniques for solving nonlinear equations have been successfully applied (see, e.g., [1,2,3,4,5,6,7,8,9,11,12,13,14,15,16,17] and the references therein). McDougall and Wotherspoon [13] modified Newton's method, and their modified Newton's method has convergence of order 1 + √2.
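The two classical iterations above can be sketched directly. The following is a minimal illustration, not the paper's code; the function names and the test equation f(x) = x² − 2 (root √2) are illustrative choices.

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Newton's method: x_{k+1} = x_k - f(x_k)/f'(x_k) (quadratic convergence)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def halley(f, df, d2f, x0, tol=1e-12, max_iter=100):
    """Halley's method: x_{k+1} = x_k - 2 f f' / (2 f'^2 - f f'') (cubic convergence)."""
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        step = 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative test equation: f(x) = x^2 - 2, whose positive root is sqrt(2).
f = lambda x: x**2 - 2
df = lambda x: 2 * x
d2f = lambda x: 2.0

print(newton(f, df, 1.0))
print(halley(f, df, d2f, 1.0))
```

Both runs converge to √2 ≈ 1.41421356; Halley typically needs fewer iterations per the cubic order.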
In this paper, motivated by the technique of McDougall and Wotherspoon [13], we propose a modified Halley's method of convergence order six, derived from Halley's method, for solving nonlinear functions. The proposed modified Halley's method is applied to several problems in order to assess its validity and accuracy.

Iterative Method
Let f : X → R, X ⊂ R, be a scalar function. By expanding f(x) in a Taylor series about the point x_n, one obtains the well-known Halley's method

x_{n+1} = x_n − 2 f(x_n) f'(x_n) / (2 [f'(x_n)]^2 − f(x_n) f''(x_n)),

which has third-order convergence. Following the approach of McDougall and Wotherspoon [13], we construct the modified Halley's method as follows. Initially, we select two starting points x and x* and set x* = x. The modified Halley's method then consists of a predictor step (2.4) and a corrector step (2.5), in which, for k ≥ 1, the first and second derivatives of f are evaluated at the midpoint of the current predictor and corrector iterates. These are the main steps of our modified Halley's method. The value of x_2 is computed from x_1 using f(x_1) and the values of the first and second derivatives of f(x) evaluated at (x_1 + x*_1)/2 (which is a more appropriate point at which to take the derivatives than x_1), and these same derivative values are re-used in the next predictor step to find x*_3. This re-use of the derivatives means that the evaluations of the predicted values of x in (2.4) essentially come for free, which in turn supplies the more appropriate derivative values to be used in the corrector step (2.5).

Convergence Analysis
Theorem 3.1. Suppose that α is a root of the equation f(x) = 0. If f(x) is sufficiently smooth in a neighborhood of α, then the convergence order of the modified Halley's method given in (2.5) is at least six.
Proof. Suppose that α is a root of f(x) = 0 and let e_n be the error at the nth iteration, e_n = x_n − α. Expanding f(x), f'(x), and f''(x) in Taylor series about α, substituting these expansions into the predictor and corrector steps, and combining the resulting relations (3.1)-(3.8), one obtains an error equation of the form

e_{n+1} = C e_n^6 + O(e_n^7),

where C is a constant depending on the ratios c_k = f^(k)(α)/(k! f'(α)). This shows that the convergence order of the modified Halley's method is at least six.
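The order claimed by the theorem can be checked numerically via the computational order of convergence, p ≈ ln|e_{n+1}/e_n| / ln|e_n/e_{n-1}|. The sketch below applies this estimator to Newton's method (where the estimate should approach 2) purely to illustrate the technique; the helper name `newton_iterates` is ours, not the paper's.

```python
import math

def newton_iterates(f, df, x0, n):
    """Return the first n+1 Newton iterates starting from x0."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x - f(x) / df(x))
    return xs

# Errors of Newton's method on f(x) = x^2 - 2, whose root is sqrt(2).
root = math.sqrt(2)
xs = newton_iterates(lambda x: x**2 - 2, lambda x: 2 * x, 1.0, 5)
errs = [abs(x - root) for x in xs]

# Computational order of convergence from three successive errors.
coc = math.log(errs[3] / errs[2]) / math.log(errs[2] / errs[1])
print(round(coc, 2))   # close to 2 for a quadratically convergent method
```

The same estimator applied to iterates of the modified Halley's method should give values near 6, consistent with Theorem 3.1.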

Comparisons of Efficiency Index
Weerakoon and Fernando [17], Homeier [9], and Frontini and Sormani [6] have presented numerical methods with cubic convergence. Each iteration of these numerical methods requires three evaluations of either the function or its derivatives. A natural way of comparing such numerical methods is to express the order of convergence per function or derivative evaluation, the so-called "efficiency" of the numerical method. On this basis, Newton's method has an efficiency of 2^(1/2) ≈ 1.4142, while the cubically convergent methods have an efficiency of 3^(1/3) ≈ 1.4422. Kou [12] has developed several methods that each require two function evaluations and two derivative evaluations; these methods achieve an order of convergence of either five or six, so they have efficiencies of 5^(1/4) ≈ 1.4953 and 6^(1/4) ≈ 1.5651, respectively. In these methods of Kou the denominator is a linear combination of derivatives evaluated at different values of x, so that, when the starting value of x is not close to the root, this denominator may approach zero and the methods may not converge. For the four 6th-order methods suggested in Kou [12], if the derivatives of the function at the two values of x differ by a factor of more than three, the method gives an infinite change in x. That is, the derivatives at the predictor and corrector stages can both be of the same sign, but if their magnitudes differ by more than a factor of three, the method does not converge.
Jarratt [11] developed a 4th-order method that requires only one function evaluation and two derivative evaluations, and similar 4th-order methods have been described by Soleymani et al. [15]. Jarratt's method is similar to Kou's methods in that if the ratio of the derivatives at the predictor and corrector steps exceeds a factor of three, the method gives an infinite change in x.
The modified Halley's method, by contrast, has an efficiency of 6^(1/3) ≈ 1.8171, larger than the efficiencies of all the methods discussed above.
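The efficiency index p^(1/m), for order of convergence p and m function or derivative evaluations per iteration, reproduces all of the figures quoted above:

```python
# Efficiency index p**(1/m): order of convergence p per the m function or
# derivative evaluations needed each iteration.
methods = {
    "Newton (p=2, m=2)":          (2, 2),
    "Cubic methods (p=3, m=3)":   (3, 3),
    "Kou 5th order (p=5, m=4)":   (5, 4),
    "Kou 6th order (p=6, m=4)":   (6, 4),
    "Modified Halley (p=6, m=3)": (6, 3),
}
for name, (p, m) in methods.items():
    print(f"{name}: {p ** (1 / m):.4f}")
```

The last line, 6^(1/3) ≈ 1.8171, is the efficiency index of the modified Halley's method.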
The efficiencies of the methods we have discussed are summarized in Table 1.

Applications
In this section we apply the method to some nonlinear functions to illustrate the efficiency of our modified Halley's method. In Tables 2-7, the columns give the number of iterations N, the number of function or derivative evaluations N_f required to meet the stopping criterion, and the magnitude |f(x)| of f(x) at the final estimate x_{n+1}. We compare the performance of the modified Halley's method (MHM) with Newton's method (NM) and Halley's method (HM) on the following nonlinear functions.
Example 5.1. Suppose that f(x) = sin^2(x) − x^2 + 1. With x_0 = 1, Newton's method needs 7 iterations and 14 function or derivative evaluations, Halley's method needs 4 iterations and 12 evaluations, and the modified Halley's method needs 3 iterations and 9 evaluations. With x_0 = 3, Newton's method needs 7 iterations and 14 evaluations, Halley's method needs 5 iterations and 15 evaluations, and the modified Halley's method needs 3 iterations and 9 evaluations. The comparison of Newton's method, Halley's method, and the modified Halley's method is shown in Tables 2 and 3; the modified Halley's method performs better than the other iterative methods.

Example 5.2. Suppose that f(x) = x^2 − e^x − 3x + 2. With x_0 = 0.5, Newton's method needs 6 iterations and 12 function or derivative evaluations, Halley's method needs 4 iterations and 12 evaluations, and the modified Halley's method needs 3 iterations and 9 evaluations. With x_0 = 0, Newton's method needs 5 iterations and 10 evaluations, Halley's method needs 3 iterations and 9 evaluations, and the modified Halley's method needs 2 iterations and 6 evaluations. The comparison is shown in Tables 4 and 5; again the modified Halley's method performs better than the other iterative methods.

Example 5.3. Suppose that f(x) = x + ln(x − 2). With x_0 = 2.2, Newton's method needs 6 iterations and 12 function or derivative evaluations, Halley's method needs 4 iterations and 12 evaluations, and the modified Halley's method needs 3 iterations and 9 evaluations. The comparison is shown in Table 6; the modified Halley's method performs better than the other iterative methods.

Example 5.4. Suppose that f(x) = x^2 + sin(x/5) − 1/4. With x_0 = 0.5, Newton's method needs 8 iterations and 16 function or derivative evaluations, Halley's method needs 5 iterations and 15 evaluations, and the modified Halley's method needs 3 iterations and 9 evaluations. The comparison is shown in Table 7; the modified Halley's method performs better than the other iterative methods.
Tables 2-7 show that our modified Halley's method converges to the root of f(x) = 0 faster than Newton's method and Halley's method.
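As a rough cross-check of these tables, the sketch below counts Newton iterations on the function of Example 5.1. The helper `count_newton` and the 1e-12 tolerance are our illustrative choices, so the count need not match the paper's tables exactly, which depend on the precise stopping criterion used there.

```python
import math

def count_newton(f, df, x0, tol=1e-12, max_iter=100):
    """Count Newton iterations until |f(x)| < tol; return (count, final x)."""
    x, n = x0, 0
    while abs(f(x)) >= tol and n < max_iter:
        x -= f(x) / df(x)
        n += 1
    return n, x

# Example 5.1: f(x) = sin^2(x) - x^2 + 1 with starting value x0 = 1.
f = lambda x: math.sin(x)**2 - x**2 + 1
df = lambda x: 2 * math.sin(x) * math.cos(x) - 2 * x

n, root = count_newton(f, df, 1.0)
print(n, root)   # a handful of iterations; root near 1.4045
```

Analogous counters for Halley's method and the modified Halley's method would reproduce the N and N_f columns of Tables 2-7 under the same stopping criterion.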

Conclusions
A new modified Halley's method for solving nonlinear functions has been established. We can conclude from Tables 1-7 that: 1. The efficiency index of the modified Halley's method is 1.8171, which is larger than that of the other iterative methods.
2. The modified Halley's method has convergence of order six.
3. The performance of the modified Halley's method has also been examined on several examples. The modified Halley's method performs very well in comparison with Halley's method and Newton's method, as shown in Tables 2-7.

Table 1. Comparison of efficiencies of various methods

Table 2. Comparison of NM, HM, and MHM

Table 3. Comparison of NM, HM, and MHM