ROOTS OF EQUATIONS

Let’s take a look at a very powerful tool in Mathcad that started a revolution in computational analysis. It started in the late 1980s, when I was an undergraduate. A company called Wolfram created a computer program called Mathematica. This was one of the first widely available programs that could solve algebraic and calculus equations symbolically. That is, if I had an equation that said x*y = z, Mathematica could tell me that y = z / x, without ever needing me to assign numbers to x, y, or z. It could also compute derivatives and integrals, and solve differential equations, symbolically. This was an incredible advance, and it opened the doors to a whole new world of programming, numerical methods, pure mathematics, engineering, and science. A competing code called Maple was also developed and licensed to other software companies for inclusion in their programs.
The end result: Mathcad uses Maple as a solving engine in the background (you don’t see it) to solve problems symbolically. Here we will look at a brief example of how to use this capability in the context of solving a system of linear equations.
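
We can’t run Mathcad’s symbolic engine here, but as a rough stand-in the same idea can be sketched in Python with the SymPy library (my substitution, purely for illustration): it rearranges x*y = z for y without any numbers, and solves a small made-up linear system symbolically.

    # Sketch only: SymPy standing in for Mathcad's Maple-based symbolic engine.
    import sympy as sp

    x, y, z = sp.symbols('x y z')

    # Solve x*y = z for y symbolically -- no numeric values ever assigned.
    print(sp.solve(sp.Eq(x * y, z), y))          # [z/x]

    # A small system of linear equations, also solved symbolically.
    a, b = sp.symbols('a b')
    print(sp.solve([sp.Eq(2*a + b, 7), sp.Eq(a - b, 1)], [a, b]))   # {a: 8/3, b: 5/3}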

In numerical analysis, Newton's method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is perhaps the best known method for finding successively better approximations to the roots of a real-valued function. Newton's method can often converge remarkably quickly, especially if the iteration begins "sufficiently near" the desired root. Just how near "sufficiently near" needs to be, and just how quickly "remarkably quickly" can be, depends on the problem; this is discussed in detail below. Unfortunately, when the iteration begins far from the desired root, Newton's method can easily lead an unwary user astray with little warning. Thus, good implementations of the method embed it in a routine that also detects and perhaps overcomes possible convergence failures.
Given a function ƒ(x) and its derivative ƒ'(x), we begin with a first guess x0. Provided the function is reasonably well-behaved, a better approximation x1 is

x1 = x0 − ƒ(x0) / ƒ'(x0)

The process is repeated until a sufficiently accurate value is reached:

xn+1 = xn − ƒ(xn) / ƒ'(xn)
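
As a concrete illustration, here is a minimal sketch of that iteration in Python (the names newton, f, and fprime are illustrative choices of mine, not from Mathcad or any particular library): it repeats the update above until the step size falls below a tolerance.

    def newton(f, fprime, x0, tol=1e-12, max_iter=50):
        """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until the step is smaller than tol."""
        x = x0
        for _ in range(max_iter):
            dfx = fprime(x)
            if dfx == 0:
                raise ZeroDivisionError("Derivative vanished; the method breaks down here.")
            step = f(x) / dfx
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("No convergence within max_iter iterations.")

    # Example: a root of f(x) = x**2 - 2, starting from x0 = 1.5
    print(newton(lambda x: x**2 - 2, lambda x: 2*x, 1.5))   # about 1.4142135623730951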

An important and somewhat surprising application is Newton–Raphson division, which can be used to quickly find the reciprocal of a number using only multiplication and subtraction.
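
The trick is to apply the method to ƒ(x) = 1/x − D, whose update simplifies to xn+1 = xn (2 − D·xn); the short sketch below is my own illustration of the idea.

    def reciprocal(d, x0, iterations=6):
        # Approximate 1/d using only multiplication and subtraction,
        # by applying Newton's method to f(x) = 1/x - d.
        # The starting guess must satisfy 0 < x0 < 2/d for the iteration to converge.
        x = x0
        for _ in range(iterations):
            x = x * (2 - d * x)     # x_{n+1} = x_n * (2 - d * x_n)
        return x

    print(reciprocal(7.0, 0.1))     # converges toward 1/7 = 0.142857...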


The idea of the method is as follows: one starts with an initial guess which is reasonably close to the true root, then the function is approximated by its tangent line (which can be computed using the tools of calculus), and one computes the x-intercept of this tangent line (which is easily done with elementary algebra). This x-intercept will typically be a better approximation to the function's root than the original guess, and the method can be iterated.
Suppose ƒ : [a, b] → R is a differentiable function defined on the interval [a, b] with values in the real numbers R. The formula for converging on the root can be easily derived. Suppose we have some current approximation xn. Then we can derive the formula for a better approximation, xn+1, by considering the tangent line to ƒ at xn. We know from the definition of the derivative at a given point that it is the slope of the tangent at that point.

That is, ƒ'(xn) = rise / run = Δy / Δx = (ƒ(xn) − 0) / (xn − xn+1)

Here, ƒ' denotes the derivative of the function ƒ. Then by simple algebra we can derive

xn+1 = xn − ƒ(xn) / ƒ'(xn)
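
As a quick sanity check of this formula, take ƒ(x) = x^2 − a, so that ƒ'(x) = 2x. The update becomes

xn+1 = xn − (xn^2 − a) / (2 xn) = (xn + a/xn) / 2,

which is the familiar Babylonian (Heron's) method for computing square roots.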

We start the process off with some arbitrary initial value x0 (the closer to the zero, the better; in the absence of any intuition about where the zero might lie, a "guess and check" approach might narrow the possibilities to a reasonably small interval by appealing to the intermediate value theorem). The method will usually converge, provided this initial guess is close enough to the unknown zero and ƒ'(x0) ≠ 0. Furthermore, for a zero of multiplicity 1, the convergence is at least quadratic (see rate of convergence) in a neighbourhood of the zero, which intuitively means that the number of correct digits roughly at least doubles in every step. More details can be found in the analysis section below.
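
A small numerical experiment (again only a sketch) makes that "digits roughly double" behaviour visible: tracking the error of the square-root iteration above at each step shows it shrinking roughly quadratically until machine precision is reached.

    import math

    # Watch quadratic convergence: the error roughly squares at each step.
    a = 2.0
    x = 1.5                              # initial guess for sqrt(2)
    for n in range(6):
        print(f"step {n}: x = {x:.16f}, error = {abs(x - math.sqrt(a)):.2e}")
        x = x - (x*x - a) / (2*x)        # Newton step for f(x) = x**2 - a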
