# Runge's phenomenon

In the mathematical field of numerical analysis, **Runge's phenomenon** is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equispaced interpolation points. It was discovered by Carl David Tolmé Runge (1901) when exploring the behavior of errors when using polynomial interpolation to approximate certain functions.^{[1]}
The discovery was important because it shows that going to higher degrees does not always improve accuracy. The phenomenon is similar to the Gibbs phenomenon in Fourier series approximations.

## Introduction

The Weierstrass approximation theorem states that for every continuous function *f*(*x*) defined on an interval [*a*,*b*], there exists a set of polynomial functions *P*_{n}(*x*) for *n*=0, 1, 2, …, each of degree *n*, that approximates *f*(*x*) with uniform convergence over [*a*,*b*] as *n* tends to infinity, that is,

$$\lim_{n \to \infty} \left( \max_{a \le x \le b} |f(x) - P_n(x)| \right) = 0.$$

Consider the case where one desires to interpolate through *n*+1 equispaced points of a function *f*(*x*) using the *n*-degree polynomial *P*_{n}(*x*) that passes through those points. Naturally, one might expect from Weierstrass' theorem that using more points would lead to a more accurate reconstruction of *f*(*x*). However, this *particular* set of polynomial functions *P*_{n}(*x*) is not guaranteed to have the property of uniform convergence; the theorem only states that a set of polynomial functions exists, without providing a general method of finding one.

The *P*_{n}(*x*) produced in this manner may in fact diverge away from *f*(*x*) as *n* increases; this typically occurs in an oscillating pattern that magnifies near the ends of the interpolation points. This phenomenon is attributed to Runge.

## Problem

Consider the **Runge function**

$$f(x) = \frac{1}{1 + 25x^2}.$$

Runge found that if this function is interpolated at equidistant points *x*_{i} between −1 and 1 such that:

$$x_i = \frac{2i}{n} - 1, \qquad i \in \{0, 1, \dots, n\},$$

with a polynomial *P*_{n}(*x*) of degree ≤ *n*, the resulting interpolation oscillates toward the ends of the interval, i.e. close to −1 and 1. It can even be proven that the interpolation error increases (without bound) as the degree of the polynomial is increased:

$$\lim_{n \to \infty} \left( \max_{-1 \le x \le 1} |f(x) - P_n(x)| \right) = \infty.$$

This shows that high-degree polynomial interpolation at equidistant points can be troublesome.
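
The divergence is easy to observe numerically. The following is a minimal sketch (assuming NumPy and SciPy are available; the degrees and the evaluation grid are arbitrary choices) that interpolates the Runge function at equispaced nodes and prints the maximum error:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator  # numerically stable interpolation

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

x_fine = np.linspace(-1.0, 1.0, 2001)        # dense grid to estimate the max error
for n in (5, 10, 15, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)    # n + 1 equispaced nodes on [-1, 1]
    P = BarycentricInterpolator(nodes, runge(nodes))
    err = np.max(np.abs(P(x_fine) - runge(x_fine)))
    print(f"n = {n:2d}: max |f - P_n| = {err:.3e}")
```

The printed errors grow with *n*, and the largest deviations occur near ±1.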

### Reason

The error between the generating function and the interpolating polynomial of order *n* is given by

$$f(x) - P_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} \prod_{i=0}^{n} (x - x_i)$$

for some ξ in (−1, 1). Denote by *w*_{n}(*x*) the nodal polynomial

$$w_n(x) = \prod_{i=0}^{n} (x - x_i)$$

and let *W*_{n} be the maximum of its magnitude:

$$W_n = \max_{-1 \le x \le 1} |w_n(x)|.$$

It can be proved that, if equidistant nodes are used,^{[2]} then

$$W_n \le \frac{n! \, h^{n+1}}{4}, \qquad h = \frac{2}{n},$$

where *h* is the distance between consecutive nodes.

Moreover, assume that the (*n* + 1)-th derivative of *f* is bounded, i.e.

$$\max_{-1 \le x \le 1} |f^{(n+1)}(x)| \le M_{n+1}.$$

Therefore,

$$\max_{-1 \le x \le 1} |f(x) - P_n(x)| \le M_{n+1} \frac{h^{n+1}}{4(n+1)}.$$
But the magnitude of the (*n* + 1)-th derivative of Runge's function grows very quickly as *n* increases, and the result is that the product *M*_{n+1} *h*^{n+1}/(4(*n* + 1)) in the previous bound tends to infinity as *n* tends to infinity.
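
To see how fast, here is a short verification using the Taylor expansion of the Runge function: near *x* = 0, $f(x) = \sum_{k \ge 0} (-25)^k x^{2k}$, so $|f^{(n)}(0)| = 5^n \, n!$ whenever *n* is even. Taking *n* such that *n* + 1 is even and the tightest choice $M_{n+1} = \max_{-1 \le x \le 1} |f^{(n+1)}(x)| \ge |f^{(n+1)}(0)|$, the bound satisfies

$$M_{n+1} \, \frac{h^{n+1}}{4(n+1)} \;\ge\; 5^{n+1}(n+1)! \, \frac{(2/n)^{n+1}}{4(n+1)} \;=\; \frac{n!}{4} \left( \frac{10}{n} \right)^{n+1} \;\sim\; \frac{\sqrt{2\pi n}}{4} \cdot \frac{10}{n} \left( \frac{10}{e} \right)^{n} \;\longrightarrow\; \infty,$$

since $10/e \approx 3.68 > 1$ (the last step uses Stirling's formula $n! \sim \sqrt{2\pi n}\,(n/e)^n$).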

Although it is often used to explain the Runge phenomenon, the fact that this upper bound on the error tends to infinity does not, of course, imply that the error itself also diverges with *n*.

## Mitigations to the problem

### Change of interpolation points

The oscillation can be minimized by using nodes that are distributed more densely towards the edges of the interval, specifically, with asymptotic density (on the interval [−1,1]) given by the formula^{[3]}

$$\frac{1}{\pi \sqrt{1 - x^2}}.$$
A standard example of such a set of nodes is Chebyshev nodes, for which the maximum error in approximating the Runge function is guaranteed to diminish with increasing polynomial order. The phenomenon demonstrates that high-degree polynomials are generally unsuitable for interpolation with equidistant nodes.
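
As a sketch of the effect (assuming NumPy and SciPy; the node counts are arbitrary choices), one can compare equispaced and Chebyshev nodes at the same degree:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

x_fine = np.linspace(-1.0, 1.0, 2001)
for n in (10, 20, 40):
    equi = np.linspace(-1.0, 1.0, n + 1)
    # Chebyshev nodes: x_k = cos((2k + 1) * pi / (2(n + 1))), k = 0..n
    cheb = np.cos((2 * np.arange(n + 1) + 1) * np.pi / (2 * (n + 1)))
    for name, nodes in (("equispaced", equi), ("Chebyshev", cheb)):
        P = BarycentricInterpolator(nodes, runge(nodes))
        err = np.max(np.abs(P(x_fine) - runge(x_fine)))
        print(f"n = {n:2d}, {name:10s}: max error = {err:.2e}")
```

The equispaced error grows with *n*, while the Chebyshev error shrinks.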

### Use of piecewise polynomials

The problem can be avoided by using spline curves which are piecewise polynomials. When trying to decrease the interpolation error one can increase the number of polynomial pieces which are used to construct the spline instead of increasing the degree of the polynomials used.
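
A minimal sketch of this approach with cubic splines (assuming SciPy's `CubicSpline`; the knot counts are arbitrary choices):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

x_fine = np.linspace(-1.0, 1.0, 2001)
for m in (5, 10, 20, 40):                    # number of polynomial pieces
    knots = np.linspace(-1.0, 1.0, m + 1)    # equispaced knots are fine for splines
    s = CubicSpline(knots, runge(knots))
    err = np.max(np.abs(s(x_fine) - runge(x_fine)))
    print(f"{m:2d} pieces: max error = {err:.2e}")
```

Here the error decreases steadily as pieces are added, while the degree of each piece stays fixed at three.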

### Constrained minimization

One can also fit a polynomial of degree higher than the minimal degree needed for interpolation, so that the interpolation conditions no longer determine it uniquely, and among all such interpolating polynomials choose the one whose first (or second) derivative has minimal norm, as in the sketch below.
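
One way to realize this idea is an equality-constrained least-squares sketch (the degrees, grids, and the discrete derivative norm below are our own illustrative choices, not a method prescribed by the article):

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

n, N = 10, 20                                  # n + 1 interpolation constraints, degree-N polynomial
nodes = np.linspace(-1.0, 1.0, n + 1)
y = runge(nodes)

# Interpolation constraints V c = y in the monomial basis c_0 + c_1 x + ... + c_N x^N.
V = np.vander(nodes, N + 1, increasing=True)

# Rows of A evaluate p'(x) = sum_{j>=1} j c_j x^{j-1} on a fine grid,
# so ||A c|| is a discrete 2-norm of the first derivative.
x_grid = np.linspace(-1.0, 1.0, 401)
A = np.hstack([np.zeros((x_grid.size, 1)),
               np.vander(x_grid, N, increasing=True) * np.arange(1, N + 1)])

# KKT system for: minimize ||A c||^2 subject to V c = y.
K = np.block([[A.T @ A, V.T],
              [V, np.zeros((n + 1, n + 1))]])
rhs = np.concatenate([np.zeros(N + 1), y])
c = np.linalg.lstsq(K, rhs, rcond=None)[0][:N + 1]

x_fine = np.linspace(-1.0, 1.0, 2001)
err = np.max(np.abs(np.polynomial.polynomial.polyval(x_fine, c) - runge(x_fine)))
print(f"degree-{N} minimal-derivative interpolant: max error = {err:.2e}")
```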

### Least squares fitting

Another method is fitting a polynomial of lower degree using the method of least squares. Generally, when using *m* equidistant points, if *n* < 2√*m* then the least-squares approximation *P*_{n}(*x*) is well-conditioned.^{[4]}
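
A minimal sketch (assuming NumPy; the sample count is an arbitrary choice) that keeps the degree below 2√*m*:

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

m = 100                                        # number of equispaced samples
x_s = np.linspace(-1.0, 1.0, m)
n = int(2 * np.sqrt(m)) - 1                    # degree 19 here, below 2 * sqrt(m) = 20
c = np.polynomial.polynomial.polyfit(x_s, runge(x_s), n)

x_fine = np.linspace(-1.0, 1.0, 2001)
err = np.max(np.abs(np.polynomial.polynomial.polyval(x_fine, c) - runge(x_fine)))
print(f"degree-{n} least-squares fit to {m} points: max error = {err:.2e}")
```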

### Bernstein polynomials

Using Bernstein polynomials, one can uniformly approximate every continuous function on a closed interval, although this method is rather computationally expensive.
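
A self-contained sketch (the `bernstein` helper below is our own, not a library routine) evaluating the Bernstein approximant $B_n(g)(t) = \sum_{k=0}^{n} g(k/n) \binom{n}{k} t^k (1-t)^{n-k}$ of the Runge function transplanted to [0, 1]:

```python
import numpy as np
from scipy.special import comb                  # floating-point binomial coefficients

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

def bernstein(g, n, t):
    """Evaluate the degree-n Bernstein approximant of g: [0, 1] -> R at points t."""
    k = np.arange(n + 1)
    basis = comb(n, k) * t[:, None]**k * (1.0 - t[:, None])**(n - k)
    return basis @ g(k / n)

g = lambda t: runge(2.0 * t - 1.0)              # map [0, 1] onto [-1, 1]
t_fine = np.linspace(0.0, 1.0, 2001)
for n in (10, 100, 1000):
    err = np.max(np.abs(bernstein(g, n, t_fine) - g(t_fine)))
    print(f"n = {n:4d}: max error = {err:.2e}")
```

Convergence is uniform but only at rate roughly *O*(1/*n*) for smooth functions, which is why high accuracy requires very large *n*.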

## Related statements from the approximation theory

For every predefined table of interpolation nodes there is a continuous function for which the sequence of interpolation polynomials on those nodes diverges.^{[5]} For every continuous function there is a table of nodes on which the interpolation process converges.^{[citation needed]} Chebyshev interpolation (i.e., on Chebyshev nodes) converges uniformly for every absolutely continuous function.

## See also

- Compare with the Gibbs phenomenon for sinusoidal basis functions
- Taylor series
- Chebyshev nodes
- Stone–Weierstrass theorem

## References

- ↑ Runge, Carl (1901), "Über empirische Funktionen und die Interpolation zwischen äquidistanten Ordinaten", *Zeitschrift für Mathematik und Physik*, **46**: 224–243; available at www.archive.org
- ↑ {{#invoke:citation/CS1|citation |CitationClass=book }}
- ↑ {{#invoke:citation/CS1|citation |CitationClass=citation }}
- ↑ {{#invoke:citation/CS1|citation |CitationClass=citation }}
- ↑ {{#invoke:citation/CS1|citation |CitationClass=citation }}