In the mathematical field of numerical analysis, Runge's phenomenon is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equispaced interpolation points. It was discovered by Carl David Tolmé Runge (1901) when exploring the behavior of errors when using polynomial interpolation to approximate certain functions. The discovery was important because it shows that going to higher degrees does not always improve accuracy. The phenomenon is similar to the Gibbs phenomenon in Fourier series approximations.
The Weierstrass approximation theorem states that for every continuous function f(x) defined on an interval [a,b], there exists a sequence of polynomial functions Pn(x) for n = 0, 1, 2, …, each of degree at most n, that approximates f(x) with uniform convergence over [a,b] as n tends to infinity, that is,

$$\lim_{n \to \infty} \left( \max_{a \le x \le b} \left| f(x) - P_n(x) \right| \right) = 0.$$
Consider the case where one desires to interpolate through n+1 equispaced points of a function f(x) using the polynomial Pn(x) of degree n that passes through those points. Naturally, one might expect from Weierstrass' theorem that using more points would lead to a more accurate reconstruction of f(x). However, this particular sequence of polynomials Pn(x) is not guaranteed to have the property of uniform convergence; the theorem only states that a sequence of polynomial functions exists, without providing a general method of finding one.
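A minimal sketch of this construction, assuming Python with NumPy and SciPy (the article does not prescribe any tooling); the helper name equispaced_interpolant and the sample function are illustrative only:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def equispaced_interpolant(f, a, b, n):
    """Degree-n polynomial interpolant of f through n + 1 equispaced
    nodes on [a, b], built in barycentric form for numerical stability."""
    nodes = np.linspace(a, b, n + 1)          # n + 1 equispaced points
    return BarycentricInterpolator(nodes, f(nodes))

# Illustrative use with a placeholder function; any continuous f works here.
f = np.cos
P10 = equispaced_interpolant(f, -1.0, 1.0, 10)

nodes = np.linspace(-1.0, 1.0, 11)
assert np.allclose(P10(nodes), f(nodes))      # interpolation is exact at the nodes
```

The barycentric form is chosen here only because it evaluates the same interpolating polynomial in a numerically stable way; any exact method of constructing Pn(x) through the n+1 points yields the same polynomial.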
The Pn(x) produced in this manner may in fact diverge from f(x) as n increases; this divergence typically occurs in an oscillating pattern whose amplitude grows near the ends of the interpolation interval. This phenomenon is attributed to Runge.
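The divergence can be observed numerically. The following sketch again assumes Python with NumPy and SciPy and uses the classic example f(x) = 1/(1 + 25x²) (the Runge function introduced below): it interpolates f at n + 1 equispaced points on [−1, 1] and measures the maximum error on a dense grid.

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    # Classic example function 1 / (1 + 25 x^2) on [-1, 1]
    return 1.0 / (1.0 + 25.0 * x**2)

x_fine = np.linspace(-1.0, 1.0, 2001)       # dense grid for measuring the error

for n in (5, 10, 15, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)   # n + 1 equispaced interpolation points
    P_n = BarycentricInterpolator(nodes, runge(nodes))
    max_err = np.max(np.abs(runge(x_fine) - P_n(x_fine)))
    print(f"degree n = {n:2d}   max |f - P_n| on [-1, 1] = {max_err:.3g}")

# The largest errors occur near x = +/- 1, and the maximum error grows with n
# instead of shrinking, which is the divergence described above.
```

The growth of the error is a property of the interpolation problem at equispaced nodes, not of the evaluation method used in the sketch.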
Consider the Runge function