In mathematics, the Gibbs phenomenon, discovered by Henry Wilbraham (1848) and rediscovered by J. Willard Gibbs (1899), is the peculiar manner in which the Fourier series of a piecewise continuously differentiable periodic function behaves at a jump discontinuity. The nth partial sum of the Fourier series has large oscillations near the jump, which may increase the maximum of the partial sum above that of the function itself. The overshoot does not die out as n increases, but approaches a finite limit of roughly 9% of the size of the jump. This sort of behavior was also observed by experimental physicists, but was believed to be due to imperfections in the measuring apparatus.
This is one cause of ringing artifacts in signal processing.
The Gibbs phenomenon involves both the fact that Fourier sums overshoot at a jump discontinuity and the fact that this overshoot does not die out as more terms are added to the sum.
The three pictures on the right demonstrate the phenomenon for a square wave (of height $\pi/4$) whose Fourier expansion is

$$\sin x + \frac{1}{3}\sin 3x + \frac{1}{5}\sin 5x + \cdots$$

More precisely, this is the function $f$ which equals $\pi/4$ between $2n\pi$ and $(2n+1)\pi$, and $-\pi/4$ between $(2n+1)\pi$ and $(2n+2)\pi$, for every integer $n$; thus this square wave has a jump discontinuity of height $\pi/2$ at every integer multiple of $\pi$.
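A short numerical sketch (in Python with NumPy; not part of the original article) can illustrate both facts for this square wave: the maximum of the partial sum just to the right of the jump stays above $\pi/4$ no matter how many terms are taken, and it converges to $\tfrac{1}{2}\int_0^\pi \frac{\sin t}{t}\,dt \approx 0.92597$, an overshoot of roughly 9% of the jump height $\pi/2$. The function name and grid parameters below are illustrative choices, not from the source.

```python
import numpy as np

def square_partial_sum(x, n_terms):
    """Partial sum sin(x) + sin(3x)/3 + sin(5x)/5 + ... with n_terms odd harmonics."""
    k = np.arange(1, 2 * n_terms, 2)           # odd harmonics 1, 3, ..., 2*n_terms - 1
    return np.sin(np.outer(x, k)) @ (1.0 / k)  # sum over k of sin(k x) / k, for each x

# Known limit of the peak: (1/2) * Si(pi), approximated here by a Riemann sum.
t = np.linspace(1e-9, np.pi, 200_001)
peak_limit = 0.5 * np.sum(np.sin(t) / t) * (t[1] - t[0])   # ~0.92597

jump = np.pi / 2  # height of the jump discontinuity at x = 0
for n in (8, 64, 512):
    x = np.linspace(1e-5, 0.5, 20_000)         # search just to the right of the jump
    peak = square_partial_sum(x, n).max()
    print(f"n={n:4d}  peak={peak:.5f}  overshoot={(peak - np.pi / 4) / jump:.2%}")

print(f"limit  peak={peak_limit:.5f}  overshoot={(peak_limit - np.pi / 4) / jump:.2%}")
```

Running this should show the peak of each partial sum hovering near 0.9259, i.e. an overshoot of roughly 9% of the jump, rather than decaying as n grows.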