In mathematics, a real number is a value that represents a quantity along a continuous line. The adjective real in this context was introduced in the 17th century by René Descartes, who distinguished between real and imaginary roots of polynomials.
The real numbers include all the rational numbers, such as the integer −5 and the fraction 4/3, and all the irrational numbers, such as √2 (1.41421356..., the square root of 2, an irrational algebraic number). Included within the irrationals are the transcendental numbers, such as π (3.14159265...). Real numbers can be thought of as points on an infinitely long line called the number line or real line, where the points corresponding to integers are equally spaced. Any real number can be determined by a possibly infinite decimal representation, such as that of 8.632, where each consecutive digit is measured in units one tenth the size of the previous one. The real line can be thought of as a part of the complex plane, and, correspondingly, the real numbers can be thought of as a part of the complex numbers.
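For illustration, a finite decimal such as 8.632 unpacks into digits weighted by successive powers of ten, and an infinite decimal expansion of a non-negative real number is read the same way, as the limit of its truncations (the digits d_0, d_1, d_2, … in the second formula are notation introduced only for this sketch, with d_0 the integer part and each later d_k a digit from 0 to 9):

    8.632 = 8\cdot 10^{0} + 6\cdot 10^{-1} + 3\cdot 10^{-2} + 2\cdot 10^{-3},
    \qquad
    d_0.d_1 d_2 d_3\ldots = \lim_{n\to\infty} \sum_{k=0}^{n} d_k \cdot 10^{-k}.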
These descriptions of the real numbers are not sufficiently rigorous by the modern standards of pure mathematics. The discovery of a suitably rigorous definition of the real numbers – indeed, the realization that a better definition was needed – was one of the most important developments of 19th-century mathematics. The current standard axiomatic definition is that the real numbers form the unique complete totally ordered field (ℝ, +, ·, <), up to isomorphism, whereas popular constructive definitions of the real numbers describe them as equivalence classes of Cauchy sequences of rational numbers, as Dedekind cuts, or as infinite decimal representations, together with precise interpretations for the arithmetic operations and the order relation. All these definitions satisfy the axiomatic definition and are thus equivalent.
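As a sketch of how these definitions fit together, "complete" in the axiomatic definition is taken in the sense of Dedekind completeness, i.e. the least-upper-bound property, which is what makes the ordered field unique up to isomorphism; and the Cauchy-sequence construction identifies two sequences of rationals exactly when their difference tends to zero (the symbols S, (p_n), (q_n), and ∼ below are notation introduced only for this sketch):

    \forall S \subseteq \mathbb{R}:\ \bigl(S \neq \varnothing \text{ and } S \text{ is bounded above}\bigr) \Longrightarrow \sup S \text{ exists in } \mathbb{R},
    \qquad
    (p_n) \sim (q_n) \iff \lim_{n\to\infty} (p_n - q_n) = 0,

where (p_n) and (q_n) range over Cauchy sequences of rational numbers; a real number is then an equivalence class under ∼, with addition and multiplication defined term by term.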