The history of logarithms is the story of a correspondence (in modern terms, a group isomorphism) between multiplication on the positive real numbers and addition on the real number line that was formalized in seventeenth-century Europe and was widely used to simplify calculation until the advent of the digital computer. John Napier published the first logarithms, since known as Napierian logarithms, in 1614. Henry Briggs soon introduced common (base 10) logarithms, which were easier to use. Tables of logarithms were published in many forms over four centuries. The idea of logarithms was also used to construct the slide rule, which remained ubiquitous in science and engineering until the 1970s. A breakthrough that yielded the natural logarithm came from the search for an expression of the area under a rectangular hyperbola, and it required the assimilation of a new function into standard mathematics.
The Babylonians, sometime in 2000–1600 BC, may have invented the quarter-square multiplication algorithm to multiply two numbers using only addition, subtraction and a table of quarter squares, exploiting the identity ab = ((a + b)² − (a − b)²)/4. Thus, such a table served a similar purpose to tables of logarithms, which also allow multiplication to be calculated using addition and table lookups. However, the quarter-square method could not be used for division without an additional table of reciprocals (or knowledge of a sufficiently simple algorithm to generate reciprocals). Large tables of quarter squares were used to simplify the accurate multiplication of large numbers from 1817 onwards, until this was superseded by the use of computers.
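To make the mechanics concrete, here is a minimal Python sketch of the quarter-square method; the table size, function name, and integer range are illustrative assumptions, not historical detail. Because a + b and a − b always have the same parity, the two floor operations in the table entries discard the same fractional part, so the lookups recover the product exactly.

```python
# Quarter-square multiplication: a table of floor(n^2 / 4) turns a product
# into one addition, one subtraction, and two table lookups.
TABLE_MAX = 2000  # illustrative: covers sums of factors up to 1000 each

quarter_squares = [n * n // 4 for n in range(TABLE_MAX + 1)]

def multiply(a: int, b: int) -> int:
    """Multiply non-negative integers using only addition, subtraction,
    and two lookups in the quarter-square table."""
    if a < b:
        a, b = b, a  # make a - b non-negative for the table lookup
    return quarter_squares[a + b] - quarter_squares[a - b]

print(multiply(123, 456))  # 56088
```

As the paragraph above notes, extending this to division would require a separate table of reciprocals, since a/b would have to be computed as a × (1/b).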
The Indian mathematician Virasena worked with the concept of ardhaccheda: the number of times a number of the form 2^n could be halved. For exact powers of 2, this equals the binary logarithm, but it differs from the logarithm for other numbers. He described a product formula for this concept and also introduced analogous concepts for base 3 (trakacheda) and base 4 (caturthacheda).
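One hedged, modern reading of ardhaccheda (an interpretive sketch, not Virasena's own formulation; the function name ccheda and the base parameter are invented for illustration) is to count how many times a number divides exactly by 2. On exact powers of 2 this agrees with the binary logarithm, it differs elsewhere, and it is additive over products, which is the shape a product formula for such a concept would take.

```python
# Interpretive sketch: model ardhaccheda as the number of exact divisions
# by 2; trakacheda and caturthacheda just change the divisor to 3 or 4.
def ccheda(n: int, base: int = 2) -> int:
    """Count how many times n is exactly divisible by base (n >= 1)."""
    count = 0
    while n % base == 0:
        n //= base
        count += 1
    return count

assert ccheda(32) == 5                           # 2^5: matches the binary logarithm
assert ccheda(48) == 4                           # 48 = 2^4 * 3: differs from log2(48) ~ 5.58
assert ccheda(8 * 12) == ccheda(8) + ccheda(12)  # additive over products
assert ccheda(81, base=3) == 4                   # trakacheda of 3^4
```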