Lambda calculus (also written as λ-calculus) is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. It is a universal model of computation that can be used to simulate any single-taped Turing machine, and was first introduced by the mathematician Alonzo Church in the 1930s as part of his research into the foundations of mathematics.
Lambda calculus consists of constructing lambda terms and performing reduction operations on them. In the simplest form of lambda calculus, terms are built using only the following rules:

- x: a variable, a character or string representing a parameter;
- (λx.M): an abstraction, a function definition binding the variable x in the body M;
- (M N): an application, applying a function M to an argument N, where M and N are lambda terms;
producing expressions such as: (λx.λy.(λz.(λx.z x) (λy.z y)) (x y)). Parentheses can be dropped if the expression is unambiguous. For some applications, terms for logical and mathematical constants and operations may be included.
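To make the grammar concrete, here is a minimal sketch of the three term-formation rules as a Haskell data type; the constructor names (Var, Abs, App) and the K-combinator example are illustrative choices, not part of the calculus itself.

```haskell
-- The three term-formation rules as an algebraic data type.
data Term
  = Var String        -- x       : a variable
  | Abs String Term   -- (λx.M)  : abstraction, binding x in the body M
  | App Term Term     -- (M N)   : application of M to N
  deriving (Show, Eq)

-- Conventions: application associates to the left, so "M N P" means
-- ((M N) P), and an abstraction body extends as far right as possible,
-- so "λx.λy.x" means (λx.(λy.x)):
k :: Term
k = Abs "x" (Abs "y" (Var "x"))   -- the K combinator, λx.λy.x
```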
The reduction operations include:

- α-conversion: (λx.M[x]) → (λy.M[y]), renaming the bound variable in an expression, used to avoid name collisions;
- β-reduction: ((λx.M) N) → (M[x := N]), replacing the bound variable with the argument expression in the body of the abstraction.
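As a sketch of how these operations work, assuming the Term type above, the following implements capture-avoiding substitution M[x := N] together with a single root-level β-step; the helper names (freeVars, subst, freshName, betaStep) are illustrative, and the renaming performed inside subst is exactly an α-conversion.

```haskell
import Data.List (union, delete)

-- Free variables of a term; an abstraction removes its bound name.
freeVars :: Term -> [String]
freeVars (Var x)   = [x]
freeVars (Abs x m) = delete x (freeVars m)
freeVars (App m n) = freeVars m `union` freeVars n

-- Capture-avoiding substitution M[x := N]: when the bound variable of
-- an abstraction would capture a free variable of N, α-convert it first.
subst :: String -> Term -> Term -> Term
subst x n (Var y)
  | y == x    = n
  | otherwise = Var y
subst x n (App p q) = App (subst x n p) (subst x n q)
subst x n (Abs y m)
  | y == x              = Abs y m          -- x is shadowed; nothing to do
  | y `elem` freeVars n =
      let y' = freshName y (x : freeVars m `union` freeVars n)
          m' = subst y (Var y') m          -- α-convert y to y'
      in  Abs y' (subst x n m')
  | otherwise           = Abs y (subst x n m)

-- Pick a name not already in use by priming.
freshName :: String -> [String] -> String
freshName y used
  | y `elem` used = freshName (y ++ "'") used
  | otherwise     = y

-- One β-step at the root: ((λx.M) N) → M[x := N].
betaStep :: Term -> Maybe Term
betaStep (App (Abs x m) n) = Just (subst x n m)
betaStep _                 = Nothing
```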
If repeated application of the reduction steps eventually terminates, then by the Church–Rosser theorem it will produce a β-normal form.
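Continuing the sketch above, a normal-order (leftmost-outermost) evaluator repeats β-steps until none remain; whenever this terminates, the Church–Rosser theorem guarantees the result is the term's unique β-normal form. The function names here are again illustrative.

```haskell
-- One leftmost-outermost reduction step, or Nothing if the term is
-- already in β-normal form.
step :: Term -> Maybe Term
step (App (Abs x m) n) = Just (subst x n m)      -- contract the redex
step (App m n)         = case step m of
                           Just m' -> Just (App m' n)
                           Nothing -> App m <$> step n
step (Abs x m)         = Abs x <$> step m
step (Var _)           = Nothing

-- Iterate to normal form; loops forever on terms with no normal form.
normalize :: Term -> Term
normalize t = maybe t normalize (step t)
```

For example, normalize (App k (Var "a")) contracts λx.λy.x applied to a and returns Abs "y" (Var "a"), i.e. λy.a.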
Lambda calculus is Turing complete; that is, it can simulate any single-taped Turing machine. Its namesake, the Greek letter lambda (λ), is used in lambda expressions and lambda terms to denote binding a variable in a function.
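As one illustration of computing with nothing but abstraction and application, the Church numerals encode a number n as the function that applies its first argument n times. Here they are written directly with Haskell lambdas; this is a sketch, and the type signatures belong to Haskell, not to the untyped calculus.

```haskell
-- A Church numeral applies f to x a fixed number of times.
type Church a = (a -> a) -> a -> a

zero, one, two :: Church a
zero = \f x -> x
one  = \f x -> f x
two  = \f x -> f (f x)

-- Successor and addition, defined using only abstraction and application.
suc :: Church a -> Church a
suc n = \f x -> f (n f x)

plus :: Church a -> Church a -> Church a
plus m n = \f x -> m f (n f x)

-- Convert to an ordinary Int for inspection, e.g. toInt (plus two two) == 4.
toInt :: Church Int -> Int
toInt n = n (+ 1) 0
```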
Lambda calculus may be untyped or typed. In typed lambda calculus, functions can be applied only if they are capable of accepting the given input's "type" of data. Typed lambda calculi are weaker than the untyped lambda calculus, which is the primary subject of this article, in the sense that they can express less than the untyped calculus can. On the other hand, typed lambda calculi allow more things to be proved. In the simply typed lambda calculus, for example, it is a theorem that every evaluation strategy terminates for every simply typed lambda term, whereas evaluation of untyped lambda terms need not terminate. One reason there are many different typed lambda calculi has been the desire to do more (of what the untyped calculus can do) without giving up on being able to prove strong theorems about the calculus.
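The termination contrast can be made concrete. In the untyped calculus, the self-application term Ω = (λx.x x)(λx.x x) β-reduces to itself, so it has no normal form; in a typed host language, the corresponding self-application is rejected by the type checker before it can ever run. A sketch using the Term type from above:

```haskell
-- Ω = (λx.x x)(λx.x x): each β-step reproduces the term itself,
-- so normalization never terminates.
omega :: Term
omega = App selfApp selfApp
  where selfApp = Abs "x" (App (Var "x") (Var "x"))

-- normalize omega   -- diverges: Ω → Ω → Ω → ...

-- By contrast, the analogous typed self-application is not well typed
-- in a simply typed setting; Haskell rejects it with an occurs-check
-- ("infinite type") error:
-- bad = \x -> x x   -- type error: cannot construct the infinite type a ~ a -> b
```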