Lambda calculus (also written as λ-calculus) is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. It is a universal model of computation that can be used to simulate any single-taped Turing machine and was first introduced by mathematician Alonzo Church in the 1930s as part of an investigation into the foundations of mathematics.
Lambda calculus consists of constructing lambda terms and performing reduction operations on them. In the simplest form of lambda calculus, terms are built using only the following rules:

- x: a variable, a character or string representing a parameter;
- (λx.M): a lambda abstraction, a function definition that binds the variable x in the body M;
- (M N): an application, applying a function M to an argument N, where M and N are lambda terms;
producing expressions such as: (λx.λy.(λz.(λx.z x) (λy.z y)) (x y)). Parentheses can be dropped if the expression is unambiguous. For some applications, terms for logical and mathematical constants and operations may be included.
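The three formation rules map directly onto a small abstract syntax tree. The following Python sketch is only an illustration of that correspondence (the names Var, Abs, App and Term are choices made here, not part of the calculus itself); it builds the simpler term (λx.x) y.

```python
from dataclasses import dataclass
from typing import Union

# Each formation rule becomes one node type of the abstract syntax tree.
@dataclass(frozen=True)
class Var:          # x       — a variable
    name: str

@dataclass(frozen=True)
class Abs:          # (λx.M)  — an abstraction binding `name` in `body`
    name: str
    body: "Term"

@dataclass(frozen=True)
class App:          # (M N)   — an application of `func` to `arg`
    func: "Term"
    arg: "Term"

Term = Union[Var, Abs, App]

# The term (λx.x) y: the identity function applied to the free variable y.
example = App(Abs("x", Var("x")), Var("y"))
print(example)
```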
The reduction operations include:

- α-conversion: renaming the bound variables in an expression, used to avoid name collisions;
- β-reduction: substituting the argument expression for the bound variable in the body of an abstraction.
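As a rough analogy, Python's own lambda expressions exhibit both operations. This is only an informal sketch, since Python evaluates arguments rather than performing symbolic substitution on terms.

```python
# β-reduction: applying an abstraction substitutes the argument for the bound
# variable, e.g. (λx. x + 1) 2  →β  2 + 1.
inc = lambda x: x + 1
assert inc(2) == 3

# α-conversion: renaming the bound variable leaves the function unchanged,
# e.g. λx. x + 1 and λy. y + 1 describe the same function.
inc_renamed = lambda y: y + 1
assert inc_renamed(2) == inc(2)
```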
If repeated application of the reduction steps eventually terminates, then by the Church-Rosser theorem it will produce a beta normal form: a term containing no further reducible expressions, unique up to renaming of bound variables.
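To make the reduction-to-normal-form process concrete, here is a minimal normal-order evaluator for pure terms, sketched under the assumption of a compact tuple encoding (the tags 'var', 'abs', 'app' are ad-hoc choices for this example). It repeats β-reduction until no redex remains, and may of course loop forever on terms that have no normal form.

```python
import itertools

_fresh = itertools.count()

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'abs':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, name, value):
    """Capture-avoiding substitution t[name := value]."""
    tag = t[0]
    if tag == 'var':
        return value if t[1] == name else t
    if tag == 'app':
        return ('app', subst(t[1], name, value), subst(t[2], name, value))
    bound, body = t[1], t[2]
    if bound == name:
        return t                        # name is shadowed; nothing to substitute
    if bound in free_vars(value):       # rename to avoid capturing a free variable
        fresh = f"{bound}_{next(_fresh)}"
        body = subst(body, bound, ('var', fresh))
        bound = fresh
    return ('abs', bound, subst(body, name, value))

def reduce_step(t):
    """One leftmost-outermost β-reduction step, or None if t has no redex."""
    tag = t[0]
    if tag == 'app':
        if t[1][0] == 'abs':            # (λx. body) arg  →β  body[x := arg]
            return subst(t[1][2], t[1][1], t[2])
        for i in (1, 2):
            r = reduce_step(t[i])
            if r is not None:
                return ('app', r, t[2]) if i == 1 else ('app', t[1], r)
        return None
    if tag == 'abs':
        r = reduce_step(t[2])
        return ('abs', t[1], r) if r is not None else None
    return None

def normalize(t):
    """Repeat β-reduction until no redex remains (may not terminate)."""
    while (r := reduce_step(t)) is not None:
        t = r
    return t

# (λx.x) ((λy.y) z) normalizes to the free variable z.
term = ('app', ('abs', 'x', ('var', 'x')),
               ('app', ('abs', 'y', ('var', 'y')), ('var', 'z')))
assert normalize(term) == ('var', 'z')
```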
Lambda calculus is Turing complete: any single-taped Turing machine can be simulated within it. Its namesake, the Greek letter lambda (λ), is used in lambda expressions and lambda terms to denote binding a variable in a function.
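One standard piece of evidence for this expressive power is the Church encoding, in which data such as the natural numbers are represented purely as functions. The Python sketch below illustrates the idea; the helper to_int is only a decoding convenience introduced for the example.

```python
# Church numerals: a number n is the function that applies f to x exactly n times.
zero = lambda f: lambda x: x                       # λf.λx. x
succ = lambda n: lambda f: lambda x: f(n(f)(x))    # λn.λf.λx. f (n f x)
plus = lambda m: lambda n: m(succ)(n)              # λm.λn. m succ n

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
assert to_int(plus(three)(three)) == 6
```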
Lambda calculus may be typed or untyped. In typed lambda calculus, a function can be applied only to an argument whose "type" of data the function is able to accept.
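As an informal illustration of that discipline, the sketch below uses Python type hints together with an external static checker such as mypy as a stand-in for a typed λ-calculus; this is an analogy, not a definition of any particular typed system.

```python
from typing import Callable

# In a typed calculus, a function carries the type of argument it accepts.
IntToInt = Callable[[int], int]

inc: IntToInt = lambda n: n + 1

inc(2)          # well typed: a function of type int -> int applied to an int
# inc("two")    # ill typed: a static checker (e.g. mypy) rejects this application
```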