
Gödel numbering


In mathematical logic, a Gödel numbering is a function that assigns to each symbol and well-formed formula of some formal language a unique natural number, called its Gödel number. The concept was used by Kurt Gödel for the proof of his incompleteness theorems. (Gödel 1931)

A Gödel numbering can be interpreted as an encoding in which a number is assigned to each symbol of a mathematical notation, after which a sequence of natural numbers can then represent a sequence of symbols. These sequences of natural numbers can again be represented by single natural numbers, facilitating their manipulation in formal theories of arithmetic.

Since the publication of Gödel's paper in 1931, the term "Gödel numbering" or "Gödel code" has been used to refer to more general assignments of natural numbers to mathematical objects.

Gödel noted that each statement within a system can be represented by a natural number. The significance of this was that properties of a statement, such as its truth or falsehood, could be reduced to determining whether its Gödel number had certain properties. The numbers involved might be very long indeed (in terms of number of digits), but this is not a barrier; all that matters is showing that such numbers can be constructed.

In simple terms, we devise a method by which every formula or statement that can be formulated in our system gets a unique number, in such a way that we can mechanically convert back and forth between formulas and Gödel numbers. Clearly there are many ways this can be done. Given any statement, the number to which it is converted is known as its Gödel number. A simple example is the way in which English text is stored as a sequence of numbers in computers using ASCII or Unicode, as sketched below.
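For instance, a short string can be converted to a list of code points and back again. The following Python sketch is illustrative only; the particular string and code points are arbitrary examples and are not part of Gödel's construction.

    # Map each character of an example string to its ASCII/Unicode code point.
    text = "GODEL"
    codes = [ord(ch) for ch in text]            # [71, 79, 68, 69, 76]

    # The conversion is mechanical and reversible.
    recovered = "".join(chr(n) for n in codes)
    assert recovered == text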

Gödel used a system based on prime factorization. He first assigned a unique natural number to each basic symbol in the formal language of arithmetic with which he was dealing.
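A sequence of symbol numbers x1, x2, ..., xn making up a formula can then be encoded as the single number 2^x1 · 3^x2 · 5^x3 · ... · pn^xn, where pn is the n-th prime; by the uniqueness of prime factorization, the original sequence can be recovered from this number. The Python sketch below illustrates this scheme; the symbol-to-number assignment in the example is a hypothetical one chosen for illustration, not Gödel's original table.

    def primes():
        # Yield 2, 3, 5, 7, ... by trial division (adequate for small examples).
        found = []
        candidate = 2
        while True:
            if all(candidate % p for p in found):
                found.append(candidate)
                yield candidate
            candidate += 1

    def godel_number(symbol_codes):
        # Encode codes x1..xn (each >= 1) as 2**x1 * 3**x2 * ... * p_n**x_n.
        n = 1
        for p, code in zip(primes(), symbol_codes):
            n *= p ** code
        return n

    def decode(n):
        # Recover the codes by dividing out each successive prime.
        codes = []
        for p in primes():
            if n == 1:
                break
            exponent = 0
            while n % p == 0:
                n //= p
                exponent += 1
            codes.append(exponent)
        return codes

    # Hypothetical symbol assignment for illustration: '0' -> 1, 's' -> 2, '=' -> 3.
    codes = [1, 3, 2, 1]                # stands for the formula "0 = s0"
    g = godel_number(codes)             # 2**1 * 3**3 * 5**2 * 7**1 = 9450
    assert decode(g) == codes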

