Bounded-error probabilistic polynomial time


In computational complexity theory, BPP, which stands for bounded-error probabilistic polynomial time, is the class of decision problems solvable by a probabilistic Turing machine in polynomial time with an error probability of at most 1/3 for all instances. BPP is one of the largest practical classes of problems, meaning most problems of interest in BPP have efficient probabilistic algorithms that can be run quickly on real modern machines. BPP also contains P, the class of problems solvable in polynomial time with a deterministic machine, since a deterministic machine is a special case of a probabilistic machine.

Informally, a problem is in BPP if there is an algorithm for it that has the following properties:

- It is allowed to flip coins and make random decisions.
- It is guaranteed to run in polynomial time.
- On any given run, it has a probability of at most 1/3 of giving the wrong answer, whether the answer is YES or NO.
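A classic example of such an algorithm is randomized primality testing: tests such as Miller–Rabin were known to place primality in BPP long before the deterministic AKS algorithm showed it is in P. The following is a minimal Python sketch of the Miller–Rabin test (an illustration, not part of the definition above); its error is in fact one-sided, since primes are never rejected, which is stronger than BPP requires:

    import random

    def is_probable_prime(n, rounds=20):
        """Miller-Rabin test: accepts primes always; rejects a composite
        in each round with probability at least 3/4, so the overall error
        is at most (1/4)**rounds. Runs in time polynomial in the number
        of digits of n."""
        if n < 2:
            return False
        for small in (2, 3, 5, 7, 11, 13):
            if n % small == 0:
                return n == small
        # Write n - 1 as d * 2**s with d odd.
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)  # the random coin flips
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False  # a is a witness: n is definitely composite
        return True  # probably prime

With rounds = 20, a composite number is accepted with probability at most (1/4)^20, far below the 1/3 threshold in the definition.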

A language L is in BPP if and only if there exists a probabilistic Turing machine M, such that

- M runs for polynomial time on all inputs,
- for all x in L, M outputs 1 with probability greater than or equal to 2/3, and
- for all x not in L, M outputs 1 with probability less than or equal to 1/3.

Unlike the complexity class ZPP, the machine M is required to run for polynomial time on all inputs, regardless of the outcome of the random coin flips.

Alternatively, BPP can be defined using only deterministic Turing machines. A language L is in BPP if and only if there exists a polynomial p and a deterministic Turing machine M, such that

- M runs for polynomial time on all inputs,
- for all x in L, the fraction of strings y of length p(|x|) that satisfy M(x, y) = 1 is greater than or equal to 2/3, and
- for all x not in L, the fraction of strings y of length p(|x|) that satisfy M(x, y) = 1 is less than or equal to 1/3.

In this definition, the string y corresponds to the output of the random coin flips that the probabilistic Turing machine would have made. For some applications this definition is preferable since it does not mention probabilistic Turing machines.
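To illustrate the correspondence between the two definitions, the following Python sketch simulates the probabilistic machine by sampling random strings y (here the predicate M and the polynomial p are hypothetical placeholders supplied by the caller):

    import random

    def bpp_decide(M, x, p, samples=99):
        """Decide x using the deterministic-machine view of BPP.

        M is a deterministic predicate M(x, y) -> bool and p maps the
        input length to the length of the random string y (both are
        placeholders for illustration). Each sampled y plays the role
        of one sequence of coin flips; a majority vote over the samples
        tracks whether the true acceptance fraction is >= 2/3 or <= 1/3."""
        length = p(len(x))
        accepts = sum(
            M(x, ''.join(random.choice('01') for _ in range(length)))
            for _ in range(samples)
        )
        return 2 * accepts > samples  # strict majority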

In practice, an error probability of 1/3 might not be acceptable; however, the choice of 1/3 in the definition is arbitrary. It can be any constant between 0 and 1/2 (exclusive) and the set BPP will be unchanged. It does not even have to be constant: the same class of problems is defined by allowing error as high as 1/2 - n^(-c) on the one hand, or requiring error as small as 2^(-n^c) on the other hand, where c is any positive constant and n is the length of the input. The idea is that there is a probability of error, but if the algorithm is run many times, the chance that the majority of the runs are wrong drops off exponentially as a consequence of the Chernoff bound. This makes it possible to create a highly accurate algorithm by merely running the algorithm several times and taking a "majority vote" of the answers. For example, if one defined the class with the restriction that the algorithm can be wrong with probability at most 1/2^100, this would result in the same class of problems.
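As a worked example of this amplification, the standard Chernoff–Hoeffding bound says that the majority of k independent runs, each wrong with probability eps < 1/2, errs with probability at most exp(-2k(1/2 - eps)^2). A short Python sketch of the resulting repetition count:

    import math

    def runs_needed(eps, delta):
        """Repetitions k so that a majority vote over k independent runs,
        each wrong with probability eps < 1/2, errs with probability at
        most delta. Solves exp(-2 * k * (1/2 - eps)**2) <= delta for k."""
        gap = 0.5 - eps
        return math.ceil(math.log(1 / delta) / (2 * gap * gap))

    print(runs_needed(1/3, 2**-100))  # 1248: error 1/3 shrunk to 1/2**100

Shrinking the error from 1/3 all the way down to 1/2^100 thus costs only about 1,248 repetitions, which is why the class is insensitive to the exact error threshold.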

