Quantum probability


Quantum probability was developed in the 1980s as a noncommutative analog of the Kolmogorovian theory of probability. One of its aims is to clarify the mathematical foundations of quantum theory and its statistical interpretation.

A significant recent application to physics is the dynamical solution of the quantum measurement problem, by giving constructive models of quantum observation processes which resolve many famous paradoxes of quantum mechanics.

Some recent advances are based on quantum filtering and feedback control theory as applications of quantum stochastic calculus.

Orthodox quantum mechanics has two seemingly contradictory mathematical descriptions: deterministic unitary time evolution, governed by the Schrödinger equation, and stochastic (random) wavefunction collapse upon measurement.

Most physicists are not concerned with this apparent problem. Physical intuition usually provides the answer, and only in unphysical systems (e.g., Schrödinger's cat, an isolated atom) do paradoxes seem to occur.
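The two descriptions can be contrasted concretely. The following is a minimal numpy sketch (for a single qubit, with an example Hamiltonian chosen for illustration): unitary Schrödinger evolution is deterministic and reversible, while projective measurement applies the Born rule and collapses the state at random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Deterministic description: unitary evolution U = exp(-iHt).
# Example Hamiltonian: the Pauli-X matrix.
H = np.array([[0, 1], [1, 0]], dtype=complex)
t = np.pi / 4
eigvals, eigvecs = np.linalg.eigh(H)
U = eigvecs @ np.diag(np.exp(-1j * eigvals * t)) @ eigvecs.conj().T

psi = np.array([1, 0], dtype=complex)  # start in |0>
psi = U @ psi                          # reversible, deterministic update

# Stochastic description: measurement in the computational basis.
# Born-rule probabilities, then random collapse onto a basis state.
probs = np.abs(psi) ** 2
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2, dtype=complex)
collapsed[outcome] = 1.0               # irreversible, stochastic update

print(probs, outcome)
```

For this choice of H and t the two outcomes are equally likely; rerunning the measurement step with different seeds gives different collapsed states, while the unitary step always gives the same psi.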

Orthodox quantum mechanics can be reformulated in a quantum-probabilistic framework, where quantum filtering theory (see Bouten et al. for an introduction, or Belavkin's work from the 1970s) gives the natural description of the measurement process. This new framework encapsulates the standard postulates of quantum mechanics, and thus all of the science involved in the orthodox postulates.

In classical probability theory, information is summarized by the sigma-algebra F of events in a classical probability space (Ω, F, P). For example, F could be the σ-algebra σ(X) generated by a random variable X, which contains all the information on the values taken by X. We wish to describe quantum information in similar algebraic terms, in such a way as to capture the non-commutative features and the information made available in an experiment. The appropriate algebraic structure for observables, or more generally operators, is a *-algebra. A (unital) *-algebra is a complex vector space A of operators on a Hilbert space H that contains the identity operator and is closed under composition (a multiplication) and under taking adjoints (an involution *): A ∈ A implies A* ∈ A.
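In finite dimensions these objects are just matrices, which makes the structure easy to exhibit. Below is a small numpy sketch (the operators and state are chosen for the example): observables are self-adjoint matrices, the *-operation is the conjugate transpose, a state is a density matrix ρ, and expectations are computed as tr(ρA). The nonvanishing commutator is the noncommutative feature that distinguishes the quantum case from a classical σ-algebra of events.

```python
import numpy as np

# Two observables: the Pauli X and Z matrices on a 2-dimensional Hilbert space.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Self-adjointness: A* = A, where * is the conjugate transpose.
assert np.allclose(X, X.conj().T)
assert np.allclose(Z, Z.conj().T)

# Noncommutativity: XZ != ZX, the defining departure from classical probability.
commutator = X @ Z - Z @ X
print(np.allclose(commutator, 0))  # False

# A state: a density matrix (positive, trace one); expectation is tr(rho A).
rho = np.array([[0.75, 0], [0, 0.25]], dtype=complex)
expect_Z = np.trace(rho @ Z).real
print(expect_Z)
```

Replacing ρ by any diagonal density matrix and restricting to diagonal observables recovers ordinary (commutative) probability, which is the sense in which quantum probability generalizes the Kolmogorov setup.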

