In quantum computing, a qubit (/ˈkjuːbɪt/) or quantum bit (sometimes qbit) is a unit of quantum information—the quantum analogue of the classical bit. A qubit is a two-state quantum-mechanical system, such as the polarization of a single photon: here the two states are vertical polarization and horizontal polarization. In a classical system, a bit would have to be in one state or the other. However, quantum mechanics allows the qubit to be in a superposition of both states at the same time, a property that is fundamental to quantum computing.
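In Dirac notation, a general pure qubit state is conventionally written as a superposition of the two basis states $|0\rangle$ and $|1\rangle$:

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
$$

where α and β are complex probability amplitudes; a measurement in this basis yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.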
The concept of the qubit was unknowingly introduced by Stephen Wiesner in his proposal for quantum money, which he had tried to publish for over a decade before it finally appeared in 1983.
The coining of the term "qubit" is attributed to Benjamin Schumacher. In the acknowledgments of his 1995 paper, Schumacher states that the term was invented in jest, owing to its phonological resemblance to the cubit, an ancient unit of length, during a conversation with William Wootters. The paper describes a way of compressing states emitted by a quantum source of information so that they require fewer physical resources to store, a procedure now known as Schumacher compression.
The bit is the basic unit of information, used by computers to represent and process data. Regardless of its physical realization, a bit always has two possible states, typically thought of as 0 and 1 but, depending on the application, equally interpretable as true and false or any other dichotomous choice. An analogy is a light switch: its off position can be thought of as 0 and its on position as 1.
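To make the contrast with the qubit concrete, the following is a minimal illustrative sketch in Python (using only the standard library; the names here are hypothetical, not from any quantum library) that represents a classical bit as a definite 0/1 value and a qubit as a pair of complex amplitudes, reproducing the measurement probabilities described above:

```python
import math
import random

# A classical bit is always definitely 0 or 1.
classical_bit = 1

# A qubit is described by two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Here: an equal superposition.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

def measure(alpha: complex, beta: complex) -> int:
    """Simulate measuring the qubit in the standard basis:
    returns 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements of identically prepared qubits give
# roughly 50% zeros and 50% ones for this equal superposition.
results = [measure(alpha, beta) for _ in range(1000)]
print(sum(results) / len(results))  # ~0.5
```

This sketch only reproduces the single-qubit measurement statistics; it does not capture interference or entanglement, which require tracking the full complex state vector of a multi-qubit system.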