The bit is used to denote either (1) the basic and physical unit of information in computing and digital communications, or (2) a binary digit. A binary digit can have only one of two values, and may therefore be physically implemented with a two-state device. These values are most commonly represented as either 0 or 1. The term bit is a portmanteau of binary digit. In information theory, the bit may be used as a synonym of the shannon, a unit of information named after Claude Shannon.
The two values of a binary digit can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. The length of a binary number may be referred to as its bit-length.
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
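As a worked example (standard information-theory notation, not taken from the original text): the Shannon entropy of a binary random variable X that equals 1 with probability p is

\[ H(X) = -p \log_2 p - (1 - p)\log_2(1 - p), \]

which reaches its maximum of exactly one bit when p = 1/2; for a biased variable with, say, p = 0.9, the entropy is only about 0.47 bits.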
In quantum computing, a quantum bit or qubit is a quantum system that can exist in superposition of two classical (i.e., non-quantum) bit values.
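In standard Dirac notation (an assumption beyond the original text, given here only as a minimal sketch), such a superposition can be written as

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \]

where the complex amplitudes \alpha and \beta determine the probabilities of measuring the classical bit values 0 and 1.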