The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications. A binary digit can have only one of two values, and may be physically represented with a two-state device. These state values are most commonly represented as either a 0 or 1.
The two values of a binary digit can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. The length of a binary number may be referred to as its bit-length.
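For instance, Python exposes the bit-length concept directly: the built-in int.bit_length() method returns the number of binary digits needed to represent an integer, excluding the sign and leading zeros.

```python
# The bit length of an integer: the number of binary digits
# needed to represent it (sign and leading zeros excluded).
for n in (0, 1, 6, 255, 256):
    print(f"{n:>4} = {bin(n):>11} -> bit length {n.bit_length()}")
# e.g. 6 = 0b110 has a bit length of 3; 255 = 0b11111111 has 8.
```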
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
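A worked sketch of this definition: the Shannon entropy of a binary variable that is 1 with probability p, H(p) = −p log₂ p − (1 − p) log₂(1 − p), reaches its maximum of exactly one bit when the two values are equally likely (p = 1/2).

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin carries exactly one bit; a biased coin carries less.
for p in (0.5, 0.9, 0.99):
    print(f"p = {p}: H = {binary_entropy(p):.4f} bits")
```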
In quantum computing, a quantum bit or qubit is a quantum system that can exist in a superposition of two classical (i.e., non-quantum) bit values.
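As a simplified sketch (not a full treatment of quantum mechanics), a qubit's state can be modeled as two complex amplitudes α and β with |α|² + |β|² = 1; measurement yields the classical bit 0 with probability |α|² and 1 with probability |β|².

```python
import math
import random

# Simplified model: a qubit state |psi> = alpha|0> + beta|1>,
# normalized so that |alpha|^2 + |beta|^2 = 1.
alpha = beta = 1 / math.sqrt(2)  # equal superposition of 0 and 1

p0 = abs(alpha) ** 2  # probability of measuring the classical bit 0
outcome = 0 if random.random() < p0 else 1
print(f"P(0) = {p0:.2f}, P(1) = {1 - p0:.2f}, measured: {outcome}")
```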
The symbol for binary digit is either simply bit (recommended by the IEC 80000-13:2008 standard) or lowercase b (recommended by the IEEE 1541-2002 standard). A group of eight binary digits is commonly called one byte, but historically the size of the byte was not strictly defined.
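As an illustration of the eight-bit grouping (assuming the now-standard eight-bit byte), the following sketch packs eight binary digits into a single byte value; the input bits are a hypothetical example.

```python
bits = [0, 1, 0, 0, 0, 0, 0, 1]  # hypothetical example: eight binary digits
assert len(bits) == 8 and all(b in (0, 1) for b in bits)

byte = 0
for b in bits:
    byte = (byte << 1) | b  # shift left and append the next bit

print(byte, bin(byte), chr(byte))  # 65 0b1000001 'A'
```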