Term: bit

A bit is a single digit in the binary number system and the smallest unit of information that a digital computer can deal with. It's a single item that is either "on" (a numerical value of 1) or "off" (0).

Everything (and I do mean everything) that digital computers do and deal with comes down to collections of bits. Computer programs? Represented as bits. Photographs? Bits. Videos? Lots and lots of bits. This definition? A collection of bits stored on a computer somewhere.
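As a quick illustration (a minimal sketch using only standard Python, not part of the original entry), the snippet below shows the bit patterns behind an ordinary character and an ordinary integer:

```python
# Ordinary data is ultimately stored as bits: here, the character 'A' and the number 42.
text_bits = format(ord('A'), '08b')   # 'A' -> code point 65 -> '01000001'
number_bits = format(42, '08b')       # 42 -> '00101010'

print(text_bits)    # 01000001
print(number_bits)  # 00101010
```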

bit (Wikipedia)

The bit is the most basic unit of information in computing and digital communications. The name is a contraction of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either "1" or "0", but other representations such as true/false, yes/no, +/−, or on/off are commonly used.

The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. A bit may be physically implemented with any two-state device.

The symbol for the binary digit is either 'bit', per the IEC 80000-13:2008 standard, or the lowercase character 'b', as recommended by the IEEE 1541-2002 standard.

A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight binary digits is called one byte, but historically the size of the byte is not strictly defined. Frequently, half, full, double and quadruple words consist of a number of bytes which is a low power of two.
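The sketch below (standard Python, assuming the now-universal 8-bit byte) shows two bytes viewed together as a single 16-bit bit string:

```python
# Eight bits make one byte; larger "words" are typically a small power of two bytes.
data = bytes([0b01000001, 0b01000010])                 # two bytes: 'A' and 'B'
bit_string = ''.join(format(b, '08b') for b in data)   # concatenate the bits of each byte

print(bit_string)       # 0100000101000010
print(len(bit_string))  # 16 -> 16 bits = 2 bytes, e.g. one 16-bit half word
```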

In information theory, one bit is the information entropy of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. As a unit of information, the bit is also known as a shannon, named after Claude E. Shannon.
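For example (a minimal sketch in Python; the helper name binary_entropy is ours, not from the source), the entropy of a fair 0/1 variable works out to exactly one bit, while a biased variable carries less:

```python
import math

def binary_entropy(p):
    """Entropy in bits (shannons) of a variable that is 1 with probability p:
    H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0    -> exactly one bit of information
print(binary_entropy(0.9))  # ~0.469 -> a biased variable carries less than one bit
```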
