The bit, the smallest unit of data in computing, is a portmanteau of binary digit.
The term, which denotes a quantity with two possible values, 0 or 1, was introduced by the American mathematician John Tukey in 1947 while he was developing statistical methods for computers at Bell Labs.
Claude Shannon would note in 'A Mathematical Theory of Communication': "The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey."
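Shannon's remark can be made concrete: with base-2 logarithms, a choice among N equally likely outcomes carries log2(N) bits of information. A minimal sketch in Python (the function name is illustrative, not from the source):

```python
import math

def information_bits(num_outcomes: int) -> float:
    """Information, in bits, conveyed by selecting one of
    `num_outcomes` equally likely outcomes."""
    return math.log2(num_outcomes)

# A fair coin flip (2 outcomes) carries exactly one bit.
print(information_bits(2))  # 1.0

# Picking one of 8 equally likely symbols carries 3 bits.
print(information_bits(8))  # 3.0
```

This is why the bit is the natural unit when base 2 is chosen: one bit resolves exactly one binary (two-way) choice.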
Source: A Mathematical Theory of Communication