binary digit (portmanteau)
bit
English
“John Tukey needed to name the smallest possible unit of information in 1947—and he hated both 'binit' and 'bigit,' so he went with 'bit.'”
In 1947, John Tukey at Princeton was working on early computing machines. Information in these machines was stored as zeros and ones: binary. The smallest unit was a single zero or one. Tukey needed a word for it. His first attempt fused binary and digit into 'binit.'
Tukey hated 'binit.' It sounded ugly. He tried 'bigit.' Worse. In a memo dated July 9, 1947, he proposed the existing word 'bit,' which had meant a small piece of something for centuries. Bit, from Old English bita, 'a bite,' suddenly meant the fundamental unit of digital information.
Bell Telephone Laboratories adopted Tukey's term immediately; Claude Shannon put 'bit' in print the following year in 'A Mathematical Theory of Communication,' crediting Tukey for the coinage. By the 1950s, bit was the standard word across computer science. Eight bits made a byte. Kilobits per second measured modem speed. The entire language of digital information grew from Tukey's distaste for 'binit.'
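To make those units concrete, here is a minimal Python sketch. The message 'hello' and the 56 kbit/s modem rate are illustrative values, not figures from the story: each ASCII character occupies one byte of eight bits, and transfer time is simply bits divided by bits per second.

    message = "hello"
    data = message.encode("ascii")      # five characters -> five bytes
    bits = len(data) * 8                # eight bits per byte -> 40 bits
    for b in data:
        print(f"{b:08b}")               # each byte as its eight zeros and ones

    modem_bps = 56_000                  # a classic 56 kbit/s dial-up modem
    print(f"{bits / modem_bps * 1000:.2f} ms")   # ~0.71 ms to send "hello"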
Now bit is the foundation of all digital communication. Your voice is millions of bits transmitted as electrical pulses. Your DNA could be encoded in bits. The word is older than computers by centuries—it meant 'a small piece'—but Tukey reanimated it for the digital age.
Today
Every digital device you own is built on the word 'bit.' Your phone transmits billions of bits per second. Your cloud storage is measured in bytes, each a group of eight bits. Your genome can already be sequenced and stored as bits for analysis.
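As a rough sketch of that bits-versus-bytes split, storage sizes count bytes while transmission rates count bits. The 3 MB photo and the 1 gigabit-per-second link below are made-up examples:

    photo_bytes = 3_000_000             # a hypothetical 3 MB photo, sized in bytes
    photo_bits = photo_bytes * 8        # the same photo counted in bits
    link_bps = 1_000_000_000            # a hypothetical 1 gigabit-per-second link
    print(f"{photo_bits / link_bps * 1000:.0f} ms")   # 24 ms to transmit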
Tukey disliked 'binit.' So instead of 'binit,' we live in a world where 'a small piece,' a word that once meant a morsel of bread, became the smallest unit of meaning itself.