“A variable is something capable of changing — Latin variabilis meant changeable, liable to vary, and mathematics borrowed the word to name the unknown quantity that takes different values.”
Latin varius meant diverse, varied, changing — things that were not uniform, that shifted and differed. Variabilis was the adjectival form: capable of changing, subject to variation. The word described things that could not be fixed or predicted: variable weather, variable moods, variable quality. The root varius also gave English variety, various, vary, and variation.
Mathematics adopted variable in the 17th century for the quantity that could take different values in an equation or function. François Viète in the late 16th century introduced the practice of using letters to represent unknown quantities in algebra; René Descartes in 1637 established the convention of using x, y, z for unknowns and a, b, c for known quantities. These x, y, z were variables — literally, changeable quantities.
The concept of a variable was central to the development of calculus by Newton and Leibniz in the 1660s-1680s. Calculus studies how quantities change with respect to each other — how a variable changes as another variable changes. The rate of change of position (a variable) with respect to time (another variable) is velocity. The power of calculus is precisely that it works with changing quantities rather than fixed ones.
Computer programming adopted variable from mathematics: a variable is a named location in memory that can hold different values at different times. Programming variables can hold numbers, text, images, or any type of data. Every program that has ever run is a dance of variables: values changing, being compared, being stored and retrieved. The Latin changeable thing is now the fundamental unit of computation.
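The idea above can be sketched in a few lines. This is a minimal illustration (in Python, chosen arbitrarily since the article names no language) of a variable as one name whose value, and even whose type, changes over time:

```python
# A programming variable: one name bound to different values at different times.
x = 3           # the name x refers to the value 3
x = x + 4       # the same name now refers to 7
x = "seven"     # in a dynamically typed language, even the type can vary
print(x)        # prints "seven"
```

In statically typed languages the type stays fixed while the value varies; the name-to-changing-value idea is the same.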

Today
A variable in mathematics is a placeholder for what is not yet known. In computer programming it is a name for a value that might change. In everyday speech, a variable is anything unpredictable. All three uses share the Latin origin: something that can vary, whose value is not fixed.
The distinction between variables and constants is one of the fundamental structures of quantitative thinking. To model anything — weather, economics, disease spread — you must decide which quantities change (variables) and which you treat as fixed (constants). Getting that distinction right is most of the intellectual work.
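The variable/constant split described above can be made concrete. Below is a minimal sketch (in Python, with made-up numbers purely for illustration) of a toy growth model: the growth rate and duration are treated as constants, while the case count is the variable the model updates:

```python
# Constants: quantities the model treats as fixed.
GROWTH_RATE = 1.1   # hypothetical daily growth factor
DAYS = 5            # hypothetical simulation length

# Variable: the quantity that changes as the model runs.
cases = 100.0
for day in range(DAYS):
    cases = cases * GROWTH_RATE   # the variable is updated; the constants are not
print(cases)
```

Deciding that the growth rate is a constant is itself a modeling choice; promote it to a variable and you have a different, richer model.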