ἐν + τροπή

en + tropē

Greek

A German physicist in 1865 coined 'entropy' from Greek words for 'transformation' — to name the quantity that never decreases in an isolated system, the mathematical arrow that gives time its direction.

Entropy was coined in 1865 by the German physicist Rudolf Clausius from Greek ἐν (en, 'in') and τροπή (tropē, 'transformation, turning, change'), producing a word meaning roughly 'the content of transformation' or 'that which turns.' Clausius was deliberate in his choice of Greek roots: he wanted entropy to parallel 'energy' (from Greek ἐνέργεια, energeia, 'activity, operation'), itself built from en + ergon ('work') and already established in the physics of his day. Energy and entropy were to be companion concepts, both measured in thermodynamic systems, both expressed in Greek. Clausius wrote: 'I have intentionally formed the word entropy so as to be as similar as possible to the word energy; for the two magnitudes to be denoted by these words are so nearly allied in their physical meanings, that a certain similarity in designation appears to be desirable.' The naming was a deliberate symmetry.

The concept of entropy emerged from Clausius's work on the second law of thermodynamics, a law he formulated with increasing precision through the 1850s and 1860s. The first law of thermodynamics states that energy is conserved — it cannot be created or destroyed, only transformed. The second law adds a crucial asymmetry: heat flows spontaneously from hot to cold, not from cold to hot. Perpetual motion machines of the second kind — devices that would convert heat from a single reservoir entirely into work, with no other effect — are impossible. Clausius sought a mathematical quantity that captured this asymmetry. Entropy was his answer: he defined the change in a system's entropy as the heat it absorbs reversibly divided by its absolute temperature, so that a reversible thermodynamic process leaves total entropy constant, while any irreversible process (which is every real process) increases it. The entropy of an isolated system never decreases. This is the second law stated as a formula: ΔS ≥ 0.
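To see the inequality at work, here is a minimal numerical sketch (the reservoirs, temperatures, and heat quantity are illustrative assumptions, not figures from Clausius): heat leaving a hot reservoir lowers its entropy by Q/T_hot, while the same heat entering a cold reservoir raises its entropy by the larger amount Q/T_cold, so the total change is positive.

```python
# Minimal sketch: 1000 J of heat flowing from a 400 K reservoir to a 300 K one.
# Treating each reservoir's exchange as reversible, its entropy change is Q / T.
Q = 1000.0                      # joules of heat transferred (illustrative value)
T_hot, T_cold = 400.0, 300.0    # reservoir temperatures in kelvin (illustrative)

dS_hot = -Q / T_hot             # hot reservoir loses entropy: -2.50 J/K
dS_cold = Q / T_cold            # cold reservoir gains entropy: +3.33 J/K
dS_total = dS_hot + dS_cold     # +0.83 J/K

print(f"ΔS_total = {dS_total:.2f} J/K")  # positive, as the second law requires
```

Run the flow in reverse, from cold to hot, and the total comes out negative, which is exactly what the second law forbids.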

The statistical interpretation of entropy came from Ludwig Boltzmann in 1877, in one of the most profound single equations in the history of physics: S = k log W. In this formula, S is entropy, k is the constant now named after Boltzmann, and W (from German Wahrscheinlichkeit, 'probability') is the number of microscopic configurations of a system that correspond to its macroscopic state. Entropy, Boltzmann revealed, is a measure of disorder or probability: a high-entropy state is one that can be realized in many different microscopic arrangements; a low-entropy state is one that can be realized in only a few. The reason entropy increases is not a fundamental law of physics at the microscopic level (the underlying equations are time-symmetric) but a consequence of probability: high-entropy states are vastly more numerous than low-entropy states, so any random process is overwhelmingly likely to move toward higher entropy.
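Boltzmann's counting can be made concrete with a toy model not in the original text: N distinguishable particles, each sitting in the left or right half of a box, with W counted as the number of ways to place a given number of them on the left. The setup and the helper name boltzmann_entropy are illustrative assumptions.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin

def boltzmann_entropy(N, n_left):
    """S = k log W, where W counts the ways to put n_left of N
    distinguishable particles in the left half of a box."""
    W = comb(N, n_left)
    return k_B * log(W)

N = 100
print(boltzmann_entropy(N, N // 2))  # even split: W is enormous, entropy is maximal
print(boltzmann_entropy(N, 0))       # all particles on one side: W = 1, entropy is zero
```

The even split wins not because nature prefers it but because it can be realized in vastly more ways, which is the passage's point in miniature.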

Claude Shannon's 1948 paper 'A Mathematical Theory of Communication' extended entropy into information theory. Shannon needed a measure of the average information content of a message — how much uncertainty a message resolves. He defined information entropy using the same logarithmic formula as Boltzmann's thermodynamic entropy: H = −Σ p log p. The story goes that Shannon showed the formula to John von Neumann, who said: 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.' Whether the story is true or apocryphal, the name stuck, and thermodynamic entropy and information entropy have been in productive conversation ever since.
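Shannon's formula is short enough to sketch directly. The snippet below assumes the base-2 logarithm, which measures entropy in bits (Shannon's own convention); the function name and example distributions are illustrative.

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over the outcomes, in bits; zero-probability
    outcomes contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.00 bit: a fair coin flip
print(shannon_entropy([0.9, 0.1]))    # ~0.47 bits: a biased coin resolves less uncertainty
print(shannon_entropy([0.25] * 4))    # 2.00 bits: four equally likely outcomes
```

The more evenly the probabilities are spread, the higher H climbs, which mirrors Boltzmann's W: uncertainty, like disorder, is a matter of how many possibilities remain open.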

Today

Entropy is unique among scientific concepts in having a colloquial meaning that is nearly accurate. When people say 'entropy' to mean disorder, decay, or the tendency of things to fall apart, they are not far wrong — Boltzmann showed that entropy is precisely the measure of microscopic disorder. A tidy room has low entropy (few microscopic arrangements of molecules lead to an ordered room); a messy room has high entropy (vastly more arrangements produce disorder than order). The second law says that isolated systems evolve toward higher entropy — not because some fundamental force pushes them toward disorder, but because disordered states are overwhelmingly more numerous. Disorder is not the enemy of order; it is simply more probable.
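The phrase 'overwhelmingly more numerous' can be put in numbers with the same toy left/right-box picture sketched above; the particle counts here are arbitrary illustrative choices.

```python
from math import comb

# Of all 2**N equally likely left/right arrangements of N particles, how many
# look "ordered" (all on the left) versus roughly even (an exact 50/50 split)?
for N in (10, 100, 1000):
    total = 2 ** N
    all_left = 1                    # only one arrangement is perfectly ordered
    even_split = comb(N, N // 2)    # arrangements with an exact even split
    print(f"N={N}: P(all left) = {all_left / total:.2e}, "
          f"P(exact 50/50) = {even_split / total:.2e}")
```

At N = 1000 the perfectly ordered arrangement has a probability of roughly 10^-301, while the exact even split alone accounts for about 2.5 percent of all arrangements; nothing pushes the system toward disorder except arithmetic.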

The deepest implication of entropy is that it gives time its direction. The laws of physics at the microscopic level are time-symmetric: a film of molecular collisions run backward obeys the same equations as the film run forward. Yet we experience time as having a definite direction — the past is fixed, the future is open; eggs break but do not unbreak; memories form of the past but not the future. The second law explains this asymmetry: the universe began in an extraordinarily low-entropy state, and entropy has been increasing ever since. The arrow of time is the arrow of increasing entropy. Every act of forgetting, every irreversible process, every death and dissolution, is entropy increasing in a universe that began improbably ordered and is becoming, slowly, maximally disordered. Clausius named the transformation. He did not know he was naming the reason we remember the past and not the future.
