Energy & Entropy

Waste energy is associated with all processes. This waste can be reduced, but it can never be eliminated. Anyone who says otherwise is trying to con you.

The Second Law of Thermodynamics is a statement of the painfully obvious; at least, it’s obvious to us now. The statements you are about to read were worked out over a period of many years, were refined and clarified by countless minds, and were gradually combined into one large idea that we now recognize as a comment on the fundamental behavior of the universe. Of all the relationships in physics given the title of “law”, the Second Law of Thermodynamics is the one law for which there appears to be no exception. Anyone who disagrees with it or says they have a method to avoid it is immediately and rightfully dismissed as an idiot who doesn’t understand the law, a crackpot pushing an agenda, or the perpetrator of some kind of hoax. When someone says they’ve found a way around the Second Law of Thermodynamics, chances are quite good they’re trying to separate you from your money. Arguing against the Second Law is hopeless. You can’t beat it. That’s how powerful this idea is.


Kelvin Statement: perfect engines can’t exist
William Thomson, Lord Kelvin (1824–1907) Ireland–Scotland

It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects since any restoration of this mechanical energy without more than an equivalent dissipation is impossible.

A Universal Tendency in Nature to the Dissipation of Mechanical Energy

It is impossible to produce work in the surroundings using a cyclic process connected to a single heat reservoir. (1851?)

It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects.

A transformation whose only final result is to convert heat, extracted from a source at constant temperature, into work, is impossible.

On the Dynamical Theory of Heat, with numerical results deduced from Mr Joule’s equivalent of a Thermal Unit, and M. Regnault’s Observations on Steam.

Clausius Statement: perfect refrigerators can’t exist
Rudolf Clausius (1822–1888) Germany

Heat cannot of itself pass from a colder to a hotter body.

It is impossible to carry out a cyclic process using an engine connected to two heat reservoirs that will have as its only effect the transfer of a quantity of heat from the low-temperature reservoir to the high-temperature reservoir. (1854?)

“Es existiert keine zyklisch arbeitende Maschine, deren einzige Wirkung Wärmetransport von einem kühleren zu einem wärmeren Reservoir ist.”

No cyclically operating machine exists whose only effect is the transport of heat from a cooler to a warmer reservoir. [translation]

It is impossible for a self-acting machine, unaided by any external agency, to convey heat from one body to another at a higher temperature. [kelvin’s translation]

Carnot Statement: no engine can be more efficient than a reversible engine
Sadi Carnot (1796–1832), Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance. (Reflections on the Motive Power of Fire and on Machines Fitted to Develop That Power.) 1824.

Everyone knows that heat can produce motion. That it possesses vast motive-power no one can doubt, in these days when the steam engine is everywhere known.

The question has often been raised whether the motive power of heat is unbounded, whether the possible improvements in steam engines have an assignable limit — a limit which the nature of things will not allow to be passed by any means whatever; or whether, on the contrary, these improvements may be carried on indefinitely.

According to established principles at the present time, we can compare with sufficient accuracy the motive power of heat to that of a waterfall. Each has a maximum that we cannot exceed, whatever may be, on the one hand, the machine which is acted upon by the water, and whatever, on the other hand, the substance acted upon by the heat. The motive power of a waterfall depends on its height and on the quantity of the liquid; the motive power of heat depends also on the quantity of caloric used, and on what may be termed, on what in fact we will call, the height of its fall, that is to say, the difference of temperature of the bodies between which the exchange of caloric is made. In the waterfall the motive power is exactly proportional to the difference of level between the higher and lower reservoirs. In the fall of caloric the motive power undoubtedly increases with the difference of temperature between the warm and the cold bodies….

The motive power of heat is independent of the agents employed to realize it; its quantity is fixed solely by the temperatures of the bodies between which is effected, finally, the transfer of the caloric [heat].

The difference between specific heat under constant pressure and specific heat under constant volume is the same for all gases.

The fall of caloric produces more motive power at inferior than at superior temperatures.

Equivalence of Kelvin, Clausius, and Carnot statements.

“Not all processes which are possible according to the law of the conservation of energy can be realized in nature.”

Entropy is

  • the degree to which energy is wasted (dispersed)
  • a measure of the unavailability of heat energy for work

Clausius coined entropie in 1865 from the Greek en (in, to contain) and trope (turning or transformation): “transformation-contents” (compare trophy: the enemy turning around; tropic: the sun turns around at the Tropics of Cancer and Capricorn). Energy, a measure of work, comes from en (in, to contain) and ergon (deed or work) (compare work, liturgy, organ, orgy). The universe has never been more ordered than it was at its inception. The second law gives us “the arrow of time”.

Clausius Says (1865): Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie (On various forms of the fundamental equations of the mechanical theory of heat convenient for application). Rudolf Clausius. Annalen der Physik und Chemie, Vol. 125 No. 7 (1865): 353–400.

  1. Die Energie der Welt ist konstant.
    The energy of the world is constant.
    The energy of the universe is constant.
  2. Die Entropie der Welt strebt einem Maximum zu.
    The entropy of the world strives to a maximum.
    The entropy of the universe is increasing.

The Heat Death.


Newton’s Laws of Motion are reversible in time; that is, they do not distinguish between future and past. Imagine a movie of a process. If one can distinguish whether the movie is run forward or backward, then one has identified a special direction of time and the process is called irreversible. If one cannot tell whether the movie is run forward or backward, then one has not identified a special direction of time and the process is said to be reversible. A movie showing the microscopic behavior of a small group of atoms would look the same if it were run forward or backward, but a movie showing the macroscopic behavior of a collection of molecules wouldn’t.

Galileo: We can take wood and see it go up in fire and light, but we do not see them recombine to form wood; we see fruits and flowers and a thousand other solid bodies dissolve largely into odors, but we do not observe these fragrant atoms coming together to form fragrant solids.

In the age of the computer, it’s quite easy to imagine writing a program that could keep track of a small number of particles in some kind of container (small, at least, when compared to the unimaginably large number of molecules in a mole); a program that could follow the particles as they careened around inside the container and collided with one another, obeying the conservation of momentum and energy at all times. Although it would be exceptionally tedious for a human, computers don’t complain when given these kinds of tasks (not yet, anyway). Before the program started, we could assign our particles any values of speed and direction that might amuse us. After the program had been running for a while, we could halt it, save the positions and speeds of all the particles, and reverse their directions. After as much time elapsed as we let the program run originally, our dumb particles would find themselves returned to their original positions (the ones we arbitrarily chose when we started the simulation) but heading in the opposite directions. On the microscopic scale, the laws of classical mechanics are reversible. That is, there is no preferred direction of time. A movie of a few molecules looks the same run forward as it does run backward.
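A minimal sketch of such a program, with the collisions omitted for brevity: free-flying particles in a periodic box, assigned arbitrary starting positions and velocities. Reverse every velocity, run for the same amount of time, and the particles retrace their paths back to where they began.

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps, dt = 5, 1000, 0.01
x = rng.uniform(0, 1, n)    # arbitrary starting positions in a unit box
v = rng.uniform(-1, 1, n)   # arbitrary starting velocities
x0 = x.copy()

# run "forward": free flight with periodic boundaries
for _ in range(steps):
    x = (x + v * dt) % 1.0

# reverse every velocity and run for the same amount of time
v = -v
for _ in range(steps):
    x = (x + v * dt) % 1.0

print(np.allclose(x, x0, atol=1e-9))  # prints True: the motion retraces itself
```

Nothing in the equations of motion prefers one time direction over the other; the reversed run is just as lawful as the forward run.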

Now imagine a whole lot of molecules arranged to form a coffee cup filled with hot coffee. Imagine again that another collection of molecules arranged to form a hand happen to have been given a velocity that sends them on a collision course with the coffee cup. The two molecular assemblies meet. The force of the collision is great enough to propel the cup toward the edge of the table on which it’s resting, but not great enough to cause either the hand or the cup to disintegrate. The molecules in the hand, the cup, and the table are given a bit of a jostle; the bonds between the cup and the table are insufficient to stop its progress; and the cup slides off the edge. After interacting briefly with the air molecules on the way down, the molecules of the cup strike the molecules of the floor and the cup loses its coherence. Large sections of the cup separate from other large sections, collectively obeying the conservation of momentum in the process. The molecules of the coffee, which aren’t held together as strongly, fly off in a more diffuse manner. All these dispersing molecules are eventually stopped by the molecules of the floor. The cup has broken and the coffee has spilled. On the microscopic level, all the molecules of this system have obeyed the laws of classical mechanics, which are reversible.

Now imagine a whole lot of molecules arranged to form the shards of a cup coated with a film of molecules that together constitute coffee resting on the molecules of the floor. Imagine again that the molecules of the floor, which are constantly in motion, suddenly conspire to move in a manner that thrusts the shards and the liquid together (but leaves the dirt and grime behind), that these molecules fall together in just the right manner so that the liquid is wrapped by the solid shards, and that the shards coalesce into a shape capable of containing the moving liquid. The molecules of the floor give one last push together as a unit and propel the cup and coffee up off the floor at just the right trajectory so that the whole mess arrives at the level of the table, where the molecules of a hand meet them moving in the same direction and, like a baseball catcher’s mitt, bring them to a halt on the table. The coffee has unspilled, the cup has unbroken itself, and the whole assembly has come to rest right where it belongs — within my grasp. On the macroscopic level, such an event never happens. No group of molecules would ever conspire to do this, even though there is no physical reason why they couldn’t. Molecules in the floor pushing the molecules of a cup aren’t fundamentally different in any way from the molecules of a cup pushing the molecules in the floor. A smashed cup of coffee spilled on the floor transforming into a cup of coffee sitting on a table is mechanically allowed, yet it never occurs. This is the heart of what became known as the reversibility paradox.

Ludwig Boltzmann (1844–1906) statistical concept of entropy as a logarithmic tally of the number of microscopic states corresponding to a given macroscopic (thermodynamic) state.

Boltzmann sought to derive the Second Law of Thermodynamics from the statistical behavior of a large number of molecules obeying the simple laws of mechanics during collision — namely the conservation of momentum and energy. Such analyses are given the name statistical mechanics. Boltzmann showed that given a collection of molecules with any velocity distribution, after enough time has passed the velocities of the molecules will eventually acquire the continuous normal-like distribution described by James Clerk Maxwell (1831–1879) and the gas will eventually acquire a uniform temperature. That is, Boltzmann showed that the arrow of time was statistical in origin. A smashed and spilled coffee cup will never be seen to reassemble into a cup of coffee, not because it can’t happen, but because it’s extremely unlikely that it ever will. The Second Law of Thermodynamics is just a way of telling us not to worry about such things. Time runs forward because that is the most likely way for it to run. It’s so overwhelmingly improbable for it to do anything else that the universe as we know it will likely not be around long enough for any macroscopic violation of the Second Law of Thermodynamics to occur.

Entropy is

  • a measure of disorder
  • the number of equivalent microstates

S = k log W
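A quick illustration of Boltzmann’s formula with numbers chosen for the example: doubling the volume available to a mole of gas particles doubles the number of positions open to each particle, multiplying W by 2^N, so ΔS = k ln(2^N) = N k ln 2.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23      # one mole of particles (Avogadro's number, rounded)

# Doubling the available volume multiplies the number of microstates W
# by 2^N, so the entropy change S = k ln W grows by N k ln 2.
delta_S = N * k * math.log(2)
print(round(delta_S, 3))  # prints 5.763 (J/K), the familiar R ln 2
```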

Loose notes.

  • Entropy decreases may occur in small systems over short time-scales. In 2002, researchers at the Australian National University monitored a water solution of 100 microscopic latex beads (6.3 μm in diameter) and saw the entropy decrease for periods lasting several tenths of a second.
  • The mathematical relation describing the probability of observing microscopic violations of the Second Law is known as the Fluctuation Theorem and was formulated by Denis J. Evans, Cohen, and Morriss at Australian National University in 1993.
  • “… every dynamical phase space trajectory and its conjugate time reversed anti-trajectory, are both solutions of the underlying equations of motion.”
  • On the microscopic scale:
    • All interactions are mediated through conservative forces. Energy is never “lost” in the microscopic realm of molecules, atoms, and subatomic particles. A reduction in the kinetic energy of a group of particles will always be exactly balanced by an increase in their potential energy in one of the four fundamental forms. Energy can always be accounted for.
    • All processes are reversible. A movie of the atoms in a gas run in reverse would be indistinguishable from one run forward. A movie of a photon scattering off an electron would look no more right or wrong than a movie of an electron scattering off a photon. Such events have temporal symmetry.
  • On the macroscopic scale:
    • All interactions between systems include nonconservative forces. The mechanical energy of macroscopic systems will always dissipate. Fresh energy must be added to keep them going. The energy is not destroyed, of course, it just gets dissipated as heat.
    • All processes are irreversible. A movie of most any common activity would look odd if run in reverse. Broken objects on the ground don’t reassemble and then jump back on the table. There is an obvious arrow of time to all activities.
  • Microscopic systems are characterized by conservative forces and reversible processes.
  • Macroscopic systems are characterized by nonconservative forces and irreversible processes.
  • Disordered states outnumber ordered states.
  • 1876 Josef Loschmidt reversibility argument — Imagine a highly ordered arrangement. Run it forwards, it becomes disordered. Run it backward from this point, it becomes ordered. Both events follow the laws of mechanics. Doesn’t the second one disprove the second law of thermodynamics? Answer, no. Let the reversed movie continue running in reverse. The system returns to disorder. Any violations of the second law are only temporary.
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
1 7 21 35 35 21 7 1
1 8 28 56 70 56 28 8 1
1 9 36 84 126 126 84 36 9 1
1 10 45 120 210 252 210 120 45 10 1
1 11 55 165 330 462 462 330 165 55 11 1
1 12 66 220 495 792 924 792 495 220 66 12 1
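The rows above are Pascal’s triangle, and they count microstates directly. Taking 12 coin flips as the example system (my illustration, with each coin standing in for a two-state particle), row 12 gives the number of microstates for each macrostate “k heads out of 12”:

```python
import math

n = 12  # 12 coins; compare the bottom row of the triangle above
counts = [math.comb(n, k) for k in range(n + 1)]

print(counts[0])       # prints 1: only one way to get all tails (most ordered)
print(counts[n // 2])  # prints 924: ways to get 6 heads, 6 tails (most disordered)
print(sum(counts))     # prints 4096 = 2^12 microstates in total
```

The single perfectly ordered arrangement is swamped by the 924 half-and-half ones, which is the sense in which disordered states outnumber ordered states.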

information theory

Claude Shannon (1916–2001) United States. In 1948, Shannon published “A Mathematical Theory of Communication” in the Bell System Technical Journal.

Information is like entropy. The number of distinguishable symbols in a channel is like the number of distinguishable states in a system.

Entropy is

  • the number of equivalent microstates
  • information
bits of information  = log2(number of states)
entropy  = k ln(number of states)
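The two formulas measure the same thing up to a constant factor of k ln 2 per bit, as a small comparison shows (the system size here is an arbitrary choice):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
W = 2**20         # a system with about a million equally likely states (assumed)

bits = math.log2(W)  # information: log2(number of states)
S = k * math.log(W)  # thermodynamic entropy: k ln(number of states)

print(bits)                          # prints 20.0
print(round(S / (k * math.log(2))))  # prints 20: same count, in bits again
```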

The total number of different symbols that can be transmitted through a noisy channel

number of distinguishable symbols = S

The information capacity of a channel (I) in units of bits/s/Hz

I = log2[(S + N)/N] = log2(1 + S/N)

where S is the received signal power and N is the noise power. (S and N are each the variance of a Gaussian distribution.)
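The capacity formula can be sketched as a short function (the 15:1 signal-to-noise ratio below is an arbitrary example):

```python
import math

def capacity_bits_per_symbol(signal_power, noise_power):
    """Channel capacity per symbol in bits/s/Hz:
    I = log2((S + N)/N) = log2(1 + S/N)."""
    return math.log2(1 + signal_power / noise_power)

# a signal 15 times stronger than the noise:
print(capacity_bits_per_symbol(15, 1))  # prints 4.0 bits per symbol
```

Note that doubling the signal power does not double the capacity; the logarithm means each extra bit per symbol costs roughly a doubling of the signal-to-noise ratio.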

Is information conserved? While it certainly appears to increase with time, it also appears to be destroyed in black holes.

Entropy of language

  • In 1950, Claude Shannon estimated the entropy of written English to be between 0.6 and 1.3 bits per character (bpc), based on the ability of human subjects to guess successive characters in text. In one experiment, random passages were selected from Jefferson the Virginian by Dumas Malone. The subject was shown the previous 100 characters of text and asked to guess the next character until successful. The text was reduced to 27 characters (A–Z and space), and subjects were allowed to use a dictionary and character frequency tables (up to trigrams) as aids. A text written in English can thus be considered, to a certain extent, a random signal composed of a finite number of symbols (mainly the letters of the alphabet). Applying statistical techniques from information theory, it is possible to estimate the entropy rate of English, enabling optimal compression of English texts or even the ability to “simulate” them.
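A crude way to get the flavor of this estimate is a zero-order calculation from single-character frequencies. The sample text and the unigram model below are my simplifications; the result lands well above Shannon’s 0.6–1.3 bpc figure because a unigram model ignores the context that human guessers exploit.

```python
import math
from collections import Counter

# toy sample restricted to lowercase letters and space
sample = (
    "a text written in english can be considered to a certain extent "
    "as a random signal composed of a finite number of symbols"
)

freq = Counter(sample)
total = len(sample)
# zero-order entropy: H = -sum p log2 p over observed character frequencies
H = -sum((c / total) * math.log2(c / total) for c in freq.values())
print(round(H, 2))  # roughly 4 bits per character for a 27-symbol alphabet
```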

Entropy of proteins

  • The Zipf analysis and k-tuplet analysis give Shannon entropies of approximately 2.5 bits/amino acid. Using the Chou-Fasman gambler algorithm, a somewhat smaller entropy of 2.0 bits/amino acid was obtained. This entropy is much smaller than the value of 4.18 bits/amino acid obtained from the nonuniform composition of amino acids in proteins.

Is information entropy?

My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’ Claude Shannon (1949).
