The entropy production is not the production of entropy

Main message: the concepts “entropy” and “production” do not commute, if they make sense at all.

I said out loud at the JETC that entropy production is not the production of entropy, and I believe many were disoriented. So let me try to clarify what I mean. The confusion arises because people freely carry lingo from equilibrium thermodynamics out of equilibrium. At equilibrium, state functions are great. But out of equilibrium, talking about state functions can create confusion. I believe the major culprit for this shameful situation is nonetheless Clausius (I’ve heard Boltzmann medalist Daan Frenkel say something similar, so it’s not entirely my responsibility).

Entropy, like energy etc., is a state function. That is, for a system X with states x, entropy is some function of the state S(x), a scalar. Full stop. At the moment I don’t care how this function is determined, whether it’s uniquely defined (it is not), or whether it is interesting at all (not so much). Let us further assume that there is an internal dynamics within the system. As a consequence, there is a certain increment of entropy dS as the system moves from one state to another. Increments of scalars are a very special type of differential, called exact 1-forms. Because it is exact, the production of entropy ∫dS along a path is independent of the path and only depends on the initial and final states. Therefore, whenever the system goes back to the initial state the production of entropy vanishes. If we assume that the system comes close to the initial state often enough, then there will never be a net production of entropy.
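To see the telescoping at work, here is a minimal sketch on a discrete state space; the states and the entropy values are invented for illustration.

```python
# A minimal sketch, with made-up states and entropy values: entropy as
# a state function S(x) on a discrete state space, whose increments dS
# are summed along a path.

S = {"a": 0.3, "b": 1.1, "c": 0.7}  # hypothetical entropy values

def production_of_entropy(path):
    """Integrate the exact 1-form dS along a path: a telescoping sum."""
    return sum(S[y] - S[x] for x, y in zip(path, path[1:]))

print(production_of_entropy(["a", "b", "c"]))       # S(c) - S(a) ≈ 0.4
print(production_of_entropy(["a", "b", "c", "a"]))  # 0.0 on a closed path
```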

Notice that so far no mention has been made of the equilibrium vs. nonequilibrium character of the dynamics. But, from the above description of the production of entropy, it would seem that the rate at which entropy is produced within the system is practically zero at long enough times, which would seem uncharacteristic of a nonequilibrium system, which is expected to produce entropy forever and ever. So what’s the problem?

The fact is that the production of internal entropy is not the whole story, because it is understood that “entropy flows” away from the system and towards the environment, adding something to the overall balance.

Let’s call this other flow across the system’s boundary the entropy production. I think this is a very unfortunate nomenclature, but let’s stick to it. Because entropy production is something that increases along a path in the system’s state space, it is well described by a differential 1-form σ (by definition, a 1-form is something that can be integrated along a path). Unlike the production of entropy, this form is generally not exact, which means that there is no function Σ such that σ = dΣ. As a consequence, the entropy production does not vanish when integrated along a cyclic trajectory:

∫σ ≠ 0 along a cyclic trajectory

And, as a matter of fact, this integral is “most often positive” with respect to some probability measure over the trajectories in state space, according to the Second Law of thermodynamics. But let’s throw this issue into the closet for the moment.
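Contrast this with a toy inexact 1-form, again with invented numbers: antisymmetric weights on the edges of a three-state graph, whose cycle integral does not vanish, so no potential Σ can exist.

```python
# A toy inexact 1-form: antisymmetric edge weights sigma(x -> y) on a
# three-state graph. The numbers are invented; the point is that the
# cycle integral is nonzero, so no Sigma with sigma = dSigma exists.

sigma = {("a", "b"): 0.5, ("b", "c"): 0.2, ("c", "a"): 0.3}
sigma.update({(y, x): -v for (x, y), v in list(sigma.items())})  # sigma(y -> x) = -sigma(x -> y)

def entropy_production(path):
    """Integrate the 1-form sigma along a path, edge by edge."""
    return sum(sigma[(x, y)] for x, y in zip(path, path[1:]))

print(entropy_production(["a", "b", "c", "a"]))  # 1.0 ≠ 0 on a cycle
```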

If σ happens to be exact, then the system is “not a nonequilibrium system” (I could at this point turn this double negation positive by mentioning “equilibrium”, “detailed balance”, or “time-reversibility”, but these three expressions have subtle meanings that I prefer to bury in the above-mentioned closet). The connection to thermodynamics is that, for non-non-equilibrium systems, Σ comes to coincide precisely with the internal entropy of the system, Σ = S.
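To make this concrete in the language of Markov jump processes: a standard convention in stochastic thermodynamics (my assumption here, the post leaves such choices in the closet) takes σ(x → y) = ln[w(x → y)/w(y → x)] for transition rates w. A sketch under that assumption: rates built from a potential satisfy detailed balance and make σ exact, while tampering with a single rate makes the cycle integral nonzero.

```python
import math

# Sketch under an assumed convention: sigma(x -> y) = ln w(x->y)/w(y->x).
# Rates built from a potential Sigma satisfy detailed balance, so sigma
# telescopes to Sigma[y] - Sigma[x] and is exact.

Sigma = {"a": 0.0, "b": 0.8, "c": 0.3}                 # hypothetical potential
w = {(x, y): math.exp((Sigma[y] - Sigma[x]) / 2)       # detailed-balanced rates
     for x in Sigma for y in Sigma if x != y}

def sigma(x, y):
    return math.log(w[(x, y)] / w[(y, x)])             # equals Sigma[y] - Sigma[x]

cycle = ["a", "b", "c", "a"]
print(sum(sigma(x, y) for x, y in zip(cycle, cycle[1:])))  # 0.0: sigma is exact

w[("a", "b")] *= 2.0                                   # break detailed balance
print(sum(sigma(x, y) for x, y in zip(cycle, cycle[1:])))  # ln 2 ≠ 0: no longer exact
```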

– – –

So far so good. Unless we are considering a non-non-equilibrium system, entropy production and production of entropy seem to be very different concepts with a very similar name: the former is an inexact form, the latter an exact one, and the two only coincide if the system is not a nonequilibrium system. So where does the confusion originate?

The reason is that people like to think of the entropy production along a path occurring within the system as the production of entropy within some environment Y. The idea is that such an environment (which can be further split into several reservoirs, if there is good reason to distinguish among them, but this too is in the closet) accumulates entropy in an ever-increasing way as transitions occur within the system. However, as I will try to argue below, this idea of “adding up to the entropy balance” is very rough and imprecise, and if taken too seriously it leads to confusion, in particular to that masterwork of confusion that is the following formulation of the Second Law of thermodynamics:

“the entropy of the universe cannot decrease”.

To me, this is completely nonsensical. Let’s see why.

When people think of the universe they don’t actually mean the Universe, the multitude of things that actually surround us, but a portion of that multitude that is reasonably isolated from external influence: something that ideally includes the system and the environment and has no further environment beyond its boundaries. Let XY denote this universe, with states (x,y) resolving the system’s and the environment’s states. Now, we are free to define the universe entropy as some state function SU(x,y). We’re obviously back to the initial considerations in this post: as a state function, its production vanishes along any closed path in XY, so the entropy of the universe will remain fairly constant in time (in fact, if the evolution of the universe is unitary, for quantum systems, or Hamiltonian, for classical systems, then the entropy, von Neumann’s or Shannon’s respectively, is a constant of motion). There is no “thermodynamics of the universe”.
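A quick numerical check of the parenthetical claim, in the quantum case (dimension and random seed are arbitrary choices of mine): unitary evolution leaves the von Neumann entropy of a density matrix unchanged.

```python
import numpy as np

# Sketch: the von Neumann entropy is invariant under unitary evolution,
# so a unitarily evolving "universe" produces no net entropy.

rng = np.random.default_rng(seed=0)

def von_neumann_entropy(rho):
    p = np.linalg.eigvalsh(rho)          # eigenvalues of a Hermitian matrix
    p = p[p > 1e-12]                     # discard numerical zeros
    return float(-np.sum(p * np.log(p)))

A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T                     # positive semidefinite
rho /= np.trace(rho).real                # unit trace: a density matrix

# a random unitary, from the QR decomposition of a complex Gaussian matrix
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

print(von_neumann_entropy(rho))                   # some value S
print(von_neumann_entropy(Q @ rho @ Q.conj().T))  # the same S, up to rounding
```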

So let’s focus on the system by disregarding the environmental degrees of freedom, that is, by projecting (x,y) → x. The idea is that, upon this projection, the exact differential dSU describing the increase of the universe’s entropy provides an inexact differential σ in the system’s state space describing the… well, what to call it? People like to call it entropy increase, or production. Like it or not.

Let’s give a simple example, for simplicity. Suppose XY = {(a,A), (a,B), (b,A)}. Now let’s project down to X = {a,b}. A closed trajectory in X is, for example, a → b → a. To this trajectory, the following paths in XY might correspond:

(a,A) → (b,A) → (a,A)

(a,A) → (b,A) → (a,B)

Notice that the first path is closed in the universe, but the second is not (though it’s still closed in the system!). Hence, at the end of the second path the “entropy of the environment” has increased by an amount SU(a,B) – SU(a,A). This defines our entropy production σ in the system, and shows that the entropy production is path-dependent. Notice that this entropy production is not the differential of a state function in a quite literal way, because SU takes two possible values at x = a. As a consequence, its increase between two system states takes different values according to the “hidden path” in the environment.
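Here is the same toy example in code, with invented values for SU.

```python
# The example above in code. The values of SU are invented; what matters
# is that SU takes two values over x = a, so its increment along the
# projected closed trajectory a -> b -> a depends on the hidden path.

SU = {("a", "A"): 0.0, ("b", "A"): 0.4, ("a", "B"): 0.6}  # hypothetical

def delta_SU(path):
    """Increment of the state function SU between the endpoints of a path in XY."""
    return SU[path[-1]] - SU[path[0]]

path1 = [("a", "A"), ("b", "A"), ("a", "A")]  # closed in the universe XY
path2 = [("a", "A"), ("b", "A"), ("a", "B")]  # closed in X, open in XY

assert [x for x, _ in path1] == [x for x, _ in path2]  # same system trajectory

print(delta_SU(path1))  # 0.0: this lift produces nothing
print(delta_SU(path2))  # 0.6 = SU(a,B) - SU(a,A): sigma is path-dependent
```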

I didn’t give this example just “for simplicity”, but because defining σ in a rigorous way out of dSU is quite a challenging task.

Conclusion: Is it legitimate to call the entropy production a production of entropy? Yes and no: we need to keep in mind that the production of entropy behind the entropy production is “virtual”. It can be imagined to happen in the environment, but it does not correspond to an actual state function called “entropy” that increases. If one wants to include the environment in the description, have system + environment = universe, and talk about an entropy function there, one is back to the situation where this entropy function cannot increase, and thus one can forget about the Second Law. This is because thermodynamics is intrinsically about open systems.

Perspective: Defining σ out of SU and a system/environment splitting is certainly doable in discrete state spaces, though it requires a lot of analysis of special cases. It is certainly much more involved when it comes to continuous state spaces. But what is really interesting is how to generalize this question to arbitrary differential forms, discrete or continuous. And to understand the role of dualities.
