Live-blogging from STATPHYS26 /3

After the first two days, it is remarkable that the nonequilibrium parallel session is by far the most populated. That’s encouraging…

Fluid models as scaling limits of systems of particles, L. Saint-Raymond. From the microscopic Newton equations, to a mesoscopic description in terms of Boltzmann’s kinetic equation, and on to a macroscopic description as a continuous fluid governed by the equations of hydrodynamics. Boltzmann equation: f(t,x,v) is the fraction of particles at position x with velocity v at time t. Particles are only transported by their velocity in vacuum, and the velocity is only changed by collisions. Of course at some point a dissipative step (in the eye of the beholder) will have to be taken. So one obtains

\partial_t f + v \cdot \nabla_x f = \alpha \iint \big[ f(t,x,v')\, f(t,x,v_2') - f(t,x,v)\, f(t,x,v_2) \big]\, |(v-v_2)\cdot \nu_2| \, dv_2 \, d\nu_2
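Here v' and v_2' are the post-collisional velocities; for hard spheres with impact direction \nu_2 they are given by the standard elastic-collision rule (a detail filled in here, it was not spelled out in my notes):

v' = v - \big[(v - v_2)\cdot \nu_2\big]\, \nu_2, \qquad v_2' = v_2 + \big[(v - v_2)\cdot \nu_2\big]\, \nu_2

which indeed conserve momentum and kinetic energy pairwise.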

Mass, momentum and kinetic energy are collision invariants [Q: is angular momentum as well?]. E.g. mass:

\partial_t \int f \, dv + \nabla_x \cdot \int v\, f \, dv = 0
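More generally, each collision invariant \varphi(v) (that is, 1, v and v^2/2) gives a local conservation law of the same form (the standard statement, added here for completeness):

\partial_t \int \varphi(v)\, f \, dv + \nabla_x \cdot \int v\, \varphi(v)\, f \, dv = 0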

If we had a well-prescribed profile for f(t,x,v) we would obtain from this a hydrodynamic equation. Lyapunov functional of the Boltzmann equation:

S(t) = - \int f \log f dx dv

is an increasing function of time. That this is the case for the Boltzmann equation is fine. But that it should be the case for any system is a crazy idea, and lots of people are obsessed with it. The maximum of the entropy, constrained on the conserved mass, momentum and energy, is

\log f(v) = \gamma + u\cdot v - \beta v^2

This is true in the limit where the collision process is much faster than transport, at long times. By plugging this form into the conservation laws one obtains the Navier-Stokes equations, which are then the quasi-classical, mean-field sort of limit of the Boltzmann equation. Notice that from the Newton equations to the Boltzmann equation some irreversibility entered (because the entropy is a constant of motion of Newtonian mechanics), somehow embedded in the collision integral: but it’s not clear where such dissipation is. Maybe in the angular momentum, which might not be conserved? [Gianmaria says that angular momentum is also conserved. So what is not conserved?]
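For the record, the maximizer above is the local Maxwellian (my reconstruction of the textbook formula, with ρ, u and θ the local density, bulk velocity and temperature):

f(t,x,v) = \frac{\rho(t,x)}{(2\pi\theta(t,x))^{3/2}} \exp\left( - \frac{|v - u(t,x)|^2}{2\theta(t,x)} \right)

Plugging this into the local conservation laws gives, at leading order, the compressible Euler system; the incompressible Navier-Stokes equations she refers to, e.g.

\partial_t u + u\cdot\nabla_x u + \nabla_x p = \nu \Delta_x u, \qquad \nabla_x \cdot u = 0,

arise, as far as I understand, in a suitable diffusive space-time scaling.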

Statistical point of view: the starting point is the Liouville equation of the N particles, and one is interested in the first marginal with respect to one particle (that’s where the dissipation will come about, in this N-to-1 process). At some point one resorts to the BBGKY hierarchy, where the molecular dynamics can be analyzed in terms of all possible histories, which give rise to collision trees, from the final time back to the initial one (you want to retrace the “parents” in the tree; the difference between the Boltzmann hierarchy and the BBGKY hierarchy is that in one of them you cannot have recollisions, i.e. the meeting of two particles that already met and hence are already correlated). There exists a theorem that tells you convergence after just a few collisions, but not at the large times where one recovers the fluid dynamics.
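For orientation, the first equation of the hierarchy (my sketch of the standard hard-sphere form, with particle diameter ε; not the precise expression used in the talk) shows why nothing closes: the evolution of the one-particle marginal f^{(1)} involves the two-particle marginal f^{(2)} through a collision term on the contact sphere,

\partial_t f^{(1)} + v_1 \cdot \nabla_{x_1} f^{(1)} = (N-1)\,\varepsilon^{2} \int_{\mathbb{S}^2\times\mathbb{R}^3} f^{(2)}(t, x_1, v_1, x_1 + \varepsilon \nu, v_2)\, \big[\nu \cdot (v_2 - v_1)\big] \, d\nu \, dv_2

and similarly f^{(2)} couples to f^{(3)}, and so on; the Boltzmann equation is then recovered in the Boltzmann-Grad limit N → ∞, ε → 0 with N ε^2 fixed.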

[An idea: BBGKY-like methods truncate the equations for the moments of some variable; but what if one considered another variable, or a coordinate transformation of that variable, which is in some way “optimal”?]

Human rights session. Apparently it’s a tradition instituted by J. Lebowitz at the time of the cold war, when it was difficult for Russian scientists to travel through the iron curtain. A United Nations resolution, out of over-concern with security measures, made it so that five North Korean scientists had to be expelled from SISSA and other institutions, to avoid (I guess) providing them with some sort of strategic military information (they were working on string theory and DNA…). [By the way, I was somehow relieved that the first mention of a violation of human rights is a charge against the UN, since I have the eerie impression that human rights talk often serves as a vacuous propaganda tool when it comes from this sort of international institution]. Stefano Ruffo presents the case of Omid Kokabee, who was arrested in Teheran, possibly because he refused to work on military projects.

The disorder created by entropy is in the mind, Daan Frenkel. Promising foundational talk. Entropy: the pre-history, the computer age, puzzles, and sand. Stat Mech pre-existed within thermodynamics: Sadi Carnot built the entire language (apparently he thought that heat had to be conserved and missed that it is heat+work, otherwise he would have got all of thermodynamics). Thermodynamics: 1st law, energy is conserved; 2nd law, heat does not spontaneously flow from cold to hot. Thermodynamics would not be so much hated by students if we stuck to these formulations (sic!). We must blame Clausius, who defined a quantity called entropy, found that in a closed system it never decreases, and left that famous sentence that the entropy of the universe never decreases. But he didn’t say what entropy is, and we had to wait for Boltzmann. The person who actually first wrote the Boltzmann formula was Max Planck, who, inspired by Boltzmann, wrote S = k log W + const. (the constant was then lost). And it was Max Planck who proposed to put that formula on Boltzmann’s monument.

Popular view of entropy (Wikipedia): entropy is a measure of disorder. This is very often true, but it is NOT the second law, and it is dangerous to interpret it in this way [I so much agree with this! But I also think this is embedded in thinking that the second law is a statement about a thermodynamic potential (an exact form), which it is not: it’s about the inexact “flow” form, which can only be interpreted as an entropy difference for reversible processes. So in a way the confusion about entropy arises because one keeps on talking about a potential for a 1-form that is not exact: which is precisely what happens when one moves from Carnot’s statement to Clausius’s]. E.g. hard-sphere freezing (the “Kirkwood-Alder transition”, whose simulations were run by another woman who went unacknowledged) is an increase in order driven by entropy. Lesson: entropic ordering can lead to complex structures [Here I think we should point out that the word “entropic” is sometimes not related to an actual entropy, but just to the understanding that certain processes are not “energetic” [what this difference amounts to is also mysterious]; but at least this becomes a discourse about processes and not about states].

“If I can’t simulate it then I can’t understand it.” He then goes into his favourite free-energy calculations and starts computing entropies. In one of them, he observes that not all volumes are equally alike and uses the Gibbs entropy [I find that the interesting foundational premise is a bit betrayed by the development: the results are all about calculating “the entropy”]. In one of his systems, he finds that the log N! factor matters, and tries to reconcile this with the Gibbs paradox without quantum mechanics, in the case where you don’t have an operational procedure to distinguish particles (while stating, with Kant, that entropy is all about the paradigms in one’s mind). Entropy and society: BREXIT people voting against entropy (and I would be among them); the other one with a placard saying something like “I don’t trust the experts and science but I vote” (and I agree with this again…).
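Since the log N! point is easy to make concrete, here is a minimal numerical sketch (my own illustration in Python, not Frenkel’s calculation) of the classical Gibbs-paradox bookkeeping: merging two identical ideal-gas samples produces a spurious mixing entropy of 2N log 2 unless the 1/N! factor is included.

import math

def ideal_gas_entropy(N, V, distinguishable=False):
    # Entropy S/k_B of an ideal gas of N particles in volume V (thermal
    # wavelength set to 1).  Without the 1/N! factor: S/k = N ln V + 3N/2;
    # the indistinguishable case subtracts ln N! via Stirling (N ln N - N).
    S = N * math.log(V) + 1.5 * N
    if not distinguishable:
        S -= N * math.log(N) - N
    return S

def mixing_entropy(N, V, distinguishable):
    # Entropy change when two identical samples (N, V) merge into (2N, 2V).
    before = 2 * ideal_gas_entropy(N, V, distinguishable)
    after = ideal_gas_entropy(2 * N, 2 * V, distinguishable)
    return after - before

N, V = 10**6, 1.0
print("mixing entropy without 1/N!:", mixing_entropy(N, V, True))   # ~ 2N ln 2 > 0
print("mixing entropy with    1/N!:", mixing_entropy(N, V, False))  # ~ 0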

Ultimate statistical physics: fluorescence of a single atom, Yves Pomeau. Very technical.

Leo Kadanoff memorial, M. Feigenbaum. After a saddening introduction by Procaccia, who observed that some of Kadanoff’s friends are either dead, like Kadanoff himself, or so old that they cannot easily travel to conferences, emphasizing that a generation is gone – whereby I had the eerie sensation that he was implying most of these guys will soon be dead as well – he left the stage to Feigenbaum. After a lyrical prologue about Leo’s celestial eyes, Feigenbaum took the chance to talk about his own results (one of which, in his own words, absolutely crucial) in the light of his friendship with Leo, and lingered on a few episodes that were meant to give an impression of Kadanoff’s humanity, but which instead returned the image of a very competitive, overly proud person.
