# Live-blogging from STATPHYS26 /2

Statistical physics for real biological networks, W. Bialek. When you look back at history, the attempt to apply statphys to life was to (over?)simplify and disconnect from experiment. What changed recently is the amount and quality of data. After a quite generic introduction, he concentrates on the question “what is the probability distribution?” of whatever, and then attempts an answer in terms of usual equilibrium concepts such as energy and maximum entropy (consistent with some constraints). With flocks of birds, he will “try the idea” that the most important thing is the correlation of velocities between neighbours. He writes down a sort of Hamiltonian and then computes the maximum entropy model consistent with these measurements. I’m not a fan of maximum entropy methods because they seem to me to be attempts to objectivize what is really just a subjective judgement; and in doing this one might not realize that one is not making statements about nature, but about our way of describing nature (but is it possible to do otherwise? That’s a tricky question). (This also came up as an answer to a question, where he said that MAXENT is a method to build the minimal model that contains no further structure than what is needed to reproduce the data; but what kind of analysis of the data, what kind of structure one wants to see in the data, he does not take into consideration.) This is a sort of Bayesian inverse reasoning where things eventually work because they work, and if they don’t work one moves on to something else, so I hardly believe it has any predictive power, and overall I don’t see a unifying, inspiring principle in this talk. I’m surprised that, searching for the word “nonequilibrium” in his draft Biophysics: Searching for Principles, I only find one marginal instance; even the chapter on chemical reaction dynamics treats diffusion only in conservative fields.
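For the record, the pairwise model he seems to have in mind (this is my reconstruction from the flocking literature, notation mine) constrains the average correlation between each bird’s flight direction $\vec s_i$ and those of its nearest neighbours $\mathcal{N}_i$; the maximum entropy distribution consistent with that single measured number is

$P(\{\vec s_i\}) = \frac{1}{Z(J)} \exp\left( \frac{J}{2} \sum_i \sum_{j \in \mathcal{N}_i} \vec s_i \cdot \vec s_j \right)$

with the one coupling $J$ fixed by matching the measured correlation.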
I don’t believe that equilibrium statistical mechanics has anything whatsoever to say about biological systems; it is a usual tendency that people fall in love with their favourite model or scheme and try to apply it to just about anything. That happened with the Ising model, with “self-organized criticality”, and it will also happen to nonequilibrium stat mech at some point. He openly states that “What I’ve done is I built the simplest model that is consistent with a set of data”, and then he checks that it predicts things correctly; but for this process to be scientifically sound one should show, or at least argue, that the set of “predictions” is significantly different from the facts to which the model was fit. Moreover, I see the possibility of a problem with “overanalysis”, which is well known for example in large collaborations such as LIGO-Virgo: the tendency to run a lot of different analyses, some of which eventually turn out right. It might be that there is more substance behind this talk (which is mostly a collection of well-known words and ideas from the field, vague enough that nobody could possibly disagree), but at the moment I don’t see this as a “search for principles” in biophysics.

Floquet quantum many-body physics, R. Moessner. Equilibrium thermodynamics: equilibration, and thermalization via “entropy maximization”: everything can be described by a few parameters. He wants to address thermodynamics for coherent quantum dynamics (no baths, just the wave function). “Thermodynamics holds as a matter of course”. Mentions “eigenstate thermalization” [Deutsch, Srednicki]. So apparently it’s precisely in this line of research that things don’t make sense to me; of course at some point approximations will come along that open up the system. Soon enough, in fact: periodic driving is one such thing; he considers tides on a beach and talks about the “Hamiltonian”. Anyway, let’s go to the math. He takes $H(t+T)=H(t)$. [“Something that goes out of the window is equilibration”: and otherwise?] He then defines $H_{Floq} = \frac{i}{T} \log U(T)$, where $U(T)$ is the one-period evolution operator. This Floquet Hamiltonian has properties. By the way, I think that the whole program of ETH stems from the obsession with “energy” that people have.
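To make the definition concrete, here is a minimal numerical sketch (my own toy example, not from the talk; the two-level Hamiltonian and all parameters are made up): build the one-period propagator $U(T)$ as a product of short-time steps and extract $H_{Floq}$ from the matrix logarithm.

```python
import numpy as np
from scipy.linalg import expm, logm

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

T = 1.0          # driving period (arbitrary units)
n_steps = 2000   # time steps per period

def H(t):
    # toy periodically driven two-level system, H(t + T) = H(t)
    return sz + 0.5 * np.cos(2 * np.pi * t / T) * sx

# one-period evolution operator U(T) as a time-ordered product
dt = T / n_steps
U = np.eye(2, dtype=complex)
for k in range(n_steps):
    U = expm(-1j * H((k + 0.5) * dt) * dt) @ U

# Floquet Hamiltonian H_Floq = (i/T) log U(T), principal branch of the log
H_F = 1j / T * logm(U)

# H_Floq is Hermitian, and exp(-i H_Floq T) reproduces U(T)
print(np.allclose(H_F, H_F.conj().T))
print(np.allclose(expm(-1j * H_F * T), U))
```

The standard caveat is visible here: the logarithm has a branch ambiguity, so the quasi-energies (eigenvalues of $H_{Floq}$) are only defined modulo $2\pi/T$.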

NonEq Stat Mech of systems with long-range interactions, Y. Levin. Core and halo structure of particles with long-range interactions. He says that this structure violates the second law of thermodynamics (I guess, in its pointless formulation as “entropy should be maximized”, which would mean that the system should go to a uniform distribution). [Question: in what sense would that violate the second law?] He then concludes that the system must be nonequilibrium, on the basis that it does not have a Maxwell-Boltzmann distribution. Fortunately he states that one should just go back to the basics, which is kinetics. Fortunately, because that’s the only thing that makes sense: to consider processes and not names of things. [Question: by nonequilibrium one often means that there is a sustained current through the system: is there a sustained current in this system at its steady state?] Under the Vlasov dynamics the entropy is a constant of motion, and this should not be a surprise! [Levin et al., Phys. Rep.] That’s because the volume of phase space is preserved. He has to introduce a dissipation mechanism, which lies in the coarse-graining of phase space: once again, it’s in the eye of the beholder! That’s where dissipative dynamics ensues. Who decides in what coordinates these cells should form a lattice? There is no favourite canonical set of variables*.
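Indeed the computation is one line: the Vlasov flow is incompressible in $(\vec r, \vec v)$, so $f$ is merely transported along characteristics and every functional of the form $\int C(f)\, d\vec r\, d\vec v$ (a Casimir invariant) is conserved; in particular the entropy,

$S[f] = -\int f \ln f \, d\vec r\, d\vec v, \qquad \frac{dS}{dt} = 0 \quad \text{under} \quad \partial_t f + \vec v \cdot \nabla_{\vec r} f - \nabla_{\vec r}\phi \cdot \nabla_{\vec v} f = 0,$

with $\phi$ the mean-field potential. Only coarse-graining breaks this conservation.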

Statistical mechanics of general equilibrium theory, M. Marsili. Assumptions: markets are competitive, price-taking behavior, single-period economy, welfare theorems, etc. General equilibrium theory states that you can obtain Pareto-optimal situations. [Question: beyond the specific assumptions that regulate markets, isn’t the very paradigm of “equilibrium” shaky, given that there are lots of external forces driving these markets?] Sheer reductionism at its highest expression.

A general comment: unfortunately this is not a conference where it is easy to ask questions and start discussions; most people just stick to their communities. In huge controlled societies where in principle there is the possibility to interact with anybody, in practice people find enclosures and niches.

Phase separation out of equilibrium, T. Speck. Colloidal suspension in shear flow, with broken detailed balance. Phase separation of active particles; already at the single-particle level there is symmetry breaking.

Thermalization of a quantum system from first principles, Ithier and Benaych-Georges. At some point they deliberately introduce some randomness into the problem, with a high-dimensional perturbation of the reduced density matrix. They argue that this is the same as the motivation of Wigner and Dyson, and they obtain the phenomenon of concentration of measure. The math is quite straightforward, so I should be able to work it out from their paper. This talk is only based on an arXiv preprint and a paper “in preparation”.
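I don’t know their construction in detail, but the concentration phenomenon itself is easy to see numerically: the reduced density matrix of a small subsystem of a random pure state concentrates around the maximally mixed state. A minimal illustration of my own (not their setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12            # total number of qubits; environment = n - 1 qubits
dim = 2 ** n

# random pure state of the full system (Gaussian vector, normalized,
# gives a Haar-distributed direction)
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

# reduced density matrix of the first qubit: partial trace over the rest
C = psi.reshape(2, dim // 2)
rho = C @ C.conj().T      # 2x2, Hermitian, trace 1

# concentration of measure: rho is close to the maximally mixed state I/2,
# with deviations of order sqrt(d_S / d_E)
dist = np.linalg.norm(rho - np.eye(2) / 2)
print(dist)
```

Increasing `n` makes `dist` shrink, which is the thermalization-by-typicality story in its simplest form.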

Dissipation bound for thermodynamic control, B. Machta. He uses the Fisher-Rao metric and states that the entropy production is bounded from below by the Fisher-Rao distance between two protocols [also: Sivak & Crooks]. He claims that there is a finite cost to control the Carnot engine, for example, and make it into a cycle, and they estimate this quantity. He models the system in terms of a Hamiltonian depending on some parameters. This might be the most interesting talk today. [Q: Where does the stochasticity enter? Not clear] [Q: The Fisher-Rao information metric is a measure of distance in a space of probability distributions: where does this space of distributions come from? Also, a classical result in estimation theory is the Cramér-Rao inequality: is this related to your bound?]. He obtains:

$\langle \Delta S_{tot}\rangle \geq 2 \mathcal{L}(\lambda_0,\lambda_f)$

B.B. Machta, Dissipation bound for thermodynamic control, PRL 115, 260603
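If I read it correctly, $\mathcal{L}(\lambda_0,\lambda_f)$ is the length of the control path in parameter space, measured with the Fisher-Rao metric of the parametrized distributions $p(x|\lambda)$ (notation mine):

$\mathcal{L} = \int_{\lambda_0}^{\lambda_f} \sqrt{g_{ij}\, d\lambda^i\, d\lambda^j}, \qquad g_{ij}(\lambda) = \left\langle \partial_{\lambda^i} \ln p \; \partial_{\lambda^j} \ln p \right\rangle_{p(\cdot|\lambda)}$

which would partly answer my own Cramér-Rao question: $g_{ij}$ is exactly the Fisher information matrix that appears in that inequality.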

Entropy production rate near nonequilibrium phase transitions, J. D. Noh. The paper by [Sevilla et al., J. Stat. Phys. 2014] is interesting. Also this talk is interesting, but I’m tired.

MAXENT out of equilibrium, I. Ford. He argues that MAXENT is not only a principle of logical inference, but has a basis in stochastic dynamics: entropy is configurational uncertainty, and it increases under typical dynamics. But he wants to maximize some components of the entropy production. He considers a Fokker-Planck equation with a nonconstant temperature (that’s what he means by nonequilibrium), and the mean entropy production. He then interprets the heat transfer as the change of uncertainty in the environment. He has the strange formula:

$\frac{ d \langle \Delta s_{tot} \rangle }{dt} \geq 0$

which is not the second law but something stronger. What it means is not clear to me, and I don’t believe it holds in general stochastic systems. But it is a conjecture that can be checked (possibly my paper on convexity already contains the answer…).

Canonical theory of dissipative systems, M. Suzuki. Wonderful vintage presentation, hand-written. Also the concepts are vintage. So sweet.

* This would be an interesting research project: to study different “equilibration” mechanisms by coarse-graining systems that have different sets of canonical variables.