# Mathematical Trends in Reaction Network Theory in Copenhagen /1

I’ll try to live-blog my notes from this very interesting conference, knowing that I can hardly compete with Azimuth.

Sebastian Walcher, Computational aspects of quasi-steady state reduction. A discussion of simple yet rigorous methods for performing separation of time scales on chemical kinetic equations. Suppose we have an unknown separation of slow and fast coordinates, e.g. the (usual) fully reversible Michaelis-Menten reaction $E+S \rightleftharpoons C \rightleftharpoons E + P$. To derive the reduced equation, different assumptions can be made (Briggs-Haldane vs. Michaelis-Menten), more or less rigorous. For systems in Tikhonov standard form

$y'_1 = f(y_1,y_2) + \epsilon (\ldots), \quad \epsilon y'_2 = g(y_1,y_2) + \epsilon (\ldots),$

there is a theorem saying that as $\epsilon \to 0$ solutions of the system converge uniformly to solutions of the reduced system

$y'_1 = f(y_1,y_2), \quad g(y_1,y_2) = 0$.

Another theorem, by Fenichel, essentially allows one to rewrite a system $x' = h_0 + \epsilon h_1 + \epsilon^2 (\ldots)$ into Tikhonov form. The construction makes use of a matrix-vector decomposition $h_0(x) = P(x)\mu(x)$ (which doesn’t seem to be unique to me).

In these cases the small parameter (such as a reaction rate) is known. Sometimes, though, some variables might be slow for dynamical reasons, and the small parameter has to be found by “freezing” the system. He then defines a Tikhonov-Fenichel parameter value with the required properties. After a lot of math, based on this theory, he arrives at a computational tool that makes the separation of time scales quite straightforward.
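As a concrete (and entirely my own) sketch of the quasi-steady-state idea for the Michaelis-Menten mechanism $E+S \rightleftharpoons C \to E+P$ with hypothetical rate constants: setting the fast equation for the complex $C$ to zero and using the conservation law $E = E_0 - C$ recovers the familiar Michaelis-Menten rate.

```python
# A pure-Python check (my sketch, not Walcher's actual tool) of the
# Briggs-Haldane quasi-steady-state reduction for E + S <-> C -> E + P.
# Rate constants below are hypothetical, chosen only for illustration.
k1, km1, k2, E0 = 4.0, 1.0, 3.0, 0.1

def c_qss(S):
    # Solve the fast equation k1*(E0 - C)*S - (km1 + k2)*C = 0 for C.
    return k1 * E0 * S / (k1 * S + km1 + k2)

def v_mm(S):
    # Michaelis-Menten form of the slow rate dP/dt = k2*C.
    Km = (km1 + k2) / k1
    return k2 * E0 * S / (Km + S)

for S in (0.01, 0.1, 1.0, 10.0):
    assert abs(k2 * c_qss(S) - v_mm(S)) < 1e-12
print("QSS rate matches the Michaelis-Menten form")
```

The two expressions agree identically; the point of the rigorous theory is to control the error terms of order $\epsilon$ that this naive substitution ignores.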

Gheorghe Craciun, A proof of the Global Attractor Conjecture. He first tried to explain the conjecture via a new object called “geometrically embedded graphs”, with possibly non-reversible edges. Such objects lead to a system of differential equations

$\dot{x} = \sum_{y \to y'} K_{y \to y'}(x) (y'-y)$

with mass-action chemical kinetics being a special case with $K_{y \to y'}(x) = k_{y \to y'} x^y$, thus also allowing for fractional powers, negative powers, etc. In this language the Horn-Jackson theorem establishes that if a system is “vertex balanced” (which Craciun explains in terms of the graph being in a “general position”), then there exists a strict Lyapunov function within each linear invariant subspace. The conjecture (by Horn, 1974) states that if a system is vertex balanced then it has a globally attracting point within each linear invariant subspace. The problem is that some trajectories could move toward the boundary of the linearly invariant simplex, and one has to show that the system is “persistent”, i.e. that they never reach it. The talk then became very technical, with lots of prerequisites required, and most of the complication came from the need to abandon mass-action kinetics and define a concept of “toric differential inclusion”, which, I think, is a general representation of solutions of exponential dynamical systems. The whole point is to systematically expand the region of attraction of the attracting point, by making the problem more geometric.
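To make the Horn-Jackson statement concrete, here is a toy check of my own (not from the talk): for the complex-balanced system $A \rightleftharpoons B$ with unit rates, the function $L(x) = \sum_i x_i(\ln(x_i/x_i^*) - 1)$ decreases along mass-action trajectories toward the equilibrium $x^*$.

```python
# Toy verification (my sketch) that the Horn-Jackson Lyapunov function
# decreases along a mass-action trajectory of A <-> B with unit rates.
import math

k_f, k_b = 1.0, 1.0
x = [0.9, 0.1]        # initial concentrations of A and B (total mass 1)
xs = [0.5, 0.5]       # complex-balanced equilibrium for these rates

def lyap(x):
    return sum(xi * (math.log(xi / si) - 1.0) for xi, si in zip(x, xs))

vals = [lyap(x)]
dt = 0.01
for _ in range(1000):
    flux = k_f * x[0] - k_b * x[1]        # net rate of A -> B
    x = [x[0] - dt * flux, x[1] + dt * flux]
    vals.append(lyap(x))

# L is non-increasing along the (Euler-discretized) trajectory,
# and tends to its minimum at the equilibrium.
assert all(b <= a + 1e-12 for a, b in zip(vals, vals[1:]))
print(round(vals[0], 4), round(vals[-1], 4))
```

The hard part of the conjecture is exactly what this toy example cannot show: excluding that trajectories of larger systems sneak off to the boundary where the Lyapunov argument loses its grip.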

John Baez, Probabilities vs. Amplitudes. Chemistry is fundamentally quantum, but the master equation treats reactions in a probabilistic way, with $p = |\psi|^2$. One should derive the master equation as an approximation to quantum chemistry, but that’s not what he wants to do. He rather focuses on the analogies one obtains by replacing $p \leftrightarrow \psi$, applying the lingo of Quantum Field Theory to chemical reaction networks. All of this material is very well explained on Azimuth. A note: annihilation picks a particle and destroys it, while creation just creates one; hence it is natural that the first carries a multiplying factor.
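That last remark can be illustrated in matrix form (my own illustration, not from the talk): truncating the state space at $N$ particles, annihilation $a$ sends $|n\rangle$ to $n\,|n-1\rangle$ (the factor $n$ counts the choices of which particle to destroy), while creation $c$ sends $|n\rangle$ to $|n+1\rangle$, and the commutator $[a,c]=I$ holds away from the cutoff.

```python
# Annihilation/creation operators of the stochastic formalism as matrices
# on a Fock space truncated at N particles (my illustration).
import numpy as np

N = 6
a = np.zeros((N, N))
c = np.zeros((N, N))
for n in range(1, N):
    a[n - 1, n] = n      # a|n> = n |n-1>: factor n from picking a particle
    c[n, n - 1] = 1.0    # c|n> = |n+1>: creation has no such factor

# The commutator [a, c] equals the identity except at the truncation edge.
comm = a @ c - c @ a
print(comm[:N - 1, :N - 1])
```

The asymmetry between the two operators is exactly the point of the note: in the probabilistic ("stochastic mechanics") convention the factor $n$ sits on annihilation, whereas the quantum convention splits $\sqrt{n}$ factors between the two.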

Michel Komorowski, Information flow in signal transduction pathways. Now a different perspective: signal transduction, how cells perceive the outer world. Ligands bind, a cascade of events follows, and information is transmitted through the cytoplasm to the nucleus and the DNA. The molecular details are known. The information-theoretic setup is “à la Shannon”, “Input -> Channel -> Output”, a coding mechanism (decoding if operated the other way around). In formulas, $Y \to P(X|Y) \to X$, and one can ask for either $P(Y|X=x)$ or $P(X|Y=y)$ and calculate the mutual information:

$I(X,Y) = H(Y) - \langle H(Y|X=x)\rangle$

with the first term being the entropy and the second the conditional entropy, where the average is over $x$. One can then optimize over the input distribution to obtain the capacity $C = \max_{P(X)} I(X,Y)$. He then observes that increasing the complexity of the pathway increases the channel’s capacity. I missed the connection between these elements of theory and the several examples he gave.
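A minimal numerical illustration of these definitions (mine, not from the talk): for a binary symmetric channel with flip probability $p$, scanning over input distributions recovers the textbook capacity $1 - H_2(p)$, attained at the uniform input.

```python
# Mutual information and capacity of a binary symmetric channel,
# found by brute-force scan over the input distribution (my sketch).
import math

def H2(q):
    # Binary entropy in bits.
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def mutual_info(pi, p):
    # I(X;Y) = H(Y) - H(Y|X); for the BSC, H(Y|X) = H2(p) for every input.
    py1 = pi * (1 - p) + (1 - pi) * p      # P(Y = 1)
    return H2(py1) - H2(p)

p = 0.1
grid = [i / 1000 for i in range(1001)]
cap, pi_star = max((mutual_info(pi, p), pi) for pi in grid)
print(round(cap, 3), pi_star)   # capacity 1 - H2(0.1), at the uniform input
```

Real signalling channels are of course continuous and dynamic, which is where the hard part of the talk lived; this only pins down what "optimize with respect to the input distribution" means.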

Georg Regensburger, Parametrizing complex balancing equilibria of generalized mass-action systems. He considers generalized mass action with non-integer exponents, introduces the graph Laplacian, considers complex balance, and shows that solving for the steady state can be done via the usual spanning-tree formulas. All is linear until the end, when one has to intersect the manifold of solutions (which is exponential) with the stoichiometric subspace.
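For the linear part, here is a small self-contained check of my own of what "spanning-tree formulas" means: for a monomolecular network $1 \rightleftharpoons 2 \rightleftharpoons 3$ with hypothetical rates, the kernel of the graph Laplacian matches the matrix-tree weights (products of rates over spanning trees rooted at each state).

```python
# Steady state of a linear network 1 <-> 2 <-> 3: graph-Laplacian kernel
# vs. the spanning-tree (matrix-tree) formula. Rates are hypothetical.
import numpy as np

k12, k21, k23, k32 = 2.0, 1.0, 3.0, 4.0   # rate of i -> j is kij

# Laplacian acting on concentrations: (A x)_i = inflow_i - outflow_i.
A = np.array([
    [-k12,         k21,        0.0],
    [ k12, -(k21 + k23),       k32],
    [ 0.0,         k23,       -k32],
])

# Kernel vector from the SVD (right-singular vector of the zero singular value).
v = np.linalg.svd(A)[2][-1]
v = np.abs(v) / np.abs(v).sum()

# Spanning trees rooted at each state, with edges directed toward the root.
tree = np.array([k21 * k32, k12 * k32, k12 * k23])
tree = tree / tree.sum()

print(np.round(v, 4), np.round(tree, 4))
assert np.allclose(v, tree)
```

The generalized mass-action twist is that the same linear machinery applies to the complexes, after which the equilibria form an exponential manifold that must be intersected with the stoichiometric subspace.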

Shodhan Rao, Complex and detailed balancing of chemical reaction networks revisited. Quite close to the previous talk: he goes through all the details behind the characterization of deficiency and introduces a concept of “formal balancing”. Subtle question about analogies to a paper by Dickenstein et al.

Meritxell Sàez, Recovering a reaction network after linear elimination of species. She focuses on time-scale separation: part of the reaction network is faster than the rest, so some species (like enzymes) are assumed to be at steady state. She solves the ODEs for the faster species, folding the enzymes’ concentrations into the rates, and thereby obtains an effective reaction network. They don’t eliminate arbitrary species, but only those of a “cut”, that is, species that together are conserved (so that no conservation law is broken) and that do not interact (so that, I think, they do not create new cycles). Joint work with Feliu and Wiuf, the conference organizers.

Tat Dat Tran, A connection between population genetics and chemical reaction networks. Leitmotiv: birth-death processes are analogous to chemical networks. But I could not see the motivation for mapping the one onto the other.

Antonio A. Alonso, The structure of feasible equilibria for mass-action law kinetic systems. Thermodynamic feasibility is a crucial problem in chemical-network reconstruction. He models the environment by the input of external species at a constant rate, so the system is open. Here, though, “feasibility” means that the system is compatible with the mass-action law (which I don’t understand where it was lost), and he then says that the feasibility conditions are equivalent to Wegscheider’s conditions.

Stefan Müller, Optimal resource allocation in metabolic networks. An optimization approach to metabolic networks: nutrients -> metabolism -> biomolecules. With a bounded number of enzymes one has an enzyme allocation problem, and they prove a theorem saying that the optimal solutions, for arbitrary kinetics, are the Elementary Flux Modes (EFMs), which means that the cell switches off as many reactions as possible to optimize a given output. He considered a specific metabolic pathway, showing that there is a competition between two EFMs (one for respiration and one for fermentation) due to the control of glucose or oxygen reservoirs. Under some conditions, for kinetic reasons, the energetically inconvenient pathway of fermentation is used by the cell, and there are discrete changes in behavior (sort of phase transitions). One of the most striking results is that FBA (Flux-Balance Analysis) cannot give even qualitative predictions.
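A toy version of the allocation argument, with entirely made-up numbers (my sketch of the talk's message, not their model): give each EFM a yield per unit flux and an enzyme cost per unit flux, and maximize output under a fixed enzyme budget. Because the problem is linear, the optimum sits at a vertex, i.e. a single EFM, and a cheap low-yield mode (fermentation) can beat an expensive high-yield one (respiration).

```python
# Toy enzyme-allocation problem: two EFMs with hypothetical yields and
# enzyme costs per unit flux; maximize output under a fixed enzyme budget.
yield_resp, cost_resp = 30.0, 10.0   # high yield, expensive enzymes
yield_ferm, cost_ferm = 2.0, 0.5    # low yield, cheap enzymes
budget = 100.0

# Spending the whole budget on one EFM:
best = max(
    (yield_resp * budget / cost_resp, "respiration"),
    (yield_ferm * budget / cost_ferm, "fermentation"),
)

# Mixtures (fraction a of the budget on respiration) never beat the
# better pure mode, since the objective is linear in a.
mix = [(a * yield_resp / cost_resp + (1 - a) * yield_ferm / cost_ferm) * budget
       for a in (0.25, 0.5, 0.75)]

print(best, [round(m, 1) for m in mix])
```

Here fermentation wins despite its lower yield, mirroring the kinetic trade-off described in the talk; the discrete switches appear when changing the reservoirs flips which vertex is optimal.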

Gabor Szederkényi, A computation-oriented representation of kinetic systems with rational reaction rates. “If all you have is a hammer, everything looks like a nail”. His hammer is computer science, and his nail is biochemical kinetics with rational reaction rates. The setup is as usual, with the Laplacian matrix brought into the spotlight. The rest flies a bit over my head.