Nonequilibrium Physics & Large Deviation Theory
Updated: Mar 21, 2020
Given all the work I have been doing in this area, I figured (actually, more like due to the insistence of my father) I'd try to write some introductory notes on the subject. You'll possibly need some math/science background, and it will get progressively more complicated, but hopefully some things will make sense. (Feel free to ask questions to clarify points; as a matter of necessity I will sacrifice some rigor for the sake of clarity, but I will flag such heinous departures as heinous checks (HC).)
Fig. (1): Ensemble of spherical chickens in 2D. Each such particle is characterized by its position and velocity. Here T denotes temperature; in equilibrium the velocities follow a fixed distribution, the Maxwell-Boltzmann distribution, obtained by following the rules of equilibrium statistical mechanics.
So what is nonequilibrium physics? Actually, for that matter, what is equilibrium physics? A simple illustration of the latter is depicted in Fig. (1). Think of a bunch of particles (or spherical chickens, as we physicists frequently employ) in a box. These guys can be characterized by their location and their velocity (essentially speed with an added directional component that says which way they are going). If you allow all these particles to collide with one another and with the box walls (elastically, i.e., without loss of energy), eventually (in the long-time limit) their velocities will assume a stable distribution, called the steady state. This model system describes many realistic situations. (Physicists like to make the simplest model that captures the essential elements of a phenomenon. It is always easy to add elements and make things more complicated, but if they don't contribute to the fundamental phenomenology then they are not crucial to what we want to understand and, as a zeroth-order approximation, can be safely ignored.)
It is the purview of equilibrium statistical mechanics to tell us what the distribution of the velocities (or positions) looks like for an enormously large set of particles. This is the key point of statistical mechanics: to get an effective measure of the bulk properties of a system without having to keep track of individual particles. The reason it is so powerful is that it reduces keeping track of the position and velocity of each particle to a handful of global (measurable and controllable) properties like volume, temperature, pressure and so forth. If it wasn't clear, this means that we have reduced what happens to ~100,000,000,000,000,000,000,000 particles to a handful (~3) of measurable or specifiable properties!
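To make the relaxation to a steady state concrete, here is a minimal toy sketch (a hypothetical kinetic-exchange model of my own choosing, not molecular dynamics and not from the figures): pairs of 1D velocities "collide" via a random rotation in the two-particle velocity plane, which exactly conserves the pair's kinetic energy, and the velocity distribution drifts from an arbitrary initial shape toward a Gaussian (the 1D Maxwell-Boltzmann form).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
v = rng.uniform(-1.0, 1.0, n)   # arbitrary (non-Gaussian) initial 1D velocities
E0 = np.sum(v**2)               # total kinetic energy (up to a constant factor)

for _ in range(200000):
    i, j = rng.integers(0, n, size=2)
    if i == j:
        continue
    # elastic "collision": rotate the pair (v_i, v_j) by a random angle.
    # This conserves v_i^2 + v_j^2, i.e., the pair's kinetic energy.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    vi, vj = v[i], v[j]
    v[i] = np.cos(theta) * vi - np.sin(theta) * vj
    v[j] = np.sin(theta) * vi + np.cos(theta) * vj

# total energy is unchanged, but the velocity distribution is now close to
# Gaussian: the kurtosis m4/m2^2 moves from 1.8 (uniform) toward 3 (Gaussian)
m2, m4 = np.mean(v**2), np.mean(v**4)
print(np.isclose(np.sum(v**2), E0), round(m4 / m2**2, 1))
```

No energy enters or leaves; the particles only redistribute it among themselves, which is exactly the equilibrium setup described above.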
Fig. (2): Comparison of experiment and theoretical calculations done with quantum Monte Carlo for a system of ~200,000 particles in an optical lattice made by interfering three laser beams (first image). The system is a discrete version of the spherical chickens in a box shown earlier. A model is shown in the middle image. Experiments can control the strength of the interaction between particles by controlling the intensity of the laser (measured as 's'). These comparisons are done at low but finite temperature. (These results are from my work with Brian DeMarco's group at the University of Illinois at Urbana-Champaign. The y-axis measures the fraction of particles in a quantum state of matter called a Bose-Einstein condensate.)
By the way, stat. mech. isn't an approximation (although approximations are possible, to reduce computational effort while retaining some desired level of accuracy) -- experiments and theory (following the rules of statistical physics) for various systems over decades are basically in excellent agreement (see Fig. 2). How can it achieve such a feat? It is because over many decades scientists have been able to figure out the fundamental laws or rules that apply to the equilibrium scenario -- particles are allowed to share energy with each other, but as a whole no energy can be added to or removed from the system (HC: there are some special ways by which we could add/remove energy from the system, but let's not go there...). A key aspect of equilibrium stat. mech. is that the properties we measure are related to the configuration (instantaneous positions and velocities) of the particles in the steady state -- the history, i.e., all the prior positions and velocities, does not matter. The system transitions from configuration to configuration, visiting each configuration with a probability given by a distribution due to Boltzmann.
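Since the Boltzmann distribution will keep coming up, here is a minimal sketch of how one actually generates configurations with Boltzmann probabilities, using the standard Metropolis Monte Carlo rule. The system (a single particle in a harmonic energy well) and all parameter values are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def energy(x):
    """Hypothetical toy energy: a particle in a harmonic well, E(x) = x^2 / 2."""
    return 0.5 * x**2

T = 0.5        # temperature (in units where k_B = 1)
x = 0.0
samples = []
for step in range(300000):
    x_trial = x + rng.normal(0.0, 0.5)   # propose a small random move
    dE = energy(x_trial) - energy(x)
    # Metropolis rule: accept with probability min(1, exp(-dE/T)), which makes
    # configurations appear with Boltzmann probability proportional to exp(-E/T)
    if dE <= 0.0 or rng.random() < np.exp(-dE / T):
        x = x_trial
    if step >= 50000:                    # discard initial equilibration steps
        samples.append(x)

# check against an exact equilibrium result (equipartition): <x^2> = T
x2_mean = np.mean(np.square(samples))
print(round(x2_mean, 2))   # should come out close to T = 0.5
```

The point is the one made above: once we know the rules (here, the Boltzmann weight), we can predict measurable averages without tracking any individual trajectory.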
So much for equilibrium physics. If the system is changed or disturbed in a small way (HC: this sounds vague, but there are actual mathematical measures to make this rigorous), there are near-equilibrium tools to handle such situations (called linear response theory). On the other hand, if the system is driven, i.e., we add or remove energy, then the rules of equilibrium stat. mech. do not apply and the nice Boltzmann distribution can no longer be used. For many years, physicists handled such situations on a case-by-case basis and there was no unified way of handling different types of systems. This is where Large Deviation Theory (LDT) comes in. It is, basically, the first unified approach to equilibrium and nonequilibrium statistical mechanics and allows both to be treated on an equal footing.
Fig. (3): Illustrates the ideas of configurations, order parameter and counting statistics needed to understand equilibrium statistical mechanics (see text).
Before we continue, we have to discuss some concepts in greater detail. Returning to our spherical chickens, the image on the left in Fig. (3) shows a time series of configurations of chickens (after sufficient time has passed to reach the steady state, where long-time behavior dominates and fluctuations related to the initial condition have been eliminated). Note that the full time series of configurations is called a trajectory. Now, for each configuration of chickens we can measure different properties. In this example, an obvious one to measure is the clustering of chickens. Notice how, although the chickens move around over time, they remain clustered to varying degrees. We could measure the degree of clustering via an order parameter, which is just a function that takes a configuration and returns a number. This is illustrated via the fancy 'O' with a hat and parentheses. We can tabulate the values this function returns across all the configurations and then plot the probability of each clustering value, as shown in the second image of Fig. (3). If the process follows the rules of equilibrium stat. mech. then this distribution function is known a priori -- it is the Boltzmann distribution (HC: it will more generally be a weighted Boltzmann distribution. The key point is that we know exactly what form it will take, no matter how exotic the system).
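The "configuration in, number out" idea can be sketched in a few lines. The particular clustering measure below (mean distance from the center of mass) is a hypothetical choice of mine, and the configurations are drawn uniformly just for illustration -- in a real equilibrium calculation they would carry Boltzmann weights:

```python
import numpy as np

rng = np.random.default_rng(2)

def order_parameter(config):
    # A hypothetical clustering measure: mean distance of the particles
    # from their center of mass (smaller = more tightly clustered).
    # Any function that maps a configuration to a single number would do.
    center = config.mean(axis=0)
    return np.mean(np.linalg.norm(config - center, axis=1))

# an ensemble of 5000 configurations of 50 chickens in a 2D unit box
configs = rng.uniform(0.0, 1.0, size=(5000, 50, 2))
values = np.array([order_parameter(c) for c in configs])

# tabulate the values into a normalized histogram -- the probability
# distribution of order-parameter values, as in the second image of Fig. (3)
probs, edges = np.histogram(values, bins=30, density=True)
print(values.shape, np.isclose(np.sum(probs * np.diff(edges)), 1.0))
```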
Fig. (4): Swapping time dependent configurations for an ensemble of configurations (see text).
There are some additional technical ideas I have to convey before we can continue to nonequilibrium stat. mech. This is technical and requires some abstract thinking, but the first concept is what is known as an ensemble. Recall that in the context of equilibrium stat. mech., I discussed how to obtain a probability distribution of order-parameter values (see Fig. 3). This was done by applying the order-parameter function to configurations generated as a function of time (after the system had sufficient time to relax into its long-time behavior). Now, since equilibrium configurations are not time dependent, we might as well remove the time dependence (again, after the system has relaxed) and just consider all configurations generated according to their probabilities -- an ensemble of configurations. These probabilities are decided by the dynamics encoded in the system, typically via the energetics described by the Hamiltonian, which are nothing but transition probabilities from one configuration to another (this is too advanced for this discussion, but I mention it so that it is clear that the probabilities are not arbitrary). This idea is illustrated in Fig. (4). By the way, this "casual" swapping of time-dependent configurations for an ensemble is not trivial and is only possible under the ergodicity assumption: given sufficient time the system will visit all configurations.
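Ergodicity can be seen in action in a tiny toy model (a hypothetical 3-configuration Markov chain of my own construction, not anything from the figures). The "ensemble" answer is the stationary distribution computed directly from the transition probabilities; the "time" answer is the fraction of time one long trajectory spends in each configuration. Ergodicity says the two agree:

```python
import numpy as np

rng = np.random.default_rng(3)

# a hypothetical system with 3 configurations; P[a, b] is the probability
# of transitioning from configuration a to configuration b (rows sum to 1)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# ensemble view: the stationary distribution pi solves pi @ P = pi
# (the left eigenvector of P with eigenvalue 1, normalized to sum to 1)
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# time view: follow one long trajectory and count visits to each configuration
state = 0
counts = np.zeros(3)
for _ in range(100000):
    state = rng.choice(3, p=P[state])
    counts[state] += 1
freq = counts / counts.sum()

# ergodicity: time-averaged visit frequencies match the ensemble distribution
print(np.allclose(freq, pi, atol=0.02))
```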