Physicists Rewrite the Fundamental Law That Leads to Disorder
Introduction
In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.
But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).
Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?
A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information.
Quantum Inevitability
Thermodynamics was conceived in the early 19th century to describe the flow of heat and the production of work. The need for such a theory was urgently felt as steam power drove the Industrial Revolution, and engineers wanted to make their devices as efficient as possible.
In the end, thermodynamics wasn’t much help in making better engines and machinery. Instead, it became one of the central pillars of modern physics, providing criteria that govern all processes of change.
Classical thermodynamics has only a handful of laws, of which the most fundamental are the first and second. The first says that energy is always conserved; the second says that heat always flows from hot to cold. More commonly this is expressed in terms of entropy, which must increase overall in any process of change. Entropy is loosely equated with disorder, but the Austrian physicist Ludwig Boltzmann formulated it more rigorously as a quantity related to the total number of microstates a system has: how many equivalent ways its particles can be arranged.
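Written out, Boltzmann's definition is compact. The entropy S of a given macroscopic state is fixed by the number W of microstates compatible with it:

\[ S = k_B \ln W, \]

where k_B is Boltzmann's constant. The more ways there are to rearrange the particles without changing the macroscopic picture, the higher the entropy.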
The second law appears to show why change happens in the first place. At the level of individual particles, the classical laws of motion can be reversed in time. But the second law implies that change must happen in a way that increases entropy. This directionality is widely considered to impose an arrow of time. In this view, time seems to flow from past to future because the universe began — for reasons not fully understood or agreed on — in a low-entropy state and is heading toward one of ever higher entropy. The implication is that eventually heat will be spread completely uniformly and there will be no driving force for further change — a depressing prospect that scientists of the mid-19th century called the heat death of the universe.
Boltzmann’s microscopic description of entropy seems to explain this directionality. For a many-particle system, the disordered, higher-entropy arrangements vastly outnumber the ordered, lower-entropy ones, so molecular interactions are far more likely to end up producing them. The second law then seems to be just about statistics: It’s a law of large numbers. In this view, there’s no fundamental reason why entropy can’t decrease — why, for example, all the air molecules in your room can’t congregate by chance in one corner. It’s just extremely unlikely.
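A back-of-the-envelope count shows just how unlikely. If each of N molecules independently has a 50-50 chance of being in the left half of the room, the probability of finding all of them there at once is

\[ P = \left(\tfrac{1}{2}\right)^{N} = 2^{-N}, \]

which is already about 10^{-30} for a mere N = 100 molecules; for the roughly 10^{27} molecules in a real room, the exponent becomes astronomically larger.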
Yet this probabilistic statistical physics leaves some questions hanging. It directs us toward the most probable microstates in a whole ensemble of possible states and forces us to be content with taking averages across that ensemble.
But the laws of classical physics are deterministic — they allow only a single outcome for any starting point. Where, then, can that hypothetical ensemble of states enter the picture at all, if only one outcome is ever possible?
David Deutsch, a physicist at Oxford, has for several years been seeking to avoid this dilemma by developing a theory of (as he puts it) “a world in which probability and randomness are totally absent from physical processes.” His project, on which Marletto is now collaborating, is called constructor theory. It aims to establish not just which processes probably can and can’t happen, but which are possible and which are forbidden outright.
Constructor theory aims to express all of physics in terms of statements about possible and impossible transformations. It echoes the way thermodynamics itself began, in that it considers change in the world as something produced by “machines” (constructors) that work in a cyclic fashion, following a pattern like that of the famous Carnot cycle, proposed in the 19th century to describe how engines perform work. The constructor is rather like a catalyst, facilitating a process and being returned to its original state at the end.
“Say you have a transformation like building a house out of bricks,” said Marletto. “You can think of a number of different machines that can achieve this, to different accuracies. All of these machines are constructors, working in a cycle” — they return to their original state when the house is built.
But just because a machine for conducting a certain task might exist, that doesn’t mean it can also undo the task. A machine for building a house might not be capable of dismantling it. This makes the operation of the constructor different from the operation of the dynamical laws of motion describing the movements of the bricks, which are reversible.
The reason for the irreversibility, said Marletto, is that for most complex tasks, a constructor is geared to a given environment. It requires some specific information from the environment relevant to completing that task. But the reverse task will begin with a different environment, so the same constructor won’t necessarily work. “The machine is specific to the environment it is working on,” she said.
Recently, Marletto, working with the quantum theorist Vlatko Vedral at Oxford and colleagues in Italy, showed that constructor theory does identify processes that are irreversible in this sense — even though everything happens according to quantum mechanical laws that are themselves perfectly reversible. “We show that there are some transformations for which you can find a constructor for one direction but not the other,” she said.
The researchers considered a transformation involving the states of quantum bits (qubits), which can exist in one of two states or in a combination, or superposition, of both. In their model, a single qubit B may be transformed from some initial, perfectly known state B1 to a target state B2 when it interacts with other qubits by moving past a row of them one qubit at a time. This interaction entangles the qubits: Their properties become interdependent, so that you can’t fully characterize one of the qubits unless you look at all the others too.
As the number of qubits in the row gets very large, it becomes possible to bring B into state B2 as accurately as you like, said Marletto. The process of sequential interactions of B with the row of qubits constitutes a constructor-like machine that transforms B1 to B2. In principle you can also undo the process, turning B2 back to B1, by sending B back along the row.
But what if, having done the transformation once, you try to reuse the array of qubits for the same process with a fresh B? Marletto and colleagues showed that if the number of qubits in the row is not very large and you use the same row repeatedly, the array becomes less and less able to produce the transformation from B1 to B2. But crucially, the theory also predicts that the row becomes even less able to do the reverse transformation from B2 to B1. The researchers have confirmed this prediction experimentally using photons for B and a fiber optic circuit to simulate a row of three qubits.
“You can approximate the constructor arbitrarily well in one direction but not the other,” Marletto said. There’s an asymmetry to the transformation, just like the one imposed by the second law. This is because the transformation takes the system from a so-called pure quantum state (B1) to a mixed one (B2, which is entangled with the row). A pure state is one for which we know all there is to be known about it. But when two objects are entangled, you can’t fully specify one of them without knowing everything about the other too. The fact is that it’s easier to go from a pure quantum state to a mixed state than vice versa — because the information in the pure state gets spread out by entanglement and is hard to recover. It’s comparable to trying to re-form a droplet of ink once it has dispersed in water, a process in which the irreversibility is imposed by the second law.
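That pure-to-mixed step is easy to sketch numerically. The snippet below is an illustrative toy model, not the team's actual construction: it assumes, for the sake of illustration, that B meets each row qubit exactly once through a "partial swap" interaction, and it tracks only B's reduced state.

```python
import numpy as np

# Toy "row of qubits" model: a qubit B, initially in the pure state |1>,
# interacts once with each qubit of a row, each initially in |0>.
# The interaction is a partial SWAP -- an assumption made for illustration.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)
I4 = np.eye(4, dtype=complex)

def partial_swap(eta):
    """Unitary that swaps the two qubits only partially (eta = strength)."""
    return np.cos(eta) * I4 + 1j * np.sin(eta) * SWAP

def collide(rho_B, rho_row, U):
    """Evolve B jointly with one fresh row qubit, then trace that qubit out."""
    joint = U @ np.kron(rho_B, rho_row) @ U.conj().T
    return np.trace(joint.reshape(2, 2, 2, 2), axis1=1, axis2=3)

def von_neumann_entropy(rho):
    """Entropy in bits: zero for a pure state, positive for a mixed one."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

B1 = np.array([[0, 0], [0, 1]], dtype=complex)         # B starts pure: |1><1|
row_qubit = np.array([[1, 0], [0, 0]], dtype=complex)  # each row qubit: |0><0|
U = partial_swap(0.4)

rho_B = B1
print(f"step 0: entropy of B = {von_neumann_entropy(rho_B):.3f}")
for step in range(1, 6):
    # Because B meets each row qubit only once, tracking reduced states is exact.
    rho_B = collide(rho_B, row_qubit, U)
    # B and the row all started pure and the dynamics is unitary, so any
    # entropy here signals that B has become entangled with the row: the
    # information about B's initial state has spread into correlations.
    print(f"step {step}: entropy of B = {von_neumann_entropy(rho_B):.3f}")
```

Run it and the entropy of B climbs from zero toward a full bit over a few collisions; undoing that spread would require precise control over every qubit in the row.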
So here the irreversibility is “just a consequence of the way the system dynamically evolves,” said Marletto. There’s no statistical aspect to it. Irreversibility is not just the most probable outcome but the inevitable one, governed by the quantum interactions of the components. “Our conjecture,” said Marletto, “is that thermodynamic irreversibility might stem from this.”
Demon in the Machine
There’s another way of thinking about the second law, though, that was first devised by James Clerk Maxwell, the Scottish scientist who pioneered the statistical view of thermodynamics along with Boltzmann. Without quite realizing it, Maxwell connected the thermodynamic law to the issue of information.
Maxwell was troubled by the theological implications of a cosmic heat death and of an inexorable rule of change that seemed to undermine free will. So in 1867 he sought a way to “pick a hole” in the second law. In his hypothetical scenario, a microscopic being (later, to his annoyance, called a demon) turns “useless” heat back into a resource for doing work. Maxwell had previously shown that in a gas at thermal equilibrium there is a distribution of molecular energies. Some molecules are “hotter” than others — they are moving faster and have more energy. But they are all mixed at random so there appears to be no way to make use of those differences.
Enter Maxwell’s demon. It divides the gas compartment in two, then installs a frictionless trapdoor between the halves. As molecules rattle about, the demon lets the “hot,” fast-moving ones pass through the trapdoor in one direction but not the other. Eventually the demon has a hot gas on one side and a cooler one on the other, and it can exploit the temperature gradient to drive some machine.
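The sorting trick can be caricatured in a few lines of code. This is a toy model only: it ignores the door mechanics and simply lets a hypothetical demon split the molecules by energy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Velocities of N molecules in thermal equilibrium: each component Gaussian,
# in units where kT/m = 1 (a Maxwell-Boltzmann velocity distribution).
N = 100_000
v = rng.normal(size=(N, 3))
energy = 0.5 * (v**2).sum(axis=1)

# The demon's rule (a toy version): molecules above the median energy are
# allowed through the trapdoor to one side; the rest stay on the other.
threshold = np.median(energy)
hot_side = energy[energy > threshold]
cold_side = energy[energy <= threshold]

# Mean kinetic energy plays the role of temperature: the demon has built a
# temperature difference using information alone, not thermodynamic work.
print(f"mean energy, hot side:  {hot_side.mean():.2f}")
print(f"mean energy, cold side: {cold_side.mean():.2f}")
```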
The demon has used information about the motions of molecules to apparently undermine the second law. Information is thus a resource that, just like a barrel of oil, can be used to do work. But as this information is hidden from us at the macroscopic scale, we can’t exploit it. It’s this ignorance of the microstates that compels classical thermodynamics to speak of averages and ensembles.
Almost a century later, physicists proved that Maxwell’s demon doesn’t subvert the second law in the long term, because the information it gathers must be stored somewhere, and any finite memory must eventually be wiped to make room for more. In 1961 the physicist Rolf Landauer showed that this erasure of information can never be accomplished without dissipating some minimal amount of heat, thus raising the entropy of the surroundings. So the second law is only postponed, not broken.
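Landauer's bound is tiny but strictly nonzero: erasing a single bit of memory in surroundings at temperature T must dissipate at least

\[ Q_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21} \ \text{joules at room temperature } (T = 300\ \text{K}). \]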
The informational perspective on the second law is now being recast as a quantum problem. That’s partly because of the perception that quantum mechanics is a more fundamental description — Maxwell’s demon treats the gas particles as classical billiard balls, essentially. But it also reflects the burgeoning interest in quantum information theory itself. We can do things with information using quantum principles that we can’t do classically. In particular, entanglement of particles enables information about them to be spread around and manipulated in nonclassical ways.
Crucially, the quantum informational approach suggests a way of getting rid of the troublesome statistical picture that bedevils the classical view of thermodynamics, where you have to take averages over ensembles of many different microstates. “The true novelty with quantum information came with the understanding that one can replace ensembles with entanglement with the environment,” said Carlo Maria Scandolo of the University of Calgary.
Taking recourse in an ensemble, he said, reflects the fact that we have only partial information about the state — it could be this microstate or that one, with different probabilities, and so we have to average over a probability distribution. But quantum theory offers another way to generate states of partial information: through entanglement. When a quantum system gets entangled with its environment, about which we can’t know everything, some information about the system itself is inevitably lost: It ends up in a mixed state, where you can’t know everything about it even in principle by focusing on just the system.
Then you are forced to speak in terms of probabilities not because there are things about the system you don’t know, but because some of that information is fundamentally unknowable. In this way, “probabilities arise naturally from entanglement,” said Scandolo. “The whole idea of getting thermodynamic behavior by considering the role of the environment works only as long as there is entanglement.”
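A textbook example, not specific to Scandolo's argument, makes the point. Two qubits in the entangled Bell state

\[ |\Phi\rangle = \tfrac{1}{\sqrt{2}}\big(|0\rangle_A |0\rangle_E + |1\rangle_A |1\rangle_E\big) \]

are, as a pair, in a perfectly known pure state. But the state of qubit A on its own, obtained by ignoring (tracing out) the environment qubit E, is the maximally mixed state

\[ \rho_A = \mathrm{Tr}_E\, |\Phi\rangle\langle\Phi| = \tfrac{1}{2}\,|0\rangle\langle 0| + \tfrac{1}{2}\,|1\rangle\langle 1|, \]

so a measurement on A alone yields 0 or 1 with probability 1/2 each: probabilities that come from the entanglement itself, not from some hidden fact we failed to learn.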
Those ideas have now been made precise. Working with Giulio Chiribella of the University of Hong Kong, Scandolo has proposed four axioms about quantum information that are required to obtain a “sensible thermodynamics” — that is, one not based on probabilities. The axioms describe constraints on the information in a quantum system that becomes entangled with its environment. In particular, everything that happens to the system plus environment is in principle reversible, just as is implied by the standard mathematical formulation of how a quantum system evolves in time.
As a consequence of these axioms, Scandolo and Chiribella show, uncorrelated systems always grow more correlated through reversible interactions. Correlations are what connect entangled objects: The properties of one are correlated with those of the other. They are measured by “mutual information,” a quantity that’s related to entropy. So a constraint on how correlations can change is also a constraint on entropy. If the entropy of the system decreases, the entropy of the environment must increase such that the sum of the two entropies can only increase or stay the same, but never decrease. In this way, Scandolo said, their approach derives the existence of entropy from the underlying axioms, rather than postulating it at the outset.
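The bookkeeping behind that statement can be sketched in a line or two (a standard identity, not the authors' full derivation). The mutual information between system A and environment E is defined in terms of the von Neumann entropy S:

\[ I(A\!:\!E) = S(A) + S(E) - S(AE). \]

A reversible (unitary) evolution leaves the joint entropy S(AE) unchanged, so if A and E start out uncorrelated, with I = 0 and hence S(A) + S(E) = S(AE), then afterward

\[ S(A') + S(E') = S(AE) + I(A'\!:\!E') \ \geq\ S(A) + S(E), \]

because mutual information can never be negative.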
Redefining Thermodynamics
One of the most versatile ways to understand this new quantum version of thermodynamics invokes so-called resource theories — which again speak about which transformations are possible and which are not. “A resource theory is a simple model for any situation in which the actions you can perform and the systems you can access are restricted for some reason,” said the physicist Nicole Yunger Halpern of the National Institute of Standards and Technology. (Scandolo has incorporated resource theories into his work too.)
Quantum resource theories adopt the picture of the physical world suggested by quantum information theory, in which there are fundamental limitations on which physical processes are possible. In quantum information theory these limitations are typically expressed as “no-go theorems”: statements that say “You can’t do that!” For example, it is fundamentally impossible to make a copy of an unknown quantum state, an idea called quantum no-cloning.
Resource theories have a few main ingredients. The operations that are allowed are called free operations. “Once you specify the free operations, you have defined the theory — and then you can start reasoning about which transformations are possible or not, and ask what are the optimal efficiencies with which we can perform these tasks,” said Yunger Halpern. A resource, meanwhile, is something that an agent can access to do something useful — it could be a pile of coal to fire up a furnace and power a steam engine. Or it could be extra memory that will allow a Maxwellian demon to subvert the second law for a little longer.
Quantum resource theories allow a kind of zooming in on the fine-grained details of the classical second law. We don’t need to think about huge numbers of particles; we can make statements about what is allowed among just a few of them. When we do this, said Yunger Halpern, it becomes clear that the classical second law (final entropy must be equal to or greater than initial entropy) is just a kind of coarse-grained sum of a whole family of inequality relationships. For instance, classically the second law says that you can transform a nonequilibrium state into one that is closer to thermal equilibrium. But “asking which of these states is closer to thermal is not a simple question,” said Yunger Halpern. To answer it, “we have to check a whole bunch of inequalities.”
In other words, in resource theories there seem to be a whole bunch of mini-second laws. “So there could be some transformations allowed by the conventional second law but forbidden by this more detailed family of inequalities,” said Yunger Halpern. For that reason, she added, “sometimes I feel like everyone [in this field] has their own second law.”
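One concrete family of this kind, drawn from the resource-theory literature rather than from Yunger Halpern's own words, replaces the single free energy of classical thermodynamics with a ladder of generalized free energies indexed by a parameter α ≥ 0:

\[ F_\alpha(\rho) = k_B T \, D_\alpha(\rho \,\|\, \gamma), \]

where γ is the thermal state and D_α is a Rényi relative entropy. Roughly speaking, a transformation assisted by a heat bath is allowed only if every one of these free energies decreases, F_α(ρ) ≥ F_α(σ) for all α; the familiar second law corresponds to the single case α = 1.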
The resource-theory approach, said physicist Markus Müller of the University of Vienna, “admits a fully mathematically rigorous derivation, without any conceptual or mathematical loose ends, of the thermodynamic laws and more.” He said that this approach involves “a reconsideration of what one really means by thermodynamics” — it is not so much about the average properties of large ensembles of moving particles, but about a game that an agent plays against nature to conduct a task efficiently with the available resources. In the end, though, it is still about information. The discarding of information — or the inability to keep track of it — is really the reason why the second law holds, Yunger Halpern said.
Hilbert’s Problem
All these efforts to rebuild thermodynamics and the second law recall a challenge laid down by the German mathematician David Hilbert. In 1900 he posed 23 outstanding problems in mathematics that he wanted to see solved. Item six in that list was “to treat, by means of axioms, those physical sciences in which already today mathematics plays an important part.” Hilbert was concerned that the physics of his day seemed to rest on rather arbitrary assumptions, and he wanted to see them made rigorous in the same way that mathematicians were attempting to derive fundamental axioms for their own discipline.
Some physicists today are still working on Hilbert’s sixth problem, attempting in particular to reformulate quantum mechanics and its more abstract version, quantum field theory, using axioms that are simpler and more physically transparent than the traditional ones. But Hilbert evidently had thermodynamics in mind too, referring to aspects of physics that use “the theory of probabilities” as among those ripe for reinvention.
Whether Hilbert’s sixth problem has yet been cracked for the second law seems to be a matter of taste. “I think Hilbert’s sixth problem is far from being completely solved, and I personally find it a very intriguing and important research direction in the foundations of physics,” said Scandolo. “There are still open problems, but I think they will be solved in the foreseeable future, provided enough time and energy are devoted to them.”
Maybe, though, the real value of re-deriving the second law lies not in satisfying Hilbert’s ghost but just in deepening our understanding of the law itself. As Einstein said, “A theory is the more impressive the greater the simplicity of its premises.” Yunger Halpern compares the motivation for working on the law to the reason literary scholars still reanalyze the plays and poems of Shakespeare: not because such new analysis is “more correct,” but because works this profound are an endless source of inspiration and insight.