
First Support for a Physics Theory of Life

Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.


Introduction

The biophysicist Jeremy England made waves in 2013 with a new theory that cast the origin of life as an inevitable outcome of thermodynamics. His equations suggested that under certain conditions, groups of atoms will naturally restructure themselves so as to burn more and more energy, facilitating the incessant dispersal of energy and the rise of “entropy” or disorder in the universe. England said this restructuring effect, which he calls dissipation-driven adaptation, fosters the growth of complex structures, including living things. The existence of life is no mystery or lucky break, he told Quanta in 2014, but rather follows from general physical principles and “should be as unsurprising as rocks rolling downhill.”

Since then, England, a 35-year-old associate professor at the Massachusetts Institute of Technology, has been testing aspects of his idea in computer simulations. The two most significant of these studies were published this month — the more striking result in the Proceedings of the National Academy of Sciences (PNAS) and the other in Physical Review Letters (PRL). The outcomes of both computer experiments appear to back England’s general thesis about dissipation-driven adaptation, though the implications for real life remain speculative.

“This is obviously a pioneering study,” Michael Lässig, a statistical physicist and quantitative biologist at the University of Cologne in Germany, said of the PNAS paper written by England and an MIT postdoctoral fellow, Jordan Horowitz. It’s “a case study about a given set of rules on a relatively small system, so it’s maybe a bit early to say whether it generalizes,” Lässig said. “But the obvious interest is to ask what this means for life.”

The paper strips away the nitty-gritty details of cells and biology and describes a simpler, simulated system of chemicals in which it is nonetheless possible for exceptional structure to spontaneously arise — the phenomenon that England sees as the driving force behind the origin of life. “That doesn’t mean you’re guaranteed to acquire that structure,” England explained. The dynamics of the system are too complicated and nonlinear to predict what will happen.

The simulation involved a soup of 25 chemicals that react with one another in myriad ways. Energy sources in the soup’s environment facilitate or “force” some of these chemical reactions, just as sunlight triggers the production of ozone in the atmosphere and the chemical fuel ATP drives processes in the cell. Starting with random initial chemical concentrations, reaction rates and “forcing landscapes” — rules that dictate which reactions get a boost from outside forces and by how much — the simulated chemical reaction network evolves until it reaches its final, steady state, or “fixed point.”
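The actual PNAS model is richer than any quick sketch, but a minimal toy version in Python (every species count, rate law and parameter below is an illustrative assumption, not taken from the paper) conveys the basic loop: draw random rates, boost a subset of reactions with an external "force," and integrate until the concentrations stop changing.

```python
import numpy as np

# Toy driven reaction network: N_SPECIES chemicals, random reversible
# conversions i <-> j, a subset of which is "forced" (boosted) by an
# external energy source. All parameters here are assumptions.
rng = np.random.default_rng(seed=0)
N_SPECIES, N_REACTIONS, DT, STEPS = 25, 60, 1e-3, 100_000

pairs = rng.integers(0, N_SPECIES, size=(N_REACTIONS, 2))  # (reactant, product)
k_fwd = rng.uniform(0.1, 1.0, N_REACTIONS)                 # forward rate constants
k_bwd = rng.uniform(0.1, 1.0, N_REACTIONS)                 # backward rate constants

# Simple "forcing landscape": a random 30% of reactions get a fixed boost.
forced = rng.random(N_REACTIONS) < 0.3
boost = np.where(forced, rng.uniform(1.0, 5.0, N_REACTIONS), 0.0)

x = rng.uniform(0.5, 1.5, N_SPECIES)   # random initial concentrations
i, j = pairs[:, 0], pairs[:, 1]
for _ in range(STEPS):
    flux = (k_fwd + boost) * x[i] - k_bwd * x[j]   # net rate of i -> j
    dx = np.zeros(N_SPECIES)
    np.add.at(dx, i, -flux)            # reactants consumed
    np.add.at(dx, j, +flux)            # products created
    x = np.clip(x + DT * dx, 0.0, None)

# At the fixed point: how vigorously do the forced reactions still cycle?
cycling = np.abs(((k_fwd + boost) * x[i] - k_bwd * x[j])[forced]).sum()
print(f"residual flux through forced reactions: {cycling:.4f}")
```

With simple linear rates like these the network always settles down somewhere; the interesting wrinkle in the real study, described below, is that the forcing itself shifts with the concentrations.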

Jeremy England, an associate professor of physics at the Massachusetts Institute of Technology, thinks he has found the physical mechanism underlying the origin of life. (Photo: Katherine Taylor for Quanta Magazine)

Often, the system settles into an equilibrium state, where it has a balanced concentration of chemicals and reactions that just as often go one way as the reverse. This tendency to equilibrate, like a cup of coffee cooling to room temperature, is the most familiar outcome of the second law of thermodynamics, which says that energy constantly spreads and the entropy of the universe always increases. (The second law is true because there are more ways for energy to be spread out among particles than to be concentrated, so as particles move around and interact, the odds favor their energy becoming increasingly shared.)
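Boltzmann's entropy formula makes that parenthetical counting argument concrete (this is standard statistical mechanics, not something from the new papers):

$$ S = k_B \ln W, $$

where $W$ counts the microscopic arrangements consistent with what we observe. For instance, $q$ indivisible energy quanta shared among $N$ oscillators can be arranged in $W = \binom{q+N-1}{q}$ ways: 10 quanta hoarded by a single oscillator admit only $W = 1$ arrangement, while the same 10 quanta spread over 10 oscillators admit $W = \binom{19}{10} = 92{,}378$, so the spread-out configuration is overwhelmingly more probable.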

But for some initial settings, the chemical reaction network in the simulation goes in a wildly different direction: In these cases, it evolves to fixed points far from equilibrium, where it vigorously cycles through reactions by harvesting the maximum energy possible from the environment. These cases “might be recognized as examples of apparent fine-tuning” between the system and its environment, Horowitz and England write, in which the system finds “rare states of extremal thermodynamic forcing.”

Living creatures also maintain steady states of extreme forcing: We are super-consumers who burn through enormous amounts of chemical energy, degrading it and increasing the entropy of the universe, as we power the reactions in our cells. The simulation emulates this steady-state behavior in a simpler, more abstract chemical system and shows that it can arise “basically right away, without enormous wait times,” Lässig said — indicating that such fixed points can be easily reached in practice.

Many biophysicists think something like what England is suggesting may well be at least part of life’s story. But whether England has identified the most crucial step in the origin of life depends to some extent on the question: What’s the essence of life? Opinions differ.

Form and Function

England, a prodigy by many accounts who spent time at Harvard, Oxford, Stanford and Princeton universities before landing on the faculty at MIT at 29, sees the essence of living things as the exceptional arrangement of their component atoms. “If I imagine randomly rearranging the atoms of the bacterium — so I just take them, I label them all, I permute them in space — I’m presumably going to get something that is garbage,” he said earlier this month. “Most arrangements [of atomic building blocks] are not going to be the metabolic powerhouses that a bacterium is.”

It’s not easy for a group of atoms to unlock and burn chemical energy. To perform this function, the atoms must be arranged in a highly unusual form. According to England, the very existence of a form-function relationship “implies that there’s a challenge presented by the environment that we see the structure of the system as meeting.”

But how and why do atoms acquire the particular form and function of a bacterium, with its optimal configuration for consuming chemical energy? England hypothesizes that it’s a natural outcome of thermodynamics in far-from-equilibrium systems.

The Nobel Prize-winning physical chemist Ilya Prigogine pursued similar ideas in the 1960s, but his methods were limited. Traditional thermodynamic equations work well only for studying near-equilibrium systems like a gas that is slowly being heated or cooled. Systems driven by powerful external energy sources have much more complicated dynamics and are far harder to study.

The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium. England’s “novel angle,” said Sara Walker, a theoretical physicist and origins-of-life specialist at Arizona State University, has been to apply the fluctuation theorems “to problems relevant to the origins of life. I think he’s probably the only person doing that in any kind of rigorous way.”
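In compact form (these are the standard statements from the physics literature, with $W$ the work done on the system and $\Delta F$ the free-energy difference between the end states):

$$ \frac{P_F(+W)}{P_R(-W)} = e^{(W - \Delta F)/k_B T} \quad \text{(Crooks)}, \qquad \left\langle e^{-W/k_B T} \right\rangle = e^{-\Delta F/k_B T} \quad \text{(Jarzynski)}. $$

The forward-to-reverse odds grow exponentially with the dissipated work $W - \Delta F$, which is why strongly driven, strongly dissipating processes look so irreversible, even arbitrarily far from equilibrium.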

Coffee cools down because nothing is heating it up, but England’s calculations suggested that groups of atoms that are driven by external energy sources can behave differently: They tend to start tapping into those energy sources, aligning and rearranging so as to better absorb the energy and dissipate it as heat. He further showed that this statistical tendency to dissipate energy might foster self-replication. (As he explained it in 2014, “A great way of dissipating more is to make more copies of yourself.”) England sees life, and its extraordinary confluence of form and function, as the ultimate outcome of dissipation-driven adaptation and self-replication.
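The self-replication result can be stated as a compact bound (notation simplified here from England's 2013 paper on the statistical physics of self-replication): a replicator with growth rate $g$ and decay rate $\delta$ must produce total entropy of at least

$$ \beta \langle \Delta q \rangle + \Delta s_{\text{int}} \;\geq\; \ln \frac{g}{\delta}, $$

where $\Delta q$ is the heat dumped into the surroundings, $\Delta s_{\text{int}}$ is the replicator's internal entropy change and $\beta = 1/k_B T$. Read in reverse, the more heat a structure can dissipate, the more headroom it has to replicate quickly relative to how fast it falls apart.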

However, even with the fluctuation theorems in hand, the conditions on early Earth or inside a cell are far too complex to predict from first principles. That’s why the ideas have to be tested in simplified, computer-simulated environments that aim to capture the flavor of reality.

In the PRL paper, England and his coauthors Tal Kachman and Jeremy Owen of MIT simulated a system of interacting particles. They found that the system increases its energy absorption over time by forming and breaking bonds in order to better resonate with a driving frequency. “This is in some sense a little bit more basic as a result” than the PNAS findings involving the chemical reaction network, England said.
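The intuition behind "resonating" comes from the textbook driven, damped oscillator (a single-particle analogy, not the paper's many-particle model): driven at frequency $\omega$ by a force $F_0 \cos(\omega t)$, it absorbs average power

$$ \langle P \rangle = \frac{F_0^2}{2m} \cdot \frac{\gamma \omega^2}{(\omega_0^2 - \omega^2)^2 + \gamma^2 \omega^2}, $$

which peaks exactly when the drive matches the natural frequency, $\omega = \omega_0$. In the simulation, forming and breaking bonds effectively retunes $\omega_0$ toward $\omega$, pushing the system up this absorption peak.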

Crucially, in the PNAS study, he and Horowitz created a challenging environment where special configurations would be required to tap into the available energy sources, just as the special atomic arrangement of a bacterium enables it to metabolize energy. In the simulated environment, external energy sources boosted (or “forced”) certain chemical reactions in the reaction network. The extent of this forcing depended on the concentrations of the different chemical species. As the reactions progressed and the concentrations evolved, the amount of forcing would change abruptly. Such a rugged forcing landscape made it difficult for the system “to find combinations of reactions which are capable of extracting free energy optimally,” explained Jeremy Gunawardena, a mathematician and systems biologist at Harvard Medical School.
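In the toy network sketched earlier, this ruggedness could be mimicked by replacing the fixed boost array with a function that jumps discontinuously whenever the concentrations cross coarse-grained bin boundaries (again an illustrative assumption, not the paper's construction):

```python
import numpy as np

def rugged_boost(x, n_reactions=60, frac_forced=0.3):
    """Forcing landscape that changes abruptly with the concentrations."""
    bins = tuple(int(c * 4) for c in x)                      # coarse-grain each concentration
    local = np.random.default_rng(abs(hash(bins)) % 2**32)   # one fixed landscape per bin
    forced = local.random(n_reactions) < frac_forced
    return np.where(forced, local.uniform(1.0, 5.0, n_reactions), 0.0)
```

Calling `boost = rugged_boost(x)` inside the integration loop makes the landscape lurch under the system as it evolves, so combinations of reactions that extract free energy optimally become rare and hard to find, as Gunawardena describes.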

Yet when the researchers let the chemical reaction networks play out in such an environment, the networks seemed to become fine-tuned to the landscape. A randomized set of starting points went on to achieve rare states of vigorous chemical activity and extreme forcing four times more often than would be expected. And when these outcomes happened, they happened dramatically: These chemical networks ended up in the 99th percentile in terms of how much forcing they experienced compared with all possible outcomes. As these systems churned through reaction cycles and dissipated energy in the process, the basic form-function relationship that England sees as essential to life set in.

Information Processors

Experts said an important next step for England and his collaborators would be to scale up their chemical reaction network and to see if it still dynamically evolves to rare fixed points of extreme forcing. They might also try to make the simulation less abstract by basing the chemical concentrations, reaction rates and forcing landscapes on conditions that might have existed in tidal pools or near volcanic vents in early Earth’s primordial soup (but replicating the conditions that actually gave rise to life is guesswork). Rahul Sarpeshkar, a professor of engineering, physics and microbiology at Dartmouth College, said, “It would be nice to have some concrete physical instantiation of these abstract constructs.” He hopes to see the simulations re-created in real experiments, perhaps using biologically relevant chemicals and energy sources such as glucose.

But even if the fine-tuned fixed points can be observed in settings that are increasingly evocative of life and its putative beginnings, some researchers see England’s overarching thesis as “necessary but not sufficient” to explain life, as Walker put it, because it cannot account for what many see as the true hallmark of biological systems: their information-processing capacity. From simple chemotaxis (the ability of bacteria to move toward nutrient concentrations or away from poisons) to human communication, life-forms take in and respond to information about their environment.

Video: David Kaplan explains how the law of increasing entropy could drive random bits of matter into the stable, orderly structures of life. (Filming by Tom Hurwitz and Richard Fleming; editing and motion graphics by Tom McNamara; music by Podington Bear.)

To Walker’s mind, this distinguishes us from other systems that fall under the umbrella of England’s dissipation-driven adaptation theory, such as Jupiter’s Great Red Spot. “That’s a highly non-equilibrium dissipative structure that’s existed for at least 300 years, and it’s quite different from the non-equilibrium dissipative structures that are existing on Earth right now that have been evolving for billions of years,” she said. Understanding what distinguishes life, she added, “requires some explicit notion of information that takes it beyond the non-equilibrium dissipative structures-type process.” In her view, the ability to respond to information is key: “We need chemical reaction networks that can get up and walk away from the environment where they originated.”

Gunawardena noted that aside from the thermodynamic properties and information-processing abilities of life-forms, they also store and pass down genetic information about themselves to their progeny. The origin of life, Gunawardena said, “is not just emergence of structure, it’s the emergence of a particular kind of dynamics, which is Darwinian. It’s the emergence of structures that reproduce. And the ability for the properties of those objects to influence their reproductive rates. Once you have those two conditions, you’re basically in a situation where Darwinian evolution kicks in, and to biologists, that’s what it’s all about.”

Eugene Shakhnovich, a professor of chemistry and chemical biology at Harvard who supervised England’s undergraduate research, sharply emphasized the divide between his former student’s work and questions in biology. “He started his scientific career in my lab and I really know how capable he is,” Shakhnovich said, but “Jeremy’s work represents potentially interesting exercises in non-equilibrium statistical mechanics of simple abstract systems.” Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”

Even if England is on the right track about the physics, biologists want more particulars — such as a theory of what the primitive “protocells” were that evolved into the first living cells, and how the genetic code arose. England completely agrees that his findings are silent on such topics. “In the short term, I’m not saying this tells me a lot about what’s going on in a biological system, nor even claiming that this is necessarily telling us where life as we know it came from,” he said. Both questions are “a fraught mess” based on “fragmentary evidence,” ones he said he is “inclined to steer clear of for now.” Rather, he is suggesting that in the tool kit of the first life- or proto-life-forms, “maybe there’s more that you can get for free, and then you can optimize it using the Darwinian mechanism.”

Sarpeshkar seemed to see dissipation-driven adaptation as the opening act of life’s origin story. “What Jeremy is showing is that as long as you can harvest energy from your environment, order will spontaneously arise and self-tune,” he said. Living things have gone on to do a lot more than England and Horowitz’s chemical reaction network does, he noted. “But this is about how did life first arise, perhaps — how do you get order from nothing.”
