The New Thermodynamic Understanding of Clocks
Introduction
In 2013, a master’s student in physics named Paul Erker went combing through textbooks and papers looking for an explanation of what a clock is. “Time is what a clock measures,” Albert Einstein famously quipped; Erker hoped a deeper understanding of clocks might inspire new insights about the nature of time.
But he found that physicists hadn’t given much thought to the fundamentals of timekeeping. They tended to take time information for granted. “I was very unsatisfied by the way the literature so far dealt with clocks,” Erker said recently.
The budding physicist started thinking for himself about what a clock is — what it takes to tell time. He had some initial ideas. Then in 2015, he moved to Barcelona for his doctorate. There, a whole cadre of physicists took up Erker’s question, led by a professor named Marcus Huber. Huber, Erker and their colleagues specialized in quantum information theory and quantum thermodynamics, disciplines concerning the flow of information and energy. They realized that these theoretical frameworks, which undergird emerging technologies like quantum computers and quantum engines, also provided the right language for describing clocks.
“It occurred to us that actually a clock is a thermal machine,” Huber explained over Zoom, his dark blond dreadlocks draped over a black T-shirt. Like an engine, a clock harnesses the flow of energy to do work, producing exhaust in the process. Engines use energy to propel; clocks use it to tick.
Over the past five years, through studies of the simplest conceivable clocks, the researchers have discovered the fundamental limits of timekeeping. They’ve mapped out new relationships between accuracy, information, complexity, energy and entropy — the quantity whose incessant rise in the universe is closely associated with the arrow of time.
These relationships were purely theoretical until this spring, when the experimental physicist Natalia Ares and her team at the University of Oxford reported measurements of a nanoscale clock that strongly support the new thermodynamic theory.
Nicole Yunger Halpern, a quantum thermodynamicist at Harvard University who was not involved in the recent clock work, called it “foundational.” She thinks the findings could lead to the design of optimally efficient, autonomous quantum clocks for controlling operations in future quantum computers and nanorobots.
The new perspective on clocks has already provided fresh fodder for discussions of time itself. “This line of work does grapple, in a fundamental way, with the role of time in quantum theory,” Yunger Halpern said.
Gerard Milburn, a quantum theorist at the University of Queensland in Australia who wrote a review paper last year about the research on clock thermodynamics, said, “I don’t think people appreciate just how fundamental it is.”
What a Clock Is
The first thing to note is that pretty much everything is a clock. Garbage announces the days with its worsening smell. Wrinkles mark the years. “You could tell time by measuring how cold your coffee has gotten on your coffee table,” said Huber, who is now at the Technical University of Vienna and the Institute for Quantum Optics and Quantum Information Vienna.
Early in their conversations in Barcelona, Huber, Erker and their colleagues realized that a clock is anything that undergoes irreversible changes: changes in which energy spreads out among more particles or into a broader area. Energy tends to dissipate — and entropy, a measure of its dissipation, tends to increase — simply because there are far, far more ways for energy to be spread out than for it to be highly concentrated. This numerical asymmetry, and the curious fact that energy started out ultra-concentrated at the beginning of the universe, are why energy now moves toward increasingly dispersed arrangements, one cooling coffee cup at a time.
Not only do energy’s strong spreading tendency and entropy’s resulting irreversible rise seem to account for time’s arrow; according to Huber and company, they also account for clocks. “The irreversibility is really fundamental,” Huber said. “This shift in perspective is what we wanted to explore.”
Coffee doesn’t make a great clock. As with most irreversible processes, its interactions with the surrounding air happen stochastically. This means you have to average over long stretches of time, encompassing many random collisions between coffee and air molecules, in order to accurately estimate a time interval. This is why we don’t refer to coffee, or garbage or wrinkles, as clocks.
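To see why, consider a toy model (not from the papers): timing an interval by counting random molecular collisions that arrive at a known average rate. A minimal Python sketch, with the rate chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 100.0  # assumed average collisions per second (illustrative)

# Estimate an elapsed interval t by counting random (Poisson) events
# and dividing by the known rate: t_hat = count / rate.
for t in [0.1, 1.0, 10.0, 100.0]:
    counts = rng.poisson(rate * t, size=10_000)   # many trial runs
    t_hat = counts / rate
    rel_err = t_hat.std() / t                     # relative timing error
    print(f"t = {t:6.1f} s  ->  relative error ~ {rel_err:.3f}")

# The error shrinks as 1/sqrt(rate * t): long intervals, averaging over
# many random events, can be timed well, but short ones are badly
# resolved. A cooling coffee cup is a poor stopwatch.
```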
We reserve that name, the clock thermodynamicists realized, for objects whose timekeeping ability is enhanced by periodicity: some mechanism that spaces out the intervals between the moments when irreversible processes occur. A good clock doesn’t just change. It ticks.
The more regular the ticks, the more accurate the clock. In their first paper, published in Physical Review X in 2017, Erker, Huber and co-authors showed that better timekeeping comes at a cost: The greater a clock’s accuracy, the more energy it dissipates and the more entropy it produces in the course of ticking.
“A clock is a flow meter for entropy,” said Milburn.
They found that an ideal clock — one that ticks with perfect periodicity — would burn an infinite amount of energy and produce infinite entropy, which isn’t possible. Thus, the accuracy of clocks is fundamentally limited.
Indeed, in their paper, Erker and company studied the accuracy of the simplest clock they could think of: a quantum system consisting of three atoms. A “hot” atom connects to a heat source, a “cold” atom couples to the surrounding environment, and a third atom that’s linked to both of the others “ticks” by undergoing excitations and decays. Energy enters the system from the heat source, driving the ticks, and entropy is produced when waste energy gets released into the environment.
The researchers calculated that the ticks of this three-atom clock become more regular the more entropy the clock produces. This relationship between clock accuracy and entropy “intuitively made sense to us,” Huber said, in light of the known connection between entropy and information.
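The authors’ three-atom model is fully quantum, but the same tradeoff can be illustrated classically using the thermodynamic uncertainty relation, a related result from stochastic thermodynamics rather than the paper’s own derivation. In the sketch below, a clock is a biased random walker whose every net forward hop is a tick dissipating entropy ΔS; the rates, duration and trial counts are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def clock_accuracy(delta_s, k_plus=1.0, t=2000.0, trials=4000):
    """Biased hopping clock: forward rate k+, backward rate
    k- = k+ * exp(-delta_s), where delta_s is the entropy
    (in units of k_B) dissipated per forward tick."""
    k_minus = k_plus * np.exp(-delta_s)
    n_fwd = rng.poisson(k_plus * t, size=trials)
    n_bwd = rng.poisson(k_minus * t, size=trials)
    ticks = n_fwd - n_bwd                     # net tick count per trial
    accuracy = ticks.mean()**2 / ticks.var()  # ticks until ~1 tick of drift
    entropy = ticks.mean() * delta_s          # mean total entropy produced
    return accuracy, entropy

for ds in [0.5, 1.0, 2.0, 4.0]:
    acc, ent = clock_accuracy(ds)
    # Thermodynamic uncertainty relation: accuracy <= entropy / 2
    print(f"dS/tick = {ds:3.1f}  accuracy = {acc:7.1f}  bound = {ent/2:7.1f}")
```

Running this shows the measured accuracy climbing with entropy production while always staying under the thermodynamic bound: more dissipation buys more regular ticks.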
In precise terms, entropy is a measure of the number of possible arrangements that a system of particles can be in. These possibilities grow when energy is spread more evenly among more particles, which is why entropy rises as energy disperses. Moreover, in his 1948 paper that founded information theory, the American mathematician Claude Shannon showed that entropy also inversely tracks with information: The less information you have about, say, a data set, the higher its entropy, since there are more possible states the data can be in.
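Shannon’s link between entropy and missing information is easy to compute directly. A minimal sketch using the standard textbook formula, not anything specific to the clock papers:

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum p_i * log2(p_i), in bits; zero-probability terms drop out."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# The more sharply we know the state, the lower the entropy:
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: full information
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: no information
```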
“There’s this deep connection between entropy and information,” Huber said, and so any limit on a clock’s entropy production should naturally correspond to a limit on information — including, he said, “information about the time that has passed.”
In another paper published in Physical Review X earlier this year, the theorists expanded on their three-atom clock model by adding complexity — essentially extra hot and cold atoms connected to the ticking atom. They showed that this additional complexity enables a clock to concentrate the probability of a tick happening into narrower and narrower windows of time, thereby increasing the regularity and accuracy of the clock.
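The effect of added complexity can be mimicked with a classical toy analogue, not the paper’s quantum network: if one tick completes only after d sequential random sub-steps, the total tick time follows a gamma (Erlang) distribution whose relative spread shrinks as 1 over the square root of d. The stage counts and mean tick time below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

def tick_jitter(d, mean_tick=1.0, trials=100_000):
    """One tick = d sequential exponential sub-steps, scaled so the
    mean tick time stays fixed at mean_tick regardless of d."""
    stages = rng.exponential(mean_tick / d, size=(trials, d))
    ticks = stages.sum(axis=1)
    return ticks.std() / ticks.mean()  # relative width of the tick window

for d in [1, 4, 16, 64]:
    print(f"{d:3d} stages -> relative jitter {tick_jitter(d):.3f} "
          f"(theory: {1/np.sqrt(d):.3f})")
```

More stages concentrate the tick into a narrower window of time, echoing how the extra atoms sharpen the quantum clock.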
In short, it’s the irreversible rise of entropy that makes timekeeping possible, while both periodicity and complexity enhance clock performance. But until 2019, it wasn’t clear how to verify the team’s equations, or what, if anything, simple quantum clocks had to do with the ones on our walls.
Measuring Ticks
At a conference dinner that year, Erker sat near Anna Pearson, a graduate student at Oxford who had given a talk he’d found interesting earlier that day. Pearson worked on studies of a 50-nanometer-thick vibrating membrane. In her talk, she remarked offhandedly that the membrane could be stimulated with white noise — a random mix of radio frequencies. The frequencies that resonated with the membrane drove its vibrations.
To Erker, the noise seemed like a heat source, and the vibrations like ticks of a clock. He suggested a collaboration.
Pearson’s supervisor, Ares, was enthusiastic. She’d already discussed with Milburn the possibility that the membrane could behave as a clock, but she hadn’t heard about the new thermodynamic relationships derived by the other theorists, including the fundamental limit on accuracy. “We said, ‘We can definitely measure that!’” Ares said. “‘We can measure the entropy production! We can measure the ticks!’”
The vibrating membrane isn’t a quantum system, but it’s small and simple enough to allow precise tracking of its motion and energy use. “We can tell from the energy dissipation in the circuit itself how much the entropy changes,” Ares said.
She and her team set out to test the key prediction from Erker and company’s 2017 paper: that there should be a linear relationship between entropy production and accuracy. It was unclear whether the relationship would hold for a larger, classical clock, like the vibrating membrane. But when the data rolled in, “we saw the first plots [and] we thought, wow, there is this linear relationship,” Huber said.
The regularity of the membrane clock’s vibrations directly tracked with how much energy entered the system and how much entropy it produced. The findings suggest that the thermodynamic equations the theorists derived may hold universally for timekeeping devices.
Most clocks don’t approach these fundamental limits; they burn far more than the minimum energy to tell time. Even the world’s most accurate atomic clocks, like those operated at the JILA institute in Boulder, Colorado, “are far from the fundamental limit of minimum energy,” said Jun Ye, a physicist at JILA. But, Ye said, “we clockmakers are trying to use quantum information science to build more precise and accurate clocks,” and so fundamental limits may become important in the future. Yunger Halpern agrees, noting that efficient, autonomous clocks may eventually govern the timing of operations inside quantum computers, removing the need for external control.
Practicalities aside, Erker’s hope has stayed the same since his student days. “The ultimate goal would be to understand what time is,” he said.
A Smooth Order
One major aspect of the mystery of time is the fact that it doesn’t play the same role in quantum mechanics as other quantities, like position or momentum; physicists say there are no “time observables” — no exact, intrinsic time stamps on quantum particles that can be read off by measurements. Instead, time is a smoothly varying parameter in the equations of quantum mechanics, a reference against which to gauge the evolution of other observables.
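Concretely, in the Schrödinger equation, time enters only as the label $t$ on the evolving state, while measurable quantities correspond to operators; there is no operator $\hat{t}$ alongside position $\hat{x}$ or momentum $\hat{p}$:

$$ i\hbar \, \frac{d}{dt} \, |\psi(t)\rangle = \hat{H} \, |\psi(t)\rangle $$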
Physicists have struggled to understand how the time of quantum mechanics can be reconciled with the notion of time as the fourth dimension in Einstein’s general theory of relativity, the current description of gravity. Modern attempts to reconcile quantum mechanics and general relativity often treat the four-dimensional space-time fabric of Einstein’s theory as emergent, a kind of hologram cooked up by more abstract quantum information. If so, both time and space ought to be approximate concepts.
The clock studies are suggestive in this regard: they show that time can only ever be measured imperfectly. The “big question,” said Huber, is whether the fundamental limit on the accuracy of clocks reflects a fundamental limit on the smooth flow of time itself — in other words, whether stochastic events like collisions of coffee and air molecules are what time ultimately is.
“What we’ve done is to show that even if time is a perfect, classical and smooth parameter governing time evolution of quantum systems,” Huber said, “we would only be able to track its passage” imperfectly, through stochastic, irreversible processes. This invites a question, he said: “Could it be that time is an illusion and smooth time is an emergent consequence of us trying to put events into a smooth order? It is certainly an intriguing possibility that is not easily dismissed.”