
What Is Analog Computing?

You don’t need 0s and 1s to perform computations, and in some cases it’s better to avoid them.

The OME P2, released by a French company in 1952, is an example of an electronic analog computer. Its name is short for Opérateur Mathématique Électronique.

Damien Boureille

Introduction

Computing today is almost entirely digital. The vast informational catacombs of the internet, the algorithms that power AI, the screen you're reading this on: all of it runs on electronic circuits manipulating binary digits, 0 and 1, off and on. We live, it has been said, in the digital age.

But it’s not obvious why a system that operates using discrete chunks of information would be good at modeling our continuous, analog world. And indeed, for millennia humans have used analog computing devices to understand and predict the ebbs and flows of nature.

Among the earliest known analog computers is the Antikythera mechanism from ancient Greece, which used dozens of gears to predict eclipses and calculate the positions of the sun and moon. Slide rules, invented in the 17th century, executed the mathematical operations that would one day send men to the moon. (The abacus, however, doesn't count as analog: Its discrete "counters" make it one of the earliest digital computers.) And in the late 19th century, William Thomson, who later became Lord Kelvin, designed a machine that used shafts, cranks and pulleys to model the influence of celestial bodies on the tides. Its successors were used decades later to plan for the beach landings at Normandy on D-Day.

What do these devices have in common? They are all physical systems set up to obey the same mathematical equations that govern the phenomena you want to understand. Thomson's tide-calculating computer, for example, was inspired by 19th-century mathematical advances that turned the problem of predicting the tide into the evaluation of a long trigonometric sum. Calculating that sum by hand was both laborious and error-prone. The cranks and pulleys in Thomson's machine were configured so that spinning them produced an output identical to the value of the expression.
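To make that concrete, here is a minimal sketch in Python of the kind of sum the machine evaluated: the predicted tide height is a sum of cosine terms, one for each astronomical cycle. The constituent periods below are those of real tidal constituents, but the amplitudes, phases and mean water level are placeholder values, not fitted to any actual port.

```python
import math

# Illustrative harmonic constituents: (amplitude in meters, period in hours,
# phase in radians). The periods are real; the amplitudes and phases are
# placeholders, not fitted to any actual port.
CONSTITUENTS = [
    (0.50, 12.42, 0.0),   # M2: principal lunar semidiurnal
    (0.20, 12.00, 1.1),   # S2: principal solar semidiurnal
    (0.10, 25.82, 2.3),   # O1: lunar diurnal
]

def tide_height(t_hours, mean_level=2.0):
    """h(t) = h0 + sum of A * cos(2*pi*t/T + phi) over the constituents.

    Thomson's machine computed this same sum continuously: one pulley per
    term, geared to rotate at that term's frequency, with a wire summing
    the contributions into a single pen trace.
    """
    return mean_level + sum(
        A * math.cos(2 * math.pi * t_hours / T + phi)
        for A, T, phi in CONSTITUENTS
    )

# A day of hourly predictions.
for hour in range(24):
    print(f"{hour:2d}h  {tide_height(hour):.2f} m")
```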

From left: A piece of the Antikythera mechanism, which likely dates to the second century BCE. A reconstruction of a 14th-century astronomical clock known as the Astrarium of Giovanni Dondi dall'Orologio, which could predict the positions of the sun, the moon, and the five planets known at the time. William Thomson (later Lord Kelvin) designed this tide-predicting computer in the late 19th century.

From left: Hellenic National Archaeological Museum/ Hellenic Ministry of Culture; Museo della Scienza e della Tecnologia “Leonardo da Vinci”; British Association for the Advancement of Science (per Lord Kelvin)

Analog computing reached its apotheosis in the differential analyzer, first built by Vannevar Bush at the Massachusetts Institute of Technology in 1931. The analyzer used a complicated series of gears and shafts driven by electric motors. It could calculate a huge variety of differential equations — the kind of equation used to model physical phenomena. But to modify an equation, the machine had to be laboriously reconfigured by hand.
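For a sense of what "calculating a differential equation" involves, here is a rough digital stand-in for the analyzer's job: stepping a simple equation forward in time. The equation chosen (a damped oscillator) and all the constants are illustrative; the analyzer's wheel-and-disk integrators performed the equivalent accumulation continuously and mechanically.

```python
# A rough digital stand-in for what the differential analyzer did with
# wheel-and-disk integrators: solve a differential equation by repeated
# integration. Here, a damped oscillator, x'' = -k*x - c*x', is stepped
# forward with Euler's method. All constants are illustrative.

k, c = 4.0, 0.5      # stiffness and damping (made-up values)
x, v = 1.0, 0.0      # initial position and velocity
dt = 0.01            # time step

for step in range(1001):
    if step % 100 == 0:
        print(f"t = {step * dt:5.2f}   x = {x:+.4f}")
    a = -k * x - c * v   # acceleration given by the equation being modeled
    v += a * dt          # integrate acceleration into velocity
    x += v * dt          # integrate velocity into position
```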

When modern digital computing began in the late 1930s, it was clunky, expensive and in many ways inferior to the analog machines of the day. But it had compensating benefits: Digital computers were easier to program, and their results were more repeatable and often more accurate. And with the rise of the transistor and the subsequent advances fueled by Moore's law, digital processing soon took over.

But as our digital world has exploded, its costs have as well. Every switch of a digital bit takes a smidgen of energy. And new artificial intelligence systems require huge amounts of computing power. To take just one example, news reports have revealed that Microsoft and OpenAI are planning a $100 billion data center that would suck about 5 gigawatts of power. That’s roughly the output of five nuclear reactors.


After Vannevar Bush completed his differential analyzer, the idea spread. This one was built at the University of Cambridge in 1938.

University of Cambridge

Analog computing offers an alternative. The neural networks that power AI systems make predictions by repeatedly blasting through a sequence of multiplication and addition operations.

In an analog computer that uses electrical signals rather than gears and pulleys, those operations can happen physically: By Ohm's law, a voltage applied across a resistor produces a proportional current (a multiplication), and by Kirchhoff's current law, the currents flowing into a shared wire simply add up (an addition). A circuit built from carefully chosen resistors can therefore carry out a neural network's multiply-and-add steps at a significant power savings.
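The sketch below simulates that idea with assumed, illustrative numbers: each weight is treated as a conductance, each input as a voltage, and each output wire reads out a summed current. Simulated digitally, this is just a matrix-vector product; the point is that the analog circuit reaches the same answer in a single physical step.

```python
import numpy as np

# Simulating the analog trick with made-up numbers: treat each neural-network
# weight as a conductance G (the inverse of a resistance) and each input as a
# voltage V. Ohm's law multiplies (I = G * V); Kirchhoff's current law adds
# (currents arriving at a shared wire sum). Real circuits represent negative
# weights with pairs of resistors; that detail is ignored here.

weights = np.array([[0.2, -0.5, 0.1],    # conductances: one resistor per entry
                    [0.7,  0.3, -0.4]])
inputs = np.array([1.0, 0.5, -1.0])      # voltages applied to the input wires

# Each output wire collects current from every resistor attached to it, so an
# entire row of multiply-adds happens at once, not one operation per clock
# cycle. Digitally, it's just a matrix-vector product:
output_currents = weights @ inputs
print(output_currents)                   # [-0.15  1.25]
```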

The advantages of digital computing are real, but so are the drawbacks. Perhaps, by reaching back to computing’s past, researchers will be able to steer a sustainable path toward our computational future.

Correction: August 2, 2024: The $100 billion data center would require 5 gigawatts of power, not 5 megawatts.
