Art for "Quanta’s Year in Biology (2019)"
SERIES
2019 in Review

The Year in Biology

Researchers explored the zone between life and death, charted the mind’s system for arranging ideas and memories, and learned how life’s complexity emerged.

Olena Shmahalo/Quanta Magazine

“Look who knows so much. It just so happens that your friend here is only mostly dead,” says Miracle Max in the film version of The Princess Bride. “There’s a big difference between mostly dead and all dead…. Mostly dead is slightly alive.” The movie can play the paradox for laughs because life and death seem like such self-evidently binary states. Yet as events in 2019 illustrated, Miracle Max was onto something.

The past year in the biological sciences has held no shortage of surprises, most of which revealed the messy intricacies and exceptions that complicate our efforts to understand the living world. In addition to turning up several kinds of proof that the distinctions between life and death can get blurry, biologists learned more about how evolution successfully merged two or more cells into one, and about the mathematical tricks that organisms use in the course of their embryonic development.

Advances in genome analysis helped to rewrite parts of the story of how ancient humans spread across the world; the latest version is more accurate but much more complicated. Neuroscientists learned more about how literally true it is that we see what we want to see, and about the tools with which we navigate through our imaginations and memories.

Corey Brickley for Quanta Magazine

New Paradigms of Life, Death and Identity

In April, scientists at Yale University made an announcement that would have fit in better at Halloween: By perfusing a solution into the isolated brains of pigs that had been dead for hours, they had successfully reanimated the tissues well enough for the neurons to conduct electrical signals. It was a spectacular demonstration that death is not always as irreversible as is commonly supposed, but it was not the only one. In recent years, researchers have seen other natural examples of cells in tissues seemingly committing themselves to dying — and then reversing their slide toward death through a process called anastasis. In fact, a large body of research has revealed that between life and death is a gray zone of intermediate states where many microorganisms spend much of their time as they wait out harsh conditions. Microbiologists are beginning to see that a large share of bacteria in nature — possibly even a majority — may be in a dormant state at any moment.

More generally, it’s beginning to look as if populations of microorganisms often exhibit surprisingly diverse behaviors as a kind of bet-hedging survival strategy. Unaccountable variations in behavior and metabolism pop up even among genetically identical bacteria: For example, a report early this year showed that cloned cells could have radically different tolerances for toxic formaldehyde. Viruses improve their odds of survival through diversifying, too. As another report laid out, some viruses even distribute their genes across multiple infected cells and piece together their infectious particles later.

 

Art for "The Brain Maps Out Ideas and Memories Like Spaces"

Alexandre Tamisier for Quanta Magazine

The Brain’s Map of Memories and Ideas

To enable us to know where we are, our brains have systems for compiling mental maps of our surroundings and placing us within them. The landmark discovery of those systems involved decades of research, starting in the 1960s; it was honored with a Nobel Prize in 2014. The importance of that work has only grown in the years since then as it’s become clear that the brain creates other kinds of mental maps in the same way. Recent experiments have shown that the brain uses the same hexadirectional “grid” system for encoding relative positional information to keep track of abstractions like ideas, and to order sequences of events as a kind of personal timeline for our memories. Work appearing early this year demonstrated that the ability to retrieve distinct memories of events relates to how quickly the brain can encode new representations of successive events.

But even the strictly navigational maps in our heads bear an imprint of our perceptions and desires. A pair of papers appearing in March made a case that our brains revise the mental representations of places to reflect our experiences and priorities: More neurological real estate is given over to places that are important to us, which enables us to retain more details about them.

 

The illustration shows ghostly hands partially obstructing a person’s sight and hearing.

Jason Lyon for Quanta Magazine

The Brain Picks Its Perceptions

Attention is a puzzling phenomenon. When we shift our mental focus from one part of our field of view to another, or listen to the vocals in a song instead of the instruments, our perceived experiences can change dramatically even though the sensory stimuli streaming into our brains are unchanged. Neuroscientists talk about the “spotlight” of attention because it can seem as if we are pointing more mental resources at one sliver of our perceptions to concentrate on it. Yet this is true only in a relative sense. As experiments this year helped to establish, the brain focuses attention on part of its sensory field by filtering out the signals for the other parts.

This discovery aligns with many other findings that expose perception as an active process, with feedback between higher cognitive centers and lower sensory-processing centers. Not only does the brain constantly screen out some stimuli deemed less relevant to us, it also speeds up our perceptions by anticipating which sensory stimuli are most likely to arrive next, based on what we’ve already experienced. Our perceptions also seem to be shaped by experience: Comparative studies of people with different cultural backgrounds suggest that some features of sound (like the perceived organization of musical tones into octaves) may be learned. The constant barrage of incoming sensations might seem to pose a catastrophic data management challenge for the brain, but this year scientists showed that when the brain encodes information, it uses a power law to efficiently balance how much detail it retains.

 

3D illustration of three human skulls, split into left and right halves and nested one inside the next.

Olena Shmahalo/Quanta Magazine

Using DNA to Reconstruct the Past

The genomes of living things record their evolutionary lineage, if you know how to read them. With ever more genomic sequences from a widening diversity of species becoming available, scientists are ambitiously combing through samples of old and new DNA for insights into life’s past. Human DNA, not only from living people but also from ancient populations preserved in fragments of bone, has yielded some surprises of late. An analysis presented in June added to the evidence that modern humans did not migrate out of Africa just once 60,000 years ago. Instead, populations moved in and out of Africa many times, although only the exodus about 60,000 years ago left modern human descendants in Europe and Asia. As a result of those complex population movements, some people today carry a little more Neanderthal DNA than was previously supposed, and the Neanderthals picked up even more DNA from our ancestors. The genomes of the archaic Denisovan people showed some evidence of interbreeding with an older, as-yet-unidentified population that had lived in Eurasia, possibly remnants of the even more ancient Homo erectus. Another genomic study, from January, found traces of a previously unknown population that had interbred with modern humans, one that might itself have been a hybrid of Neanderthals and Denisovans.

Nonhuman DNA has also offered fascinating glimpses of vanished eras. Scraps of genetic material found in cave sediments helped researchers reconstruct part of an ecosystem from 80,000 years ago. Marine microbiologists analyzed the genome of the beautiful bobtail squid for clues to the origins of its bioluminescent organ, which depends on symbiotic bacteria. But one consequence of this bonanza of new insights is that some biologists are increasingly unhappy with the traditional Linnaean biological taxonomy, because its organization is at odds with the realities of how evolution works.

 

Illustration of DNA that combines elements of mealybug and bacterial imagery.

Eric Nyquist for Quanta Magazine

New Clues to the Origins of Biological Complexity

Some of the most important but deeply mysterious milestones in evolutionary history mark where the complexity of living things leaped dramatically. Around 2.7 billion years ago, complex eukaryotic cells appeared, as prokaryotic host cells and the symbiotic partners living inside them took their relationship to a more committed level. As the endosymbionts became organelles like mitochondria, their genes had to be absorbed into the host’s own genome, a change that has been hard to model as a gradual evolutionary process. But this year, scientists gained new insight into how that transfer could happen by looking at a strange three-way symbiosis in which all of the cellular partners are interdependent. Researchers also gave more attention to compartmentalized functional structures within bacteria to learn whether they are forerunners of true organelles or completely separate evolutionary innovations.

Organisms also grew more complex when they became multicellular. New work on jellyfish, one of the oldest groups of animals, showed that structural complexity could soar even without comparable increases in genetic complexity. And in a curious discovery, scientists found evidence that the evolution of specialized cell types in multicellular organisms could have been built on an overlooked knack for temporary specialization in single-celled life.

 

Art for "The Math That Tells Cells What They Are"

Adrian du Buisson for Quanta Magazine

Mathematical Insights Into Life

In an embryo, cells seem to learn what kind of tissue to become by reading the concentrations of “morphogenetic” chemical signals around them. But important details of how cells carry out that calculation were murky until a study published in January revealed them. It had been thought that cells’ identities were defined gradually and with ever more specificity throughout development. Instead, the new work indicated that the cells decode the information from the external chemical signals with optimal efficiency, which could mean that cells discover their fate very early in development.

Cells revealed their mathematical prowess to researchers in a number of other ways this year, too. Biologists were at a loss to explain exactly how cells could maintain such exquisite control over their physiology, a feat that requires a kind of negative feedback control that engineers call robust perfect adaptation. Then this past summer, researchers demonstrated a synthetic system that showed how cells could do it. Studies of flying swarms of midges confirmed that the insects collectively exhibited viscosity and damping effects like those in liquids, which may help to keep the swarms together. Other studies of swarming behaviors with decentralized control arrived at the insight that the collectives perform best when the individual parts aren’t too complex, because the stubbornness of smart members can make the swarm slow to respond.

 
