A surprising food may have been a staple of the real Paleo diet: rotten meat

In a book about his travels in Africa published in 1907, British explorer Arnold Henry Savage Landor recounted witnessing an impromptu meal that his companions relished but that he found unimaginably revolting.

As he coasted down a river in the Congo Basin with several local hunter-gatherers, a dead rodent floated near their canoe. Its decomposing body had bloated to the size of a small pig.

Stench from the swollen corpse left Landor gasping for breath. Unable to speak, he tried to signal his companions to steer the canoe away from the fetid creature. Instead, they hauled the supersize rodent aboard and ate it.

“The odour when they dug their knives into it was enough to kill the strongest of men,” Landor wrote. “When I recovered, my admiration for the digestive powers of these people was intense. They were smacking their lips and they said the [rodent] had provided most excellent eating.”

Starting in the 1500s, European and, later, American explorers, traders, missionaries, government officials and others who lived among Indigenous peoples in many parts of the world wrote of similar food practices. Hunter-gatherers and small-scale farmers everywhere commonly ate putrid meat, fish and fatty parts of a wide range of animals. From Arctic tundra to tropical rainforests, native populations consumed rotten remains raw, fermented or cooked just enough to singe off fur and create a more chewable texture. Many groups treated maggots as a meaty bonus.

Descriptions of these practices, which still occur in some present-day Indigenous groups and among northern Europeans who occasionally eat fermented fish, aren’t likely to inspire any new Food Network shows or cookbooks from celebrity chefs.

Case in point: Some Indigenous communities feasted on huge decomposing beasts, including hippos that had been trapped in dug-out pits in Africa and beached whales on Australia’s coast. Hunters in those groups typically smeared themselves with the fat of the animal before gorging on greasy innards. After slicing open animals’ midsections, both adults and children climbed into massive, rotting body cavities to remove meat and fat.

Or consider that Native Americans in Missouri in the late 1800s made a prized soup from the greenish, decaying flesh of dead bison. Animal bodies were buried whole in winter and unearthed in spring after ripening enough to achieve peak tastiness.

But such accounts provide a valuable window into a way of life that existed long before Western industrialization and the war against germs went global, says anthropological archaeologist John Speth of the University of Michigan in Ann Arbor. Intriguingly, no reports of botulism or other potentially fatal reactions to microorganisms festering in rotting meat appear in writings about Indigenous groups before the early 1900s. Instead, decayed flesh and fat represented valued and tasty parts of a healthy diet.

Many travelers such as Landor considered such eating habits to be “disgusting.” But “a gold mine of ethnohistorical accounts makes it clear that the revulsion Westerners feel toward putrid meat and maggots is not hardwired in our genome but is instead culturally learned,” Speth says.

This dietary revelation also challenges an influential scientific idea that cooking originated among our ancient relatives as a way to make meat more digestible, thus providing a rich calorie source for brain growth in the Homo genus. It’s possible, Speth argues, that Stone Age hominids such as Neandertals first used cooking for certain plants that, when heated, provided an energy-boosting carbohydrate punch to the diet. Animals held packets of fat and protein that, after decay set in, rounded out nutritional needs without needing to be heated.

Putrid foods in the diets of Indigenous peoples
Speth’s curiosity about a human taste for putrid meat was originally piqued by present-day hunter-gatherers in polar regions. North American Inuit, Siberians and other far-north populations still regularly eat fermented or rotten meat and fish.

Fermented fish heads, also known as “stinkhead,” are one popular munchy among northern groups. Chukchi herders in the Russian Far East, for instance, bury whole fish in the ground in early fall and let the bodies naturally ferment during periods of freezing and thawing. Fish heads the consistency of hard ice cream are then unearthed and eaten whole.

Speth has suspected for several decades that consumption of fermented and putrid meat, fish, fat and internal organs has a long and probably ancient history among northern Indigenous groups. Consulting mainly online sources such as Google Scholar and universities’ digital library catalogs, he found many ethnohistorical descriptions of such behavior going back to the 1500s. Putrid walrus, seals, caribou, reindeer, musk oxen, polar bears, moose, arctic hares and ptarmigans had all been fair game. Speth reported much of this evidence in 2017 in PaleoAnthropology.

In one recorded incident from late-1800s Greenland, a well-intentioned hunter brought what he had claimed in advance was excellent food to a team led by American explorer Robert Peary. A stench filled the air as the hunter approached Peary’s vessel carrying a rotting seal dripping with maggots. The Greenlander had found the seal where a local group had buried it, possibly a couple of years earlier, so that the body could reach a state of tasty decomposition. Peary ordered the man to keep the reeking seal off his boat.

Miffed at this unexpected rejection, the hunter “told us that the more decayed the seal the finer the eating, and he could not understand why we should object,” Peary’s wife wrote of the encounter.

Even in temperate and tropical areas, where animal bodies decompose within hours or days, Indigenous peoples have appreciated rot as much as Peary’s seal-delivery man did. Speth and anthropological archaeologist Eugène Morin of Trent University in Peterborough, Canada, described some of those obscure ethnohistorical accounts last October in PaleoAnthropology.

Early hominids may have scavenged rotten meat
These accounts undermine some of scientists’ food-related sacred cows, Speth says. For instance, European explorers and other travelers consistently wrote that traditional groups not only ate putrid meat raw or lightly cooked but suffered no ill aftereffects. A protective gut microbiome may explain why, Speth suspects. Indigenous peoples encountered a variety of microorganisms from infancy on, unlike people today who grow up in sanitized settings. Early exposures to pathogens may have prompted the development of an array of gut microbes and immune responses that protected against potential harms of ingesting putrid meat.

That idea requires further investigation; little is known about the bacterial makeup of rotten meat eaten by traditional groups or of their gut microbiomes. But studies conducted over the last few decades do indicate that putrefaction, the process of decay, offers many of cooking’s nutritional benefits with far less effort. Putrefaction predigests meat and fish, softening the flesh and chemically breaking down proteins and fats so they are more easily absorbed and converted to energy by the body.

Given the ethnohistorical evidence, hominids living 3 million years ago or more could have scavenged meat from decomposing carcasses, even without stone tools for hunting or butchery, and eaten their raw haul safely long before fire was used for cooking, Speth contends. If simple stone tools appeared as early as 3.4 million years ago, as some researchers have controversially suggested, those implements may have been made by hominids seeking raw meat and marrow (SN: 9/11/10, p. 8). Researchers suspect regular use of fire for cooking, light and warmth emerged no earlier than around 400,000 years ago (SN: 5/5/12, p. 18).

“Recognizing that eating rotten meat is possible, even without fire, highlights how easy it would have been to incorporate scavenged food into the diet long before our ancestors learned to hunt or process [meat] with stone tools,” says paleoanthropologist Jessica Thompson of Yale University.

Thompson and colleagues suggested in Current Anthropology in 2019 that before about 2 million years ago, hominids were primarily scavengers who used rocks to smash open animal bones and eat nutritious, fat-rich marrow and brains. That conclusion, stemming from a review of fossil and archaeological evidence, challenged a common assumption that early hominids — whether as hunters or scavengers — primarily ate meat off the bone.

Certainly, ancient hominids were eating more than just the meaty steaks we think of today, says archaeologist Manuel Domínguez-Rodrigo of Rice University in Houston. In East Africa’s Olduvai Gorge, butchered animal bones at sites dating to nearly 2 million years ago indicate that hominids ate most parts of carcasses, including brains and internal organs.

“But Speth’s argument about eating putrid carcasses is very speculative and untestable,” Domínguez-Rodrigo says.

Untangling whether ancient hominids truly had a taste for rot will require research that spans many fields, including microbiology, genetics and food science, Speth says.

But if his contention holds up, it suggests that ancient cooks were not turning out meat dishes. Instead, Speth speculates, cooking’s primary value at first lay in making starchy and oily plants softer, more chewable and easily digestible. Edible plants contain carbohydrates, sugar molecules that can be converted to energy in the body. Heating over a fire converts starch in tubers and other plants to glucose, a vital energy source for the body and brain. Crushing or grinding of plants might have yielded at least some of those energy benefits to hungry hominids who lacked the ability to light fires.

Whether hominids controlled fire well enough to cook plants or any other food regularly before around 400,000 to 300,000 years ago is unknown.

Neandertals may have hunted animals for fat
Despite their nutritional benefits, plants often get viewed as secondary menu items for Stone Age folks. It doesn’t help that plants preserve poorly at archaeological sites.

Neandertals, in particular, have a long-standing reputation as plant shunners. Popular opinion views Neandertals as burly, shaggy individuals who huddled around fires chomping on mammoth steaks.

That’s not far from an influential scientific view of what Neandertals ate. Elevated levels of a diet-related form of nitrogen in Neandertal bones and teeth hint that they were committed carnivores, eating large amounts of protein-rich lean meat, several research teams have concluded over the past 30 years.

But consuming that much protein from meat, especially from cuts above the front and hind limbs now referred to as steaks, would have been a recipe for nutritional disaster, Speth argues. Meat from wild, hoofed animals and smaller creatures such as rabbits contains almost no fat, or marbling, unlike meat from modern domestic animals, he says. Ethnohistorical accounts, especially for northern hunters including the Inuit, include warnings about weight loss, ill health and even death that can result from eating too much lean meat.

This form of malnutrition is known as rabbit starvation. Evidence indicates that people can safely consume between about 25 and 35 percent of daily calories as protein, Speth says. Above that threshold, several investigations have indicated that the liver becomes unable to break down chemical wastes from ingested proteins, which then accumulate in the blood and contribute to rabbit starvation. Limits to the amount of daily protein that can be safely consumed meant that ancient hunting groups, like those today, needed animal fats and carbohydrates from plants to fulfill daily calorie and other nutritional needs.
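The 25-to-35-percent safety range Speth cites can be put in concrete terms with a little arithmetic. A minimal sketch, assuming a hypothetical 2,500-kilocalorie daily intake and the standard conversion of roughly 4 kilocalories per gram of protein (neither figure comes from the article):

```python
# Back-of-envelope sketch of the "rabbit starvation" protein ceiling.
# Assumptions (not from the article): a 2,500 kcal daily intake and the
# standard conversion of about 4 kcal per gram of protein.
KCAL_PER_GRAM_PROTEIN = 4

def safe_protein_grams(daily_kcal, lo=0.25, hi=0.35):
    """Return the (min, max) grams of protein corresponding to the
    25-35 percent-of-calories range cited as safe."""
    return (daily_kcal * lo / KCAL_PER_GRAM_PROTEIN,
            daily_kcal * hi / KCAL_PER_GRAM_PROTEIN)

low, high = safe_protein_grams(2500)
print(f"{low:.0f}-{high:.0f} g of protein per day")  # roughly 156-219 g
```

Under those assumed numbers, the ceiling works out to roughly 156 to 219 grams of protein per day; every calorie beyond that would have had to come from fat or plant carbohydrates, which is the bind the ethnohistorical warnings describe.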

Modern “Paleo diets” emphasize eating lean meats, fruits and vegetables. But that omits what past and present Indigenous peoples most wanted from animal carcasses. Accounts describe Inuit people eating much larger amounts of fatty body parts than lean meat, Speth says. Over the last few centuries, they have favored tongue, fat deposits, brisket, ribs, fatty tissue around intestines and internal organs, and marrow. Internal organs, especially adrenal glands, have provided vitamin C — nearly absent in lean muscle — that prevented anemia and other symptoms of scurvy.

Western explorers noted that the Inuit also ate chyme, the stomach contents of reindeer and other plant-eating animals. Chyme provided at least a side course of plant carbohydrates. Likewise, Neandertals in Ice Age Europe probably subsisted on a fat- and chyme-supplemented diet (SN Online: 10/11/13), Speth contends.

Large numbers of animal bones found at northern European Neandertal sites — often viewed as the residue of ravenous meat eaters — may instead reflect overhunting of animals to obtain enough fat to meet daily calorie needs. Because wild game typically has a small percentage of body fat, northern hunting groups today and over the last few centuries frequently killed prey in large numbers, either discarding most lean meat from carcasses or feeding it to their dogs, ethnographic studies show.

If Neandertals followed that playbook, eating putrid foods might explain why their bones carry a carnivore-like nitrogen signature, Speth suggests. An unpublished study of decomposing human bodies kept at a University of Tennessee research facility in Knoxville called the Body Farm tested that possibility. Biological anthropologist Melanie Beasley, now at Purdue University in West Lafayette, Ind., found moderately elevated tissue nitrogen levels in 10 deceased bodies sampled regularly for about six months. Tissue from those bodies served as a stand-in for animal meat consumed by Neandertals. Human flesh is an imperfect substitute for, say, reindeer or elephant carcasses. But Beasley’s findings suggest that decomposition’s effects on a range of animals need to be studied. Intriguingly, she also found that maggots in the decaying tissue displayed extremely elevated nitrogen levels.

Paleobiologist Kimberly Foecke of George Washington University in Washington, D.C., has also found high nitrogen levels in rotting, maggot-free cuts of beef from animals fed no hormones or antibiotics to approximate the diets of Stone Age creatures (SN: 1/2/19).

Like arctic hunters did a few hundred years ago, Neandertals may have eaten putrid meat and fish studded with maggots, Speth says. That would explain elevated nitrogen levels in Neandertal fossils.

But Neandertal dining habits are poorly understood. Unusually extensive evidence of Neandertal big-game consumption has come from a new analysis of fossil remains at a roughly 125,000-year-old site in northern Germany called Neumark-Nord. There, Neandertals periodically hunted straight-tusked elephants weighing up to 13 metric tons, say archaeologist Sabine Gaudzinski-Windheuser of Johannes Gutenberg University of Mainz in Germany and colleagues.

In a study reported February 1 in Science Advances, her group analyzed patterns of stone-tool incisions on bones of at least 57 elephants from 27 spots near an ancient lake basin where Neandertals lit campfires and constructed shelters (SN: 1/29/22, p. 8). Evidence suggests that Neandertal butchers — much like Inuit hunters — removed fat deposits under the skin and fatty body parts such as the tongue, internal organs, brain and thick layers of fat in the feet. Lean meat from elephants would have been eaten in smaller quantities to avoid rabbit starvation, the researchers argue.

Further research needs to examine whether the Neandertals cooked elephant meat or boiled the bones to extract nutritious grease, Speth says. Mealtime options would have expanded for hominids who could not only consume putrid meat and fat but also heat animal parts over fires, he suspects.

Neandertals who hunted elephants must also have eaten a variety of plants to meet their considerable energy requirements, says Gaudzinski-Windheuser. But so far, only fragments of burned hazelnuts, acorns and blackthorn plums have been found at Neumark-Nord.

Neandertals probably carb-loaded
Better evidence of Neandertals’ plant preferences comes from sites in warm Mediterranean and Middle Eastern settings. At a site in coastal Spain, Neandertals probably ate fruits, nuts and seeds of a variety of plants (SN: 3/27/21, p. 32).

Neandertals in a range of environments must have consumed lots of starchy plants, argues archaeologist Karen Hardy of the University of Glasgow in Scotland. Even Stone Age northern European and Asian regions included plants with starch-rich appendages that grew underground, such as tubers.

Neandertals could also have obtained starchy carbs from the edible, inner bark of many trees and from seaweed along coastlines. Cooking, as suggested by Speth, would have greatly increased the nutritional value of plants, Hardy says. Not so for rotten meat and fat, though Neandertals such as those at Neumark-Nord may have cooked what they gleaned from fresh elephant remains.

There is direct evidence that Neandertals munched on plants. Microscopic remnants of edible and medicinal plants have been found in the tartar on Neandertal teeth (SN: 4/1/17, p. 16), Hardy says.

Carbohydrate-fueled energy helped to maintain large brains, enable strenuous physical activity and ensure healthy pregnancies for both Neandertals and ancient Homo sapiens, Hardy concludes in the January 2022 Journal of Human Evolution. (Researchers disagree over whether Neandertals, which lived from around 400,000 to 40,000 years ago, were a variant of H. sapiens or a separate species.)

Paleo cuisine was tasty
Like Hardy, Speth suspects that plants provided a large share of the energy and nutrients Stone Age folks needed. Plants represented a more predictable, readily available food source than hunted or scavenged meat and fat, he contends.

Plants also offered Neandertals and ancient H. sapiens — whose diets probably didn’t differ dramatically from Neandertals’, Hardy says — a chance to stretch their taste buds and cook up tangy meals.

Paleolithic plant cooking included preplanned steps aimed at adding dashes of specific flavors to basic dishes, a recent investigation suggests. In at least some places, Stone Age people apparently cooked to experience pleasing tastes and not just to fill their stomachs. Charred plant food fragments from Shanidar Cave in Iraqi Kurdistan and Franchthi Cave in Greece consisted of crushed pulse seeds, possibly from starchy pea species, combined with wild plants that would have provided a pungent, somewhat bitter taste, microscopic analyses show.

Added ingredients included wild mustard, wild almonds, wild pistachio and fruits such as hackberry, archaeobotanist Ceren Kabukcu of the University of Liverpool in England and colleagues reported last November in Antiquity.

Four Shanidar food bits date to about 40,000 years ago or more and originated in sediment that included stone tools attributed to H. sapiens. Another food fragment, likely from a cooked Neandertal meal, dates to between 70,000 and 75,000 years ago. Neandertal fossils found in Shanidar Cave are also about 70,000 years old. So it appears that Shanidar Neandertals spiced up cooked plant foods before Shanidar H. sapiens did, Kabukcu says.

Franchthi food remains date to between 13,100 and 11,400 years ago, when H. sapiens lived there. Wild pulses in food from both caves display microscopic signs of having been soaked, a way to dilute poisons in seeds and moderate their bitterness.

These new findings “suggest that cuisine, or the combination of different ingredients for pleasure, has a very long history indeed,” says Hardy, who was not part of Kabukcu’s team.

There’s a hefty dollop of irony in the possibility that original Paleo diets mixed what people in many societies today regard as gross-sounding portions of putrid meat and fat with vegetarian dishes that still seem appealing.

Add beer to the list of foods threatened by climate change

Beer lovers could be left with a sour taste, thanks to the latest in a series of studies mapping the effects of climate change on crops.

Malted barley — a key ingredient in beers including IPAs, stouts and pilsners — is particularly sensitive to warmer temperatures and drought, both of which are likely to increase due to climate change. As a result, average global barley crop yields could drop as much as 17 percent by 2099, compared with the average yield from 1981 to 2010, under the more extreme climate change projections, researchers report October 15 in Nature Plants.

That decline “could lead to, on average, a doubling of price in some countries,” says coauthor Steven Davis, an Earth systems scientist at University of California, Irvine. Consumption would also drop globally by an average of 16 percent, or roughly what people in the United States consumed in 2011.

The results are based on computer simulations projecting climate conditions, plant responses and global market reactions up to the year 2099. Under the mildest climate change predictions, world average barley yields would still go down by at least 3 percent, and average prices would increase about 15 percent, the study says.

Other crops such as maize, wheat, soy and wine grapes are also threatened by rising global average temperatures, as well as by pests emboldened by erratic weather (SN: 2/8/14, p. 3). But there’s still hope for ale aficionados. The study did not account for technological innovations or genetic tweaks that could spare the crop, Davis says.

310-million-year-old fossil blobs might not be jellyfish after all

What do you get when you flip a fossilized “jellyfish” upside down? The answer, it turns out, might be an anemone.

Fossil blobs once thought to be ancient jellyfish were actually a type of burrowing sea anemone, scientists propose March 8 in Papers in Palaeontology.

From a certain angle, the fossils’ features include what appears to be a smooth bell shape, perhaps with tentacles hanging beneath — like a jellyfish. And for more than 50 years, that’s what many scientists thought the animals were.

But for paleontologist Roy Plotnick, something about the fossils’ supposed identity seemed fishy. “It’s always kind of bothered me,” says Plotnick, of the University of Illinois Chicago. Previous scientists had interpreted one fossil feature as a curtain that hung around the jellies’ tentacles. But that didn’t make much sense to Plotnick. “No jellyfish has that,” he says. “How would it swim?”

One day, as Plotnick was looking over specimens at the Field Museum in Chicago, something clicked. What if the bell belonged on the bottom, not the top? He turned to a colleague and said, “I think this is an anemone.”

Rotated 180 degrees, Plotnick realized, the fossils’ shape — which looks kind of like an elongated pineapple with a stumpy crown — resembles some modern anemones. “It was one of those aha moments,” he says. The “jellyfish” bell might be the anemone’s lower body. And the purported tentacles? Perhaps the anemone’s upper section, a tough, textured barrel protruding from the seafloor.

Plotnick and his colleagues examined thousands of the fossilized animals, dubbed Essexella asherae, unearthing more clues. Bands running through the fossils match the shape of some modern anemones’ musculature. And some specimens’ pointy protrusions resemble an anemone’s contracted tentacles.

“It’s totally possible that these are anemones,” says Estefanía Rodríguez, an anemone expert at the American Museum of Natural History in New York City who was not involved with the work. The shape of the fossils, the comparison with modern-day anemones — it all lines up, she says, though it’s not easy to know for sure.

Paleontologist Thomas Clements agrees. Specimens like Essexella “are some of the most notoriously difficult fossils to identify,” he says. “Jellyfish and anemones are like bags of water. There’s hardly any tissue to them,” meaning there’s little left to fossilize.

Still, it’s plausible that the blobs are indeed fossilized anemones, says Clements, of Friedrich-Alexander-Universität Erlangen-Nürnberg in Germany. He was not part of the new study but has spent several field seasons at Mazon Creek, the Illinois site where Essexella lived some 310 million years ago. Back then, the area was near the shoreline, Clements says, with nearby rivers dumping sediment into the environment — just the kind of place ancient burrowing anemones may have once called home.

Bizarre metals may help unlock mysteries of how Earth’s magnetic field forms

Weird materials called Weyl metals might reveal the secrets of how Earth gets its magnetic field.

The substances could generate a dynamo effect, the process by which a swirling, electrically conductive material creates a magnetic field, a team of scientists reports in the Oct. 26 Physical Review Letters.

Dynamos are common in the universe, producing the magnetic fields of the Earth, the sun and other stars and galaxies. But scientists still don’t fully understand the details of how dynamos create magnetic fields. And, unfortunately, making a dynamo in the lab is no easy task, requiring researchers to rapidly spin giant tanks of a liquefied metal, such as sodium (SN: 5/18/13, p. 26).

First discovered in 2015, Weyl metals are topological materials, meaning that their behavior is governed by a branch of mathematics called topology, the study of shapes like doughnuts and knots (SN: 8/22/15, p. 11). Electrons in Weyl metals move around in bizarre ways, behaving as if they are massless.

Within these materials, the researchers discovered, electrons are subject to the same set of equations that describes the behavior of liquids known to form dynamos, such as molten iron in the Earth’s outer core. The team’s calculations suggest that, under the right conditions, it should be possible to make a dynamo from solid Weyl metals.

It might be easier to create such dynamos in the lab, as they don’t require large quantities of swirling liquid metals. Instead, the electrons in a small chunk of Weyl metal could flow like a fluid, taking the place of the liquid metal.

The result is still theoretical. But if the idea works, scientists may be able to use Weyl metals to reproduce the conditions that exist within the Earth, and better understand how its magnetic field forms.

Skull damage suggests Neandertals led no more violent lives than humans

Neandertals are shaking off their reputation as head bangers.

Our close evolutionary cousins experienced plenty of head injuries, but no more so than late Stone Age humans did, a study suggests. Rates of fractures and other bone damage in a large sample of Neandertal and ancient Homo sapiens skulls roughly match rates previously reported for human foragers and farmers who have lived within the past 10,000 years, concludes a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany.

Males suffered the bulk of harmful head knocks, whether they were Neandertals or ancient humans, the scientists report online November 14 in Nature.

“Our results suggest that Neandertal lifestyles were not more dangerous than those of early modern Europeans,” Harvati says.

Until recently, researchers depicted Neandertals, who inhabited Europe and Asia between around 400,000 and 40,000 years ago, as especially prone to head injuries. Serious damage to small numbers of Neandertal skulls fueled a view that these hominids led dangerous lives. Proposed causes of Neandertal noggin wounds have included fighting, attacks by cave bears and other carnivores, and close-range hunting of large prey animals.

Paleoanthropologist Erik Trinkaus of Washington University in St. Louis coauthored an influential 1995 paper arguing that Neandertals incurred an unusually large number of head and upper-body injuries. Trinkaus recanted that conclusion in 2012, though. All sorts of causes, including accidents and fossilization, could have resulted in Neandertal skull damage observed in relatively small fossil samples, he contended (SN: 5/27/17, p. 13).

Harvati’s study further undercuts the argument that Neandertals engaged in a lot of violent behavior, Trinkaus says.

Still, the idea that Neandertals frequently got their heads bonked during crude, close-up attacks on prey has persisted, says paleoanthropologist David Frayer of the University of Kansas in Lawrence. The new report highlights the harsh reality that, for Neandertals and ancient humans alike, “head trauma, no matter the level of technological or social complexity, or population density, was common.”

Harvati’s group analyzed data for 114 Neandertal skulls and 90 H. sapiens skulls. All of these fossils were found in Eurasia and date to between around 80,000 and 20,000 years ago. One or more head injuries appeared in nine Neandertals and 12 ancient humans. After statistically accounting for individuals’ sex, age at death, geographic locations and state of bone preservation, the investigators estimated comparable levels of skull damage in the two species. Statistical models run by the team indicate that skull injuries affected an average of 4 percent to 33 percent of Neandertals, and 2 percent to 34 percent of ancient humans.

Estimated prevalence ranges that large likely reflect factors that varied from one locality to another, such as resource availability and hunting conditions, the researchers say.

Neandertals with head wounds included more individuals under age 30 than observed among their human counterparts. Neandertals may have suffered more head injuries early in life, the researchers say. It’s also possible that Neandertals died more often from head injuries than Stone Age humans did.

Researchers have yet to establish whether Neandertals experienced especially high levels of damage to body parts other than the head, writes paleoanthropologist Marta Mirazón Lahr of the University of Cambridge in a commentary in Nature accompanying the new study.

50 years ago, researchers discovered a leak in Earth’s oceans

Oceans may be shrinking — Science News, March 10, 1973

The oceans of the world may be gradually shrinking, leaking slowly away into the Earth’s mantle…. Although the oceans are constantly being slowly augmented by water carried up from Earth’s interior by volcanic activity … some process such as sea-floor spreading seems to be letting the water seep away more rapidly than it is replaced.

Update
Scientists traced the ocean’s leak to subduction zones, areas where tectonic plates collide and the heavier of the two sinks into the mantle. It’s still unclear how much water has cycled between the deep ocean and mantle through the ages. A 2019 analysis suggests that sea levels have dropped by an average of up to 130 meters over the last 230 million years, in part due to Pangea’s breakup creating new subduction zones. Meanwhile, molten rock that bubbles up from the mantle as continents drift apart may “rain” water back into the ocean, scientists reported in 2022. But since Earth’s mantle can hold more water as it cools (SN: 6/13/14), the oceans’ mass might shrink by 20 percent every billion years.

This huge plant eater thrived in the age of dinosaurs — but wasn’t one of them

A new species of hulking ancient herbivore would have overshadowed its relatives.

Fossils found in Poland belong to a new species that roamed during the Late Triassic, a period some 237 million to 201 million years ago, researchers report November 22 in Science. But unlike most of the enormous animals that lived during that period, this new creature isn’t a dinosaur — it’s a dicynodont.

Dicynodonts are a group of ancient four-legged animals that are related to mammals’ ancestors. They’re a diverse group, but the new species is far larger than any other dicynodont found to date. The elephant-sized creature was more than 4.5 meters long and probably weighed about nine tons, the researchers estimate. Related animals didn’t become that big again until the Eocene, 150 million years later.

“We think it’s one of the most unexpected fossil discoveries from the Triassic of Europe,” says study coauthor Grzegorz Niedzwiedzki, a paleontologist at Uppsala University in Sweden. “Who would have ever thought that there is a fossil record of such a giant, elephant-sized mammal cousin in this part of the world?” He and his team first described some of the bones in 2008; now they’ve made the new species — Lisowicia bojani — official.

The creature had upright forelimbs like today’s rhinoceroses and hippos, instead of the splayed front limbs seen on other Triassic dicynodonts, which were similar to the forelimbs of present-day lizards. That posture would have helped it support its massive body weight.

A new way to turn saltwater fresh can kill germs and avoid gunk buildup

A new design for sun-powered desalination technology may lead to longer-lasting devices that produce cleaner water.

The trick boils down to preventing a device’s components from touching the saltwater. Instead, a lid of light-absorbing material rests above a partially filled basin of water, absorbing sunlight and radiating that energy to the liquid below. That evaporates the water to create pure vapor, which can be condensed into freshwater to help meet the demands of a world where billions of people lack safe drinking water (SN: 8/18/18, p. 14).

This setup marks an improvement over other sun-powered desalination devices, where sunshine-absorbing materials float atop the saltwater (SN: 8/20/16, p. 22). In those devices, salt and other contaminants left behind during evaporation can degrade the material’s ability to soak up sunlight. Having water in contact with the material also prevents the material from getting hotter than about 100° Celsius or producing steam above that temperature. That limits the technology’s ability to purify the final product; killing pathogenic microbes often requires temperatures of at least 121° C.

In the new device, described online December 11 in Nature Communications, the separation between the light-absorbing lid and the water’s surface helps keep the lid clean and allows it to generate vapor tens of degrees hotter than the water’s boiling point.

The lid comprises three main components: a top layer made of a metal-ceramic composite that absorbs sunshine, a sheet of carbon foam and a bottom layer of aluminum. Heat spreads from the sunlight-absorbing layer to the aluminum, from which thermal energy radiates to the water below. When the water temperature hits about 100° C, vapor is produced. The steam rises up through holes in the aluminum and flows through the lid’s middle carbon layer, heating further along the way, until it is released in a single stream out the side of the lid. There, it can be captured and condensed.

Producing superheated steam in this way, without any gunk buildup, is “a very innovative idea,” says Jia Zhu, a materials scientist at Nanjing University in China not involved in the work.

Under a lamp that mimics natural sunlight in the lab, the device evaporated 100 grams of saltwater without any salt collecting on the underside of the lid. Salt crystals formed at the bottom of the basin washed away easily. In experiments in October on a rooftop in Cambridge, Mass., researchers used a curved mirror to focus incoming sunlight onto the light-absorbing layer of the device to produce steam hotter than 146° C.

“When you can access these temperatures, you can use the steam for things like sterilization, for cooking, for cleaning, for industrial processes,” says coauthor Thomas Cooper, a mechanical engineer at York University in Toronto. A device measuring 1 square meter could generate 2.5 liters of freshwater per day in sunny regions such as the southeastern United States, and at least half that in shadier regions such as New England, Cooper estimates.
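
A rough energy budget shows why daily output lands in that range. The insolation and efficiency figures below are illustrative assumptions for a back-of-envelope sketch, not numbers from the study:

```python
# Back-of-envelope: freshwater yield from a 1-square-meter solar evaporator
LATENT_HEAT_J_PER_KG = 2.26e6   # energy needed to vaporize 1 kg of water (approximate)

daily_sun_j = 5.5 * 3.6e6       # assume ~5.5 kWh of sunlight per square meter per day, in joules
efficiency = 0.30               # assume ~30% of that energy ends up evaporating water

liters_per_day = daily_sun_j * efficiency / LATENT_HEAT_J_PER_KG  # 1 kg of water is ~1 liter
print(f"{liters_per_day:.1f} liters per day")  # roughly 2.6
```

Halving the assumed insolation, as on an overcast New England day, halves the output, consistent with Cooper’s “at least half that” estimate.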

This sun-powered technology could also provide an ecofriendly alternative to reverse osmosis, a water purification process that involves pushing seawater through salt-filtering membranes (SN: 9/15/18, p. 10). Reverse osmosis, which runs on electricity, “is an energy-hungry technology,” says Qiaoqiang Gan, an engineer at the University at Buffalo in New York not involved in the work. “For resource-limited areas, remote areas or people who live on small islands, this [new device] might be a very good option for them to address their freshwater needs.” But researchers still need to investigate how affordable a commercial version of this device would be, Gan says.

Here’s what was surprising about Kilauea’s 3-month-long eruption

WASHINGTON — After a stunningly explosive summer, Kilauea, the world’s longest continuously erupting volcano, finally seems to have taken a break. But the scientists studying it haven’t. Reams of new data collected during an unprecedented opportunity to monitor an ongoing, accessible eruption are changing what’s known about how some volcanoes behave.

“It was hugely significant,” says Jessica Larsen, a petrologist at the University of Alaska Fairbanks, and “a departure from what Kilauea had been doing for more than 35 years.”

The latest eruption started in May. By the time it had ended three months later, over 825 million cubic meters of earth had collapsed at the summit. That’s the equivalent of 300,000 Olympic-sized swimming pools, Kyle Anderson, a geophysicist with the U.S. Geological Survey in Menlo Park, Calif., said December 11 in a news conference at the annual meeting of the American Geophysical Union.
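
That pool comparison is easy to check, assuming a nominal Olympic pool of 50 by 25 by 2 meters (actual pool depths vary, which is why the article’s figure is a bit lower):

```python
# Sanity check: collapsed summit volume expressed in Olympic-sized pools
collapse_volume_m3 = 825e6     # ~825 million cubic meters of earth
pool_volume_m3 = 50 * 25 * 2   # nominal Olympic pool: 2,500 cubic meters
print(f"{collapse_volume_m3 / pool_volume_m3:,.0f} pools")  # 330,000 pools
```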

As the summit crater deflated, magma gushed through underground tunnels, draining out through fissures along an area called the lower eastern rift zone at a rate of roughly 50 meters per day. That lava eventually covered 35.5 square kilometers of land, Anderson and his colleagues reported in a study published December 11 in Science.

The volcano also taught scientists a thing or two.

Scientists previously believed that groundwater plays a big role in how a caldera collapses. When craters were drained of their magma, “cooling, cracking depressurized the caldera, allowing groundwater to seep in and create a series of explosive eruptions,” Anderson said. “But groundwater did not play a significant role in driving the explosions this summer.”

Instead, the destruction of Kilauea’s crater resulted from what’s called a piston-style caldera collapse, he said. Sixty-two small collapse events rattled the volcano from mid-May to late August, with each collapse causing the crater to sink and pushing the surrounding land out and up. By the end, the center of the volcano sank by as much as 500 meters — more than the height of the Empire State Building.

That activity didn’t just destroy the crater. “We could see surges in the eruption rate 40 kilometers away whenever there was a collapse,” Anderson said.

Life finds a way
Under the sea, life moved in around the brand-new land surprisingly quickly. Using a remotely operated vehicle to explore the seafloor, researchers in September found evidence of hydrothermal activity along newly deposited lava flows about 650 meters deep. More surprising, bright yellow, potentially iron-oxidizing microbes had already moved in.

“There’s no reason why we should have expected there would be hydrothermal activity that would be alive within the first 100 days,” Chris German, a geologist at Woods Hole Oceanographic Institution in Falmouth, Mass., said at the news conference. “This is actually life here!”

The discovery suggests “how volcanism can give rise to the chemical energy that can drive primitive microbial organisms and flower a whole ecosystem,” he said.

Studying these ecosystems can provide insight into how life may form in places like Enceladus, an icy moon of Saturn. Hydrothermal activity is common where Earth’s tectonic plates meet. But alien worlds don’t show evidence of plate tectonics, though they can be volcanically active, German says. Studying how hydrothermal life forms near volcanoes that aren’t along tectonic boundaries on Earth could reveal a lot about other celestial bodies.

“This is a better analog of what we expect them to be like,” says German, but “it is what’s least studied.”

What comes next
As of December 5, Kilauea had not erupted for three months, suggesting it’s in what’s called a pause – still active but not spewing lava. Observations from previous eruptions suggest that the next phase of Kilauea’s volcanic cycle may be a quieter one. But the volcano likely won’t stay quiet forever, says Christina Neal, the head scientist at the USGS Hawaiian Volcano Observatory and a coauthor of the Science paper. “We’re in this lull and we just don’t know what is going to happen next,” she says.

Scientists are tracking ground swelling near the Puu Oo vent, from which much of Kilauea’s lava has flowed during the volcano’s 35-year eruption history. That inflation is an indication that magma may still be on the move deep below.

The terrain surrounding this remote region is dense with vegetation, making it a difficult area to study. But new methods tested during the 2018 eruption, such as the use of uncrewed aerial vehicles, could aid in tracking the recent deformation.

Scientists are also watching the volcano next door: Mauna Loa. History has shown that Mauna Loa can act up during periods when Kilauea sleeps. For the past several years, volcanologists have kept an eye on Kilauea’s larger sister volcano, which went silent last fall, after a period with few earthquakes and intermittent deformation. “We’re seeing a little bit of inflation at Mauna Loa and some earthquake swarms where it had been active,” Neal says. “So that’s another issue of concern for us going into the future.”

DNA tests of Lassa virus mid-outbreak helped Nigeria target its response

When an outbreak of a viral hemorrhagic fever hit Nigeria in 2018, scientists were ready: They were already in the country testing new disease-tracking technology, and within weeks managed to steer health workers toward the most appropriate response.

Lassa fever, which is transmitted from rodents to humans, pops up every year in West Africa. But 2018 was the worst season on record for Nigeria. By mid-March, there were 376 confirmed cases — more than three times as many as by that point in 2017 — and another 1,495 suspected. Health officials weren’t sure if the bad year was being caused by the strains that usually circulate, or by a new strain that might be more transmissible between humans and warrant a stronger response.

New technology for analyzing DNA in the field helped answer that question mid-outbreak, confirming the outbreak was being caused by pretty much the same strains transmitted from rodents to humans in past years. That rapid finding helped Nigeria shape its response, allowing health officials to focus efforts on rodent control and safe food storage, rather than sinking time and money into measures aimed at stopping unlikely human-to-human transmission, researchers report in the Jan. 4 Science.

While the scientists were reporting their results to the Nigeria Centre for Disease Control, they were also discussing the data with other virologists and epidemiologists in online forums. This kind of real-time collaboration can help scientists and public health workers “see the bigger picture about pathogen spread,” says Nicholas Loman, a microbial genomicist at the University of Birmingham in England who was not involved in the research.

Portable DNA sequencers, some as small as a cell phone, have allowed scientists to read the genetic information of viruses emerging in places without extensive lab infrastructure. Looking for genetic differences between patient samples can give clues to how a virus is being transmitted and how quickly it’s changing over time — key information for getting outbreaks under control. If viral DNA from several patients is very similar, that suggests the virus may be transmitted between people; if the DNA is more distinct, people might be picking up the virus independently from other animals.
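
That comparison logic can be illustrated with a toy example. The sequences and counts below are invented for illustration, not real Lassa data:

```python
# Toy illustration: pairwise differences between short viral sequences
def hamming(a, b):
    """Count positions where two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

# Nearly identical sequences hint at person-to-person transmission...
chain = ["ACGTACGT", "ACGTACGA", "ACGTACGC"]
# ...while more distinct ones hint at independent spillover from animals.
spillover = ["ACGTACGT", "TTGAACGT", "ACCTTCGG"]

for label, seqs in [("chain", chain), ("spillover", spillover)]:
    dists = [hamming(a, b) for i, a in enumerate(seqs) for b in seqs[i + 1:]]
    print(label, max(dists))  # chain 1, then spillover 6
```

In practice, researchers build phylogenetic trees from whole genomes rather than counting raw differences, but the underlying signal is the same: tight clustering suggests a transmission chain, and deep divergence suggests separate animal-to-human jumps.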

The technology has also been used amid recent Ebola and Zika outbreaks. But the Lassa virus presents a unique challenge, says study coauthor Stephan Günther, a virologist at the Bernhard-Nocht-Institute for Tropical Medicine in Hamburg, Germany. Unlike Ebola or Zika, Lassa has a lot of genetic variation between strains. So while the same small regions of DNA from various strains of Ebola or Zika can be identified for analysis, it’s hard to accurately target similar regions for comparison among Lassa strains.

Instead, Günther and his team used a tactic called metagenomics: They collected breast milk, plasma and cerebrospinal fluid from patients and sequenced all the DNA within — human, viral and anything else lurking. Then, the team picked out the Lassa virus DNA from that dataset.

All told, the scientists analyzed Lassa virus DNA from 120 patients, far more than initially intended. “We went to the field to do a pilot study,” Günther says. “Then the outbreak came. And we quickly scaled up.” Preexisting relationships in Nigeria helped make that happen: The team had been collaborating for a decade with researchers at the Irrua Specialist Teaching Hospital and working alongside the World Health Organization and the Nigeria Centre for Disease Control.

Analyzing and interpreting the massive amounts of data generated by the metagenomics approach was a challenge, especially with limited internet connection, Günther says. Researchers analyzed 36 samples during the outbreak — less than a third of their total dataset, but still enough to guide the response. The full analysis, completed after the outbreak, confirmed the initial findings.

A metagenomics approach could be useful in disease surveillance more broadly. Currently, “we look for things that we know about and expect to find. Yet evidence from Ebola in West Africa and Zika in the Americas is that emerging pathogens can pop up in unexpected places, and take too long to be recognized,” Loman says. Sequencing all DNA in a sample, he says, could allow scientists to detect problem pathogens before they cause outbreaks.