Scientists are taking aim at the physics of rubber band bombardments.
Using high-speed video, researchers have analyzed what happens to a rubber band when it’s launched from a thumb. The results offer some tips for how to make a clean shot, Boston University mechanical engineers Alexandros Oratis and James Bird report in a paper in press in Physical Review Letters.
The researchers focused on one particular shooting technique: Elastic is slung around the raised thumb of one hand and pulled back with the fingers of the other hand. Standardizing the operation by using a cylinder rather than a thumb, the scientists filmed the details of the shooting process.
When the rubber band is let loose, a release of tension in the band quickly travels toward the cylinder. Meanwhile, the band itself zings toward the cylinder at a slower speed than that tension release, the scientists found.
When shot off a thumb, the band’s forward motion could lead to a rubbery rear-ender, with the thumb getting in the way of the elastic and sending the band askew. But if the feat is performed properly, the release of tension causes the thumb to duck out of the way before the rubber band reaches it. The band then sails past, buckling into a wrinkly shape as it shoots by.
By testing different shooting strategies, the researchers zeroed in on some guidelines. Don’t pull the band too tight: The extra tension increases the flight speed, so the thumb doesn’t deflect fast enough to avoid it. And a wider elastic band is preferred. That’s because the thumb must exert more force against the wider band, so that when the band is released, the digit falls away more quickly, making the elastic’s getaway easier.
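For the curious, here is a rough back-of-the-envelope sketch of that race between the release wave and the band itself. It is not the model from the Physical Review Letters paper; the tension, stiffness, mass and stretch values below are invented for illustration, and the band is treated as a simple stretched string whose tension-release signal travels at the classic wave speed sqrt(T/mu) while the band's launch speed comes from its stored elastic energy.

```python
import math

# All numbers below are invented, illustrative values, not measurements from the study.
band_mass = 1.0e-3        # kg, mass of a typical rubber band (assumed)
stretched_length = 0.30   # m, length of the band when pulled back (assumed)
tension = 10.0            # N, tension in the band at release (assumed)
stiffness = 100.0         # N/m, effective spring constant of the band (assumed)
stretch = 0.10            # m, how far the band is extended beyond rest length (assumed)

# Speed at which a disturbance (the release of tension) travels along the stretched band,
# treating it as a simple string: v = sqrt(T / mu), where mu is mass per unit length.
mu = band_mass / stretched_length
wave_speed = math.sqrt(tension / mu)

# Rough launch speed of the band, from energy conservation:
# (1/2) k x^2 stored in the stretch becomes (1/2) m v^2 of the flying band.
launch_speed = stretch * math.sqrt(stiffness / band_mass)

print(f"release wave: ~{wave_speed:.0f} m/s, band launch: ~{launch_speed:.0f} m/s")
# With these assumed numbers the release wave outruns the band itself,
# which is the window in which the thumb can flick out of the way.
```

With these made-up inputs the release wave moves at roughly 55 meters per second while the band launches at roughly 32, a toy illustration of why the tension release can reach the thumb before the band does.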
While searching for shipwreck remains near Oman in the Arabian Sea in 2014, divers discovered an unusual metal disk that has since proven to be the world’s oldest known mariner’s astrolabe, British researchers report.
The navigational device came from the wreckage of a ship in the Portuguese armada that had been part of explorer Vasco da Gama’s second voyage to India from 1502 to 1503. Historical decorations on the artifact led the researchers to suggest that the disk was used as early as 1496. A bit wider than a dollar bill, the astrolabe contains carvings of Portugal’s royal coat of arms and a depiction of a ringed Earth that was associated with a Portuguese king who ruled from late 1495 to 1521. Laser imaging of the disk revealed 18 scale marks separated at 5-degree intervals.
The device, used to take altitudes at sea, could have measured from 0 degrees — when the sun is at the horizon — to 90 degrees — when the sun is directly overhead, the team reports in a study published online March 16 in the International Journal of Nautical Archaeology. Only one other solid-disk mariner’s astrolabe has been found, and its authenticity and age are uncertain, say oceanographer David Mearns and colleagues. Mearns directs Blue Water Recoveries in Midhurst, England, a company that locates and studies shipwrecks.
Of 104 artifacts known to have been used as mariner’s astrolabes, the new find is not only the oldest, but also the only one decorated with a national symbol, the researchers say. By the early 1500s, most navigators had adopted more precise, open-wheeled astrolabes.
Approval of the first and only treatment in the United States specifically targeting postpartum depression offers hope for millions of women each year who suffer from the debilitating mental health disorder after giving birth.
The new drug brexanolone — marketed under the name Zulresso and approved on March 19 by the U.S. Food and Drug Administration — is expected to become available to the public in late June. Developed by Sage Therapeutics, based in Cambridge, Mass., the drug is costly and treatment is intensive: It’s administered in the hospital as a 60-hour intravenous drip, and a treatment runs between $20,000 and $35,000. But researchers say that it could help many of the estimated 11.5 percent of U.S. new moms each year who experience postpartum depression, which can interfere with normal bonding between mothers and infants and lead to feeling hopeless, sad or overly anxious. Here’s a closer look at the drug, its benefits and some potential drawbacks.
How does the new drug work? How exactly brexanolone works is not known. But because the drug’s chemical structure is nearly identical to the natural hormone allopregnanolone, it’s thought that brexanolone operates in a similar way.
Allopregnanolone enhances the effects of a neurochemical called gamma-aminobutyric acid, or GABA, which stops nerve cells in the brain from firing. Ultimately this action helps quell a person’s anxiety or stress. During pregnancy, the concentration of allopregnanolone in a woman’s brain spikes. This leads some neurons to compensate by temporarily tuning out GABA so that the nerve cells don’t become too inactive. Levels of the steroid typically return to normal quickly after a baby is born, and the neurons once again respond to GABA shortly thereafter. But for some women, this process can take longer, possibly resulting in postpartum depression.
Brexanolone temporarily elevates the brain’s allopregnanolone levels again, which results in a patient’s mood improving. But it’s still not clear exactly why the drug has this effect, says Samantha Meltzer-Brody, a reproductive psychiatrist at the University of North Carolina School of Medicine in Chapel Hill and the lead scientist of the drug’s clinical trials. Nor is it clear whether allopregnanolone’s, and thus possibly brexanolone’s, influence on GABA affects only postpartum depression. But the drug clearly “has this incredibly robust response,” she says, “unlike anything currently available.”
How effective was the drug in clinical trials? Brexanolone went through three separate clinical trials in which patients were randomly given either the drug or a placebo: one Phase II trial, which tested the drug’s effectiveness and proper dosage, and two Phase III trials, which tested the drug’s effects on moderate or severe postpartum depression and were necessary to gain approval for the drug’s commercial use in people.
Of 234 people who completed the trials, 94 received the suggested dosage of brexanolone. About 70 of those patients, or 75 percent, had what Meltzer-Brody described as a “robust response” to just one course of treatment. And of those patients with positive responses, 94 percent continued to feel well 30 days after the treatment. The results suggest that the drug may be most effective for those with severe postpartum depression; among those with moderate symptoms, the drug and the placebo had a fairly similar impact.
Can people take the drug again? “There’s nothing prohibiting” a second course of brexanolone, but the effects of a repeat course have not been studied, Meltzer-Brody says. The drug was designed to be taken in tandem with the start of antidepressants, which take effect after about two to four weeks. So by the time the brexanolone wears off, the antidepressants would have kicked in.
It’s not clear yet if some patients could need a second dose. The clinical trials compared a group of women taking both antidepressants and brexanolone with another group taking only brexanolone and found no difference in the two groups’ responses 30 days after tests ended, Meltzer-Brody says. Because the study ended at 30 days, it’s unclear if the effects of brexanolone on its own last longer.
Can women breastfeed while taking brexanolone? As a precaution, treated women did not breastfeed until six days after taking the drug. But in tests of breastmilk from 12 treated, lactating women, concentrations of brexanolone in breastmilk were negligible — less than 10 nanograms per milliliter — in most of the women 36 hours after they received the infusion, according to Sage’s briefing document for the FDA. The FDA has yet to issue guidance on breastfeeding.
Are there side effects? About a third of the trial patients experienced sleepiness, sedation or headaches. The possibility of drowsiness led to the FDA’s requirement that the drug be administered by IV drip in a supervised setting. “If someone isn’t supervised, then there would be the risk that someone could get sleepy and pass out,” Meltzer-Brody says.
Are there plans for different versions of the drug? Sage Therapeutics is developing a pill version of a drug called SAGE-217. It’s chemically similar to brexanolone and has similar antidepressant effects. Early results from a Phase III trial reported by the company in January show that, of 78 women treated with the pill, 72 percent responded favorably within two weeks, and 53 percent had not experienced a recurrence of symptoms four weeks later.
Is it worth the price and time? Setting aside 60 hours to be hospitalized for an expensive drug could be discouraging for some. “It’s going to be very important for insurance to cover it in order for it to be accessible,” Meltzer-Brody says. “I’m hoping that will be the case.” But based on the reaction of women with severe postpartum depression who participated in the trials, “two-and-a-half days seems like nothing if your debilitating, depressive symptoms will be gone.”
These are wondrous times for space exploration. Just when you think exploring the cosmos couldn’t possibly get more fun, another discovery delivers a new “oh wow” moment.
Consider the asteroid Bennu. It’s an unprepossessing space rock that drew scientists’ curiosity because it is among the most pristine objects in our solar system, and it might provide clues to the origins of life. But checking out Bennu is no trip to Paris; it’s about 130 million kilometers from Earth. NASA launched its OSIRIS-REx probe to Bennu in 2016, and it didn’t arrive until last December. The spacecraft is currently orbiting its quarry in preparation for an attempt at gathering samples from the asteroid’s surface in 2020 and then toting them back to Earth. Estimated delivery date: September 24, 2023.
Clearly, asteroid science is not a discipline for those with short attention spans. So imagine scientists’ delight when OSIRIS-REx already had news to share: Bennu is squirting jets of dust into space. It’s an asteroid behavior no one had ever seen before.
Astronomy writer Lisa Grossman learned all about Bennu’s surprise jets while attending the Lunar and Planetary Science Conference in March. She reports that the dusty fountains may be the work of volatile gases beneath Bennu’s surface. The presence of volatiles would suggest that the rock wandered into the inner solar system relatively recently. But astronomers still have a lot to figure out about Bennu’s history, and they couldn’t be happier.
In other surprising space rock news from the conference, astronomers analyzing the much-more-distant object dubbed Ultima Thule now think it’s an agglomeration of mini-worlds that stuck together in the early days of the solar system — as Grossman terms it, a “Frankenworld.” That’s just the latest unexpected news from this Kuiper Belt denizen. If you’re as space rock obsessed as we are, you may recall that the first fuzzy images from NASA’s New Horizons spacecraft, which flew by Ultima Thule on January 1, suggested that the rock looked like a bowling pin or a snowman spinning in space. More recent images reveal not a snowman, but instead two pancakes or hamburger patties glued end to end (SN: 3/16/19, p. 15). That has scientists scrambling to figure out what forces could create such an oddly shaped object.
We’ll be hearing more about Bennu, Ultima Thule and other residents of our solar system in the months to come. I’m particularly looking forward to news from the Parker Solar Probe, which is tightening its orbit around the sun. I’m the one who is going to have to be patient in this case, though that’s not an attribute typically associated with journalists. The spacecraft won’t make its closest encounter with the sun until 2024, before ending its mission the following year. But the probe will be reporting in, and we’ll be reporting, too, as it makes this historic journey (SN: 1/19/19, p. 7).
Open your Web browser or your trusty print magazine and join us for the adventure. We hope you’ll enjoy the journey as much as we do.
There are no teensy cups. But a urine test for wild mosquitoes has for the first time proved it can give an early warning that local pests are spreading diseases.
Mosquito traps remodeled with a pee-collecting card picked up telltale genetic traces of West Nile and two other worrisome viruses circulating in the wild, researchers in Australia report April 4 in the Journal of Medical Entomology.
The tests were based on an innovative saliva monitoring system unveiled in 2010: traps that lure mosquitoes into tasting honey-coated cards. Among its advantages, this card-based medical testing doesn’t need the constant refrigeration that checking whole mosquitoes does. And it’s not as labor intensive as monitoring sentinel chickens or pigs for signs of infection.
But testing traces of mosquito saliva left on these cards comes close to the limits of current molecular methods for detecting viruses. In part, it’s an issue of volume. A mosquito drools fewer than five nanoliters of saliva when it tastes a card. In comparison, mosquitoes excrete about 1.5 microliters of liquid per pee, some 300 times as much, offering a veritable flood of material. So Dagmar Meyer of James Cook University in Cairns, Australia, and her colleagues created urine collectors using standard overnight light traps and longer-standing traps that exhale delicious carbon dioxide, a mosquito come-hither.
The team set out 29 urine traps in two insect-rich spots in Queensland along with traps equipped to catch mosquito saliva. When mosquitoes fell for the trick and entered a urine trap, their excretions dripped through a mesh floor onto a collecting card. Adding a moist wick of water kept trapped mosquitoes alive and peeing longer, thus improving the sample. Pee traps picked up three viruses — West Nile, Ross River and Murray Valley encephalitis — while the saliva ones detected two, the researchers report.
No one should have to sleep with the fishes, but new research on zebrafish suggests that we sleep like them.
Sleeping zebrafish have brain activity similar to both deep slow-wave sleep and rapid eye movement, or REM, sleep that’s found in mammals, researchers report July 10 in Nature. And the team may have tracked down the cells that kick off REM sleep.
The findings suggest that the basics of sleep evolved at least 450 million years ago in zebrafish ancestors, before the evolution of animals that give birth to live young instead of laying eggs. That’s 150 million years earlier than scientists thought when they discovered that lizards sleep like mammals and birds (SN: 5/28/16, p. 9).
What’s more, sleep may have evolved underwater, says Louis C. Leung, a neuroscientist at Stanford University School of Medicine. “These signatures [of sleep] really have important functions — even though we may not know what they are — that have survived hundreds of millions of years of evolution.”
In mammals, birds and lizards, sleep has several stages characterized by specific electrical signals. During slow-wave sleep, the brain is mostly quiet except for synchronized waves of electrical activity. The heart rate decreases and muscles relax. During REM or paradoxical sleep, the brain lights up with activity almost like it’s awake. But the muscles are paralyzed (except for rapid twitching of the eyes) and the heart beats erratically.
For many years, scientists have known that fruit flies, nematodes, fish, octopuses and other creatures have rest periods reminiscent of sleep. But until now, no one could measure the electrical activity of those animals’ brains to see if that rest is the same as mammals’ snoozing.
Leung and colleagues developed a system to do just that in zebrafish by genetically engineering them to make a fluorescent molecule that lights up when it encounters calcium, which is released when nerve cells and muscles are active. By following the flashes of light using a light sheet microscope, the researchers tracked brain and muscle activity in the naturally transparent fish larvae.
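For readers curious how such flashes of light become data: the paper's actual analysis pipeline isn't described here, but calcium-imaging traces are commonly summarized as a fractional change in fluorescence over a baseline (often written dF/F0). A minimal sketch of that standard calculation, using made-up values, looks roughly like this:

```python
import numpy as np

# A made-up fluorescence trace from one cell (arbitrary units), for illustration only.
trace = np.array([100, 102, 101, 99, 140, 160, 150, 103, 100, 101], dtype=float)

# Estimate the baseline F0 from a low percentile of the trace, a common simple choice.
f0 = np.percentile(trace, 20)

# Fractional change in fluorescence relative to baseline (dF/F0).
dff = (trace - f0) / f0

# Flag time points where the cell looks "active"; the 0.2 threshold is arbitrary here.
active = dff > 0.2

print(np.round(dff, 2))
print(active.astype(int))
```

The brief rise in the middle of this toy trace would be read as a burst of activity in that cell; tracking many cells this way is how imaging studies build up pictures of brain-wide activity.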
The next task was to lull the fish to sleep under the microscope. In some experiments, the team added drugs that trigger either slow-wave or REM sleep in mammals to the fish’s water. In others, researchers deprived fish of sleep for a night or tuckered the fish out with lots of activity during the day. Results from all the snooze-inducing methods were the same.
The sleeping fish showed two distinct types of brain activity, the team found. One, similar to slow-wave sleep, was characterized by short bursts of activity in some nerve cells in the brain. The researchers call that state slow-bursting sleep. REM-like sleep, which the researchers dubbed “propagating-wave sleep,” was characterized by frenzied brain activity that spreads like a wave through the brain. The researchers aren’t calling the sleep phases REM or slow-wave sleep because there are some minor differences between the way fish and mammals sleep.
A group of cells that line hollow spaces called ventricles deep in the brain seems to trigger that wave of REM-like brain activity. These ependymal cells dip fingerlike cilia into the cerebrospinal fluid that bathes the ventricles and the central nervous system. The cells appear to beat their cilia faster as amounts of a well-known, sleep-promoting hormone called melanin-concentrating hormone in the fluid increase, the researchers discovered. It’s unclear how the ependymal cells communicate with the rest of the brain to set off REM-like activity. Such cells are also present in mammals, but no one has yet been able to see that deeply into the brains of sleeping mammals to determine whether the cells play a role in sleep. But knowing about these cells may help researchers develop better sleep aids, Leung says.
Just as in mammals, zebrafish’s whole bodies are affected during sleep. Their muscles relax, and their hearts slow from about 200 beats per minute when awake to about 110 to 120 beats per minute during the slow-wave–like sleep. During the REM-like sleep, the heart slows even more, to about 90 beats per minute, and loses its regular rhythm. The fish’s muscles also go completely slack. The one characteristic that the fish lack is rapid eye movement. Instead, the eyes roll back into their sockets, says study coauthor Philippe Mourrain, a biologist at Stanford University School of Medicine.
Lack of eye movement could indicate that emotion-processing parts of the brain, such as the amygdala, aren’t as active in zebrafish as they are in mammals, says sleep researcher Allan Pack of the University of Pennsylvania Perelman School of Medicine. With their brain-activity monitoring, the researchers have taken sleep research “to the next level,” says Pack, and “they present pretty compelling evidence” of slow-wave and REM-like sleep in the fish.
The whole-body involvement that the researchers documented solidifies the argument that fish sleep is similar to that of mammals, says neuroscientist Paul Shaw of Washington University School of Medicine in St. Louis. In all organisms known to snooze, “sleep is manifest everywhere” in the body, he says.
Future experiments may show why poor sleep or a lack of Zs contributes to health problems in people, such as obesity, heart disease and diabetes.
Ice sheets expanded across much of northern Europe from around 25,000 to 19,000 years ago, making a huge expanse of land unlivable. That harsh event set in motion a previously unrecognized tale of two human populations that played out at opposite ends of the continent.
Western European hunter-gatherers outlasted the icy blast. Their eastern counterparts were replaced by waves of newcomers.
That’s the implication of the largest study to date of ancient Europeans’ DNA, covering a period before, during and after what’s known as the Last Glacial Maximum, paleogeneticist Cosimo Posth and colleagues report March 1 in Nature. As researchers have long thought, southwestern Europe provided refuge from the last Ice Age’s big chill for hunter-gatherers based in and near that region, the scientists say. But it turns out that southern Europe, where Italy is now located, did not offer lasting respite from the cold for nearby groups, as previously assumed.
Instead, those people were replaced by genetically distinct hunter-gatherers who presumably had lived just to the east along the Balkan Peninsula. Those people, who carried ancestry from parts of southwestern Asia, began trekking into what’s now northern Italy by about 17,000 years ago, as the Ice Age began to wane.
“If local [Ice Age] populations in Italy did not survive and were replaced by groups from the Balkans, this completely changes our interpretation of the archaeological record,” says Posth, of the University of Tübingen in Germany.
Posth and colleagues’ conclusions rest on analyses of DNA from 356 ancient hunter-gatherers, including new molecular evidence for 116 individuals from 14 countries in Europe and Asia. Excavated human remains that yielded DNA dated to between about 45,000 and 5,000 years ago (SN: 4/7/21).
Comparisons of sets of gene variants inherited by these hunter-gatherers from common ancestors enabled the researchers to reconstruct population movements and replacements that shaped ancient Europeans’ genetic makeup. For the first time, ancient DNA evidence included individuals from what’s known as the Gravettian culture, which dates from about 33,000 to 26,000 years ago in central and southern Europe, and from southwestern Europe’s Solutrean culture, which dates to between about 24,000 and 19,000 years ago. Contrary to expectations, makers of Gravettian tools came from two genetically distinct groups that populated western and eastern Europe for roughly 10,000 years before the Ice Age reached its peak, Posth says. Researchers have traditionally regarded Gravettian implements as products of a biologically uniform population that occupied much of Europe.
“What we previously thought was one genetic ancestry in Europe turned out to be two,” says paleogeneticist Mateja Hajdinjak of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who did not participate in the new study. And “it seems that western and southwestern Europe served as a [refuge from glaciation] more than southeastern Europe and Italy.”
Descendants of the western Gravettian population, who are associated with Solutrean artifacts and remnants of another ancient culture in western Europe that ran from about 19,000 to 14,000 years ago, outlasted the Ice Age before spreading northeastward across Europe, the researchers say.
Further support for southwestern Europe as an Ice Age refuge comes from DNA extracted from a pair of fossil teeth that belonged to an individual linked to the Solutrean culture in southern Spain. That roughly 23,000-year-old adult was genetically similar to western European hunter-gatherers who lived before and after the Last Glacial Maximum, Max Planck paleogeneticist Vanessa Villalba-Mouco and colleagues, including Posth, report March 1 in Nature Ecology & Evolution.
Meanwhile, the genetic evidence suggests that hunter-gatherers in what’s now Italy were replaced by people from farther east, probably based in the Balkan region. Those newcomers must have brought with them a distinctive brand of stone artifacts, previously excavated at Italian sites and elsewhere in eastern Europe, known as Epigravettian tools, Posth says. Many archaeologists have suspected that Epigravettian items were products of hunter-gatherers who clustered in Italy during the Ice Age’s peak freeze.
But, Hajdinjak says, analyses of DNA from fossils of Ice Age Balkan people are needed to clarify what groups moved through Italy, and when those migrations occurred.
Ultimately, descendants of Ice Age migrants into Italy reached southern Italy and then western Europe by around 14,000 years ago, Posth and colleagues say. Ancient DNA evidence indicates that, during those travels, they left a major genetic mark on hunter-gatherers across Europe.
Like so many others, I’ve been watching the HBO series The Last of Us. It’s a classic zombie apocalypse drama following Joel (played by Pedro Pascal) and Ellie (Bella Ramsey) as they make their way across the former United States (now run by a fascist government called Fedra).
I’m a big fan of zombie and other post-apocalyptic fiction. And my husband had told me how good the storyline is in the video game that inspired the series, so I was prepared for interesting storytelling. What I didn’t expect was to be so intrigued by the science behind the sci-fi.
In the opening minutes of the series, two scientists on a fictional 1968 talk show discuss the microbes that give them pandemic nightmares. One says it’s fungi — not viruses or bacteria — that keep him awake. Especially worrisome, he says, are the fungi that control rather than kill their hosts. He gives the example of fungi that turn ants into living zombies, puppeteering the insects by flooding their brains with hallucinogens.
He goes on to warn that even though human body temperature keeps us fungus-free, that might not be true if the world got a little bit warmer. He predicts that as the thermostat climbs, a fungus that hijacks insects could mutate a gene allowing it to burrow into human brains and take control of our minds. Such a fungus could induce its human puppets to spread the fungus “by any means necessary,” he says. What’s worse, there are no preventatives, treatments or cures, nor any way to make them.
It’s a brief segment, but it had me hooked. It all sounded so chilling and … plausible. After all, fungi like ones that cause nail infections, yeast infections and ringworm already infect people.
So I consulted some experts on fungal infections to find out whether this could actually happen.
I’ve got good news and bad news.
First, the bad news.
Bad news: Climate change has already helped one fungus mutate to infect humans
I wanted to know if warming has spurred any fungi to mutate and become infectious. So I called Arturo Casadevall. He has been thinking about fungi and heat for a long time. He’s proposed that the need to avoid fungal infections may have provided the evolutionary pressure that drove mammals and birds to evolve warm-bloodedness (SN: 12/3/10).
Most fungal species simply can’t reproduce at human body temperature (37° Celsius, or 98.6° Fahrenheit). But as the world warms, “these strains either have to die or adapt,” says Casadevall, a microbiologist who specializes in fungal infections at Johns Hopkins Bloomberg School of Public Health. That raises the possibility that fungi that now infect insects or reptiles could evolve to grow at temperatures closer to human body temperature.
At the same time, humans’ average body temperature has been falling since the 19th century, at least in high-income countries, researchers reported in eLife in 2020. One study from the United Kingdom pegs average body temperature at 36.6° C (97.9° F). And some of us are even cooler.
Fungi’s possible adaptation to higher heat and humans’ cooling body temperature are on a collision course, Casadevall says. He and colleagues presented evidence of one such crash. Climate change may have allowed a deadly fungus called Candida auris to acclimate to human body temperatures (SN: 7/26/19). A version of the fungus that could infect humans independently emerged on three continents from 2012 to 2015. “It’s not like someone took a plane and spread it. These things came out of nowhere simultaneously,” Casadevall says.
Some people argue that the planet hasn’t warmed enough to make fungi a problem, he says. “But you have to think about all the really hot days [that come with climate change]. Every really hot day is a selection event” in which many fungi will die. But some of those fungi will have mutations that help them handle the heat. Those will survive. Their offspring may be able to survive future, even hotter heat waves, until human body temperature poses no challenge.
Fungi that infect people are usually not picky about their hosts, Casadevall says. They will grow in soil or — if given an opportunity — in people, pets or in other animals. The reason fungi don’t infect people more often is that “the world is much colder than we are, and they have no need of us,” he says.
When people do get infected, the immune system usually keeps the fungi in check. But fungal infections can cause serious illness or be deadly, particularly to people with weakened immune systems (SN: 11/29/21; SN: 1/10/23).
The second episode of The Last of Us reveals that the zombie-creating fungi initially spread through people eating contaminated flour. Then, the infected people attack and bite others, spreading the fungus.
In real life, most human infections arise from breathing in spores. But Casadevall says it’s “not implausible” that people could get infected by eating spores or by being bitten.
Also bad: Fungal genes can adapt to higher heat
I also wondered exactly how a fungus could evolve in response to heat. Asiya Gusa, a fungal researcher at Duke University School of Medicine, has published one possibility.
In 2020, she and colleagues reported in the Proceedings of the National Academy of Sciences on how one fungus mutated at elevated temperature to become harder to fight.
Cryptococcus deneoformans, which already infects humans (though it’s no zombie-maker), became resistant to some antifungal drugs when grown at human body temperature. The resistance was born when mobile bits of DNA called transposons (often called jumping genes) hopped into a few genes needed for the antifungals to work.
In a follow-up study, Gusa and colleagues grew C. deneoformans at either 30° C or 37° C for 800 generations, long enough to detect multiple changes in their DNA. Fungi had no problem growing at the balmy 30° C (86° F), the temperature at which researchers typically grow fungi in the lab. But their growth slowed at the higher temperature, a sign that the fungi were under stress from the heat.
In C. deneoformans, that heat stress really got things jumping. One type of transposon accumulated a median of 12 extra copies of itself in fungi grown at body temperature. By contrast, fungi grown at 30° C tended to pick up a median of only one extra copy of the transposon. The team reported those results January 20 in PNAS. The researchers don’t yet know the effect the transposon hops might have on the fungi’s ability to infect people, cause disease or resist fungus-fighting drugs.
So yeah, the bad news is not great. Fungi are mutating in the heat and at least one species has gained the ability to infect people thanks to climate change. Other fungi that infect people are more widespread than they were in the 1950s and 1960s, also thanks to a warming world (SN: 1/4/23).
But I promised good news. And here it is.
Good news: Human brains may resist zombification
It may not be our body temperature, but our brain chemistry, that protects us from being hijacked by zombifying fungi.
I consulted Charissa de Bekker and Jui-Yu Chou, two researchers who study the Ophiocordyceps fungi that are the model for the TV show’s fungal menace. These fungi infect ants, flooding the insects with a cocktail of chemicals that steer the ants to climb plants. Once in position, the ants chomp down and the chemicals keep the jaw muscles locked in place (SN: 7/17/19).
Unlike most fictional zombies, the ants are alive during this process. “A lot of people get the misconception that we work on undead ants,” says de Bekker, a microbiologist at Utrecht University in the Netherlands. She’s glad to see the show “stick to the story of the host being very much alive while its behaviors change.” The fungi even help preserve the ant, keeping it alive even while feeding on it. But eventually the ant dies. Then a mushroom rises from the corpse, showering spores onto the ground where other ants may become infected.
Related species of Ophiocordyceps infect various species of ants and other insects. But each fungal species is very specific to the host it infects. That’s because the fungi had to individualize the chemicals they use to control the particular species they infect. The ability to manipulate behavior comes at the cost of not being able to infect multiple species. A fungus that specializes in infecting ants probably can’t get past humans’ immune systems, says Chou, a fungal researcher at the National Changhua University of Education in Taiwan. “Think of a key that fits into a specific lock. It is only this unique combination that will trigger the lock to open,” he says.
Even if the fungi evolved to withstand human body temperature and immune system attacks, they probably couldn’t take control of our minds, de Bekker says. “Manipulation is like a whole different ballgame. You need a ton of additional tools to get there.” It took millions of years of coevolution for the fungi to master piloting ants, after all.
While fungi do make mind-altering chemicals that can affect human behavior (LSD and psilocybin, for instance), Casadevall agrees that fungi that mind control insects probably won’t turn humans into zombies. “It’s not one of my worries,” he says.
Infected ants don’t turn into vicious, biting zombies either, de Bekker says. “If anything, we actually see the healthy ants being aggressive toward infected individuals, once they figure out that they’re infected, to basically get rid of them.” That “social immunity” helps protect the rest of the nest from infection.
Also good: Humans are innovative enough to develop treatments
The fictional scientist’s assertion that we couldn’t prevent, treat or cure these fungal infections is also a stretch.
Antifungal drugs exist and they cure many fungal infections, though some infections may persist. Some that spread to the brain may be particularly difficult to clear. Some fungi are also evolving resistance to the drugs. And a few fungal vaccines are in the works, although they may not be ready for years.
The experts I talked to say they hope the show will bring attention to real fungal diseases.
Gusa was especially glad to see fungi in the limelight. And she shares my fondness for that retro series opening in which the scientist predicts climate change could spawn mind-controlling fungi bent on infecting every person on the planet.
“I was pretty much yelling at the TV when I watched the [show’s] intro,” in an excited kind of way, she says. “This is the foundation of a lot of my grant funding … the threat of thermal adaptation of fungi.… To see it played out on the screen was something kind of fun.”
Fledgling crustaceans have eyes like the sea, a peculiarity that could help them hide from predators.
Young shrimp, crab or lobster larvae already rock nearly translucent bodies to stay out of view. But dark eye pigments essential for vision pose the risk of exposing the animals anyway.
Some see-through ocean animals rely on mirrored irises or minuscule eyes to avoid detection. Young shrimp and prawns, on the other hand, camouflage their dark pigments behind light-reflecting glass made of tiny, crystalline spheres, researchers report in the Feb. 17 Science. Variations in the size and placement of the orbs allow the crustaceans’ eyes to shine light that precisely matches the color of the surrounding water, possibly rendering them invisible to predators on the hunt for a meal.
Technologies that mimic the nanospheres’ structure could one day inspire more efficient solar energy or bio-friendly paints, the scientists say.
“I’ve often wondered what’s going on with [these animals’] eyeshine,” says evolutionary biologist Heather Bracken-Grissom of Florida International University in Miami, who was not involved in the study. She and colleagues often collect crustaceans from the deep sea, giving them nicknames like “blue-eyed arthropod” or “green-eyed, weird-looking shrimp” because the creatures don’t resemble their adult forms. Now, she says, that eye color makes sense.
In the study, chemist Keshet Shavit and colleagues used an electron microscope to peer into the eyes of lab-raised and wild crustaceans. Inside shrimp and prawn eyes, the team found crystalline nanospheres made of isoxanthopterin, a molecule that reflects light.
The spheres are a bit like disco balls, with highly reflective surfaces pointing outward, says study coauthor Benjamin Palmer, a chemist at Ben-Gurion University of the Negev in Beer-Sheva, Israel. Each sphere is made of thin, isoxanthopterin plates that stick together to form balls that range in size from around 250 to 400 nanometers in diameter.
These balls are arranged in clusters at the base of protein-dense cones that focus light on the animal’s light-sensing nerves, and form a protective cover over the pigmented cells. But crustacean larvae can still see because there are small holes in the glass, Palmer says. “It’s basically allowing light to go down to the retina on some specific angles, but on other angles, it’s reflecting light back.” The size and order of the spheres seem to influence the color of the reflected light, the team’s observations and computer simulations show.
“The correlation between the particle size and the eyeshine color is beyond amazing,” says Shavit, also at Ben-Gurion University. Nanosphere size appears to help the animals’ eyes match the color of their native habitat, helping the critters blend into the background.
Blue-eyed shrimp that inhabit the Gulf of Aqaba’s clear blue waters off the coast of Israel, for instance, have spheres that are approximately 250 to 325 nanometers in diameter. The 400-nanometer-wide spheres of a freshwater prawn (Macrobrachium rosenbergii) glitter yellow-green, mimicking muddy waters found in the salty estuaries where they live.
The prawn’s eyes also seem to be able to reflect different colors in different environments. Individuals exposed to sunlight for four hours in the lab had silvery yellow eyes, possibly a result of nanospheres arranged in a disorganized jumble. But individuals left in the dark overnight had green eyes. Their nanospheres are arranged in layers — though the orbs within each layer are still disorganized, Palmer says.
Such adaptable eyes could help larvae move undetected through different parts of the ocean as changing light levels alter the color of the water, Bracken-Grissom says. At night, young crustaceans migrate to shallow waters to feed and dive back down when the sun rises. “If they are in fact using it as a form of camouflage, it would be an ingenious way to camouflage themselves as they move through these different light environments.”
Tanklike armored dinosaurs probably pummeled each other — not just predators — with huge, bony knobs attached to the ends of their tails. Thanks to new fossil findings, researchers are getting a clearer understanding of how these rugged plant eaters may have used their wicked weaponry.
Many dinosaurs known as ankylosaurids sported a heavy, potentially microwave-sized tail club. This natural sledgehammer has long been depicted by scientists and artists alike as a defensive weapon against predators, says Victoria Arbour, a paleontologist at the Royal British Columbia Museum in Victoria, Canada.
Fossil evidence for tail clubs’ targets was largely lacking, until Arbour and her colleagues chipped more rock away from the same skeleton they used to describe a new armored dinosaur, Zuul crurivastator, in 2017 (SN: 6/12/17).
The dinosaur had five broken spikes on its sides. The team’s statistical analyses showed the damaged spikes clustered in specific regions of the body. If a large carnivorous dinosaur made these injuries, says Arbour, they’d likely be more randomly distributed and include bite and scratch marks. Instead, the injuries are more consistent with clubbing, the researchers report December 7 in Biology Letters.
Armored dinosaurs’ tail clubs start out either absent or too tiny to mount a major defense, and they get proportionally larger with age. Similar growth patterns occur in some modern animal weaponry like antlers. It’s possible that tanklike dinosaurs sparred with each other for mates, food or territory much like male deer and giraffes do today.
And that tail could also be useful in a pinch. “Having a tail club you can swing around at the ankles of a two-legged predator is a pretty effective weapon,” says Arbour.
“Ankylosaurs are often portrayed as stupid, loner dinosaurs,” she adds. The findings “show that they probably had much more complex behaviors than we give them credit for.”