Gene editing of human embryos yields early results

Scientists have long sought a strategy for curing genetic diseases, but — with just a few notable exceptions — have succeeded only in their dreams. Now, though, researchers in China and Texas have taken a step toward making that fantasy a reality.

Using the gene-editing tool known as CRISPR/Cas9, the researchers have successfully edited disease-causing mutations out of viable human embryos. Other Chinese groups had previously reported editing human embryos that could not develop into a baby because they carried extra chromosomes, but this is the first report involving viable embryos (SN Online: 4/8/16; SN Online: 4/23/15).
In the new work, reported March 1 in Molecular Genetics and Genomics, Jianqiao Liu of Guangzhou Medical University in China and colleagues used embryos with a normal number of chromosomes. The embryos were created using eggs and sperm left over from in vitro fertilization treatments. In theory, the embryos could develop into a baby if implanted into a woman’s uterus.

Researchers in Sweden and England are also conducting gene-editing experiments on viable human embryos (SN: 10/29/16, p. 15), but those groups have not yet reported results.

Human germline editing wasn’t realistic until CRISPR/Cas9 and other new gene editors came along, says R. Alta Charo, a bioethicist at the University of Wisconsin Law School in Madison. “We’ve now gotten to the point where it’s possible to imagine a day when it would be safe enough” to be feasible. Charo was among the experts on a National Academies of Sciences and Medicine panel that in February issued an assessment of human gene editing. Altering human embryos, eggs, sperm or the cells that produce eggs and sperm would be permissible, provided there were no other alternatives and the experiments met other strict criteria, the panel concluded (SN: 3/18/17, p. 7).
Still, technical hurdles remain before CRISPR/Cas9 can cross into widespread use in treating patients.

CRISPR/Cas9 comes in two parts: a DNA-cutting enzyme called Cas9, and a “guide RNA” that directs Cas9 to cut at a specified location in DNA. Guide RNAs work a little like a GPS system, says David Edgell, a molecular biologist at Western University in London, Ontario. Given precise coordinates or a truly unique address, a good GPS should take you to the right place every time.

Scientists design guide RNAs so that they will carry Cas9 to only one stretch of about 20 bases (the information-carrying subunits of DNA) out of the entire 6 billion base pairs that make up the human genetic instruction book, or genome. But most 20-base locations in the human genome aren’t particularly distinctive. They are like Starbucks coffee shops: There are a lot of them and they are often similar enough that a GPS might get confused about which one you want to go to, says Edgell. Similarly, guide RNAs sometimes direct Cas9 to cut alternative, or “off-target,” sites that are a base or two different from the intended destination. Off-target cutting is a problem because such edits might damage or change genes in unexpected ways.
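The matching problem can be sketched in a few lines of code. The sequences below are invented toy data, and real CRISPR targeting depends on factors (such as the PAM sequence) not modeled here; this is only a minimal illustration of how a 20-base guide can hit both its exact target and a near-match "off-target" site:

```python
def hamming(a, b):
    """Count mismatched bases between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def find_cut_sites(genome, guide, max_mismatches=2):
    """Return every position where the guide matches to within
    max_mismatches bases: the exact target plus any off-target sites."""
    n = len(guide)
    return [i for i in range(len(genome) - n + 1)
            if hamming(genome[i:i + n], guide) <= max_mismatches]

# Toy 46-base "genome": an exact copy of the guide at position 2 and a
# two-mismatch near-copy at position 24 (both sequences are invented).
guide = "ACGTACGTACGTACGTACCA"                        # 20 bases
genome = "TT" + guide + "GG" + "ACGTACGTACGTACGTAGGA" + "TT"

print(find_cut_sites(genome, guide))  # both sites found: [2, 24]
```

A perfectly specific guide would return only one position; the second hit here is the kind of near-match site that high-fidelity Cas9 variants are designed to ignore.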

“It’s a major issue for sure,” says Bruce Korf, a geneticist at the University of Alabama at Birmingham and president of the American College of Medical Genetics and Genomics Foundation. Doctors trying to correct one genetic defect in a patient want to be sure they aren’t accidentally introducing another.

But CRISPR/Cas9’s propensity to cut undesired sites may be exaggerated, says Alasdair MacKenzie, a molecular biologist at the University of Aberdeen in Scotland. In experiments with mice, MacKenzie and colleagues limited how much Cas9 was produced in cells and made sure the enzyme didn’t stick around after it made an edit. No off-target cuts were detected in any of the mice resulting from successfully edited embryos, MacKenzie and colleagues reported in November in Neuropeptides.

Other researchers have experimented with assembling the Cas9 and guide RNAs outside of the cell and then putting the preassembled protein-RNA complex into cells. That’s the strategy the Chinese researchers took in the new human embryo–editing study. No off-target cuts were detected in that study either, although only one edited embryo was closely examined.

Other researchers have been tinkering with the genetic scissors to produce high-fidelity versions of Cas9 that are far less likely to cut at off-target sites in the first place.

When a guide RNA leads Cas9 to a site that isn’t a perfect match, the enzyme can latch onto DNA’s phosphate backbone and stabilize itself enough to make a cut, says Benjamin Kleinstiver, a biochemist in J. Keith Joung’s lab at Harvard Medical School. By tweaking Cas9, Kleinstiver and colleagues essentially eliminated the enzyme’s ability to hold on at off-target sites, without greatly harming its on-target cutting ability.

Regular versions of Cas9 cut between two and 25 off-target sites for seven guide RNAs the researchers tested. But the high-fidelity Cas9 worked nearly flawlessly for those guides. For instance, high-fidelity Cas9 reduced off-target cutting from 25 sites to just one for one of the guide RNAs, the researchers reported in January 2016 in Nature. That single stray snip, however, could be a problem if the technology were to be used in patients.
A group led by CRISPR/Cas9 pioneer Feng Zhang of the Broad Institute of MIT and Harvard tinkered with different parts of the Cas9 enzyme. That team also produced a cutter that rarely cleaved DNA at off-target sites, the team reported last year in Science.

Another problem for gene editing has been that it is good at disabling, or “knocking out,” genes that are causing a problem but not at replacing genes that have gone bad. Knocking out a gene is easy because all Cas9 has to do is cut the DNA. Cells generally respond by gluing the cut ends back together. But, like pieces of a broken vase, they rarely fit perfectly again. Small flaws introduced in the regluing can cause the problem gene to produce nonfunctional proteins. Knocking out genes may help fight Huntington’s disease and other genetic disorders caused by single, rogue versions of genes.

Many genetic diseases, such as cystic fibrosis or Tay-Sachs, are caused when people inherit two mutated, nonfunctional copies of the same gene. Knocking those genes out won’t help. Instead, researchers need to insert undamaged versions of the genes to restore health. Inserting a gene starts with cutting the DNA, but instead of gluing the cut ends together, cells use a matching piece of DNA as a template to repair the damage.

In the new human embryo work, Liu and colleagues, including Wei-Hua Wang of the Houston Fertility Institute in Texas, first tested this type of repair on embryos with an extra set of chromosomes. Efficiency was low; about 10 to 20 percent of embryos contained the desired edits. Researchers had previously argued that extra chromosomes could interfere with the editing process, so Liu’s group also made embryos with the normal two copies of each chromosome (one from the father and one from the mother). Sperm from men who have genetic diseases common in China were used to fertilize eggs. In one experiment, Liu’s group made 10 embryos, two of which carried a mutation in the G6PD gene. Mutations in that gene can lead to a type of anemia.

Then the team injected Cas9 protein already leashed to its guide RNA, along with a separate piece of DNA that embryos could use as a template for repairing the mutant gene. G6PD mutations were repaired in both embryos. Since both embryos carried the repair, the researchers say they achieved 100 percent efficiency. But one embryo was a mosaic: It carried the fix in some but not all of its cells. Another experiment to repair mutations in the HBB gene, linked to blood disorders, worked with 50 percent efficiency, but with some other technical glitches.

Scientists don’t know whether editing just some cells in an embryo will be enough to cure genetic diseases. For that reason, some researchers think it may be necessary to step back from embryos to edit the precursor cells that produce eggs and sperm, says Harvard University geneticist George Church. Precursor cells can produce many copies of themselves, so some could be tested to ensure that proper edits have been made with no off-target mutations. Properly edited cells would then be coaxed into forming sperm or eggs in lab dishes. Researchers have already succeeded in making viable sperm and eggs from reprogrammed mouse stem cells (SN: 11/12/16, p. 6). Precursors of human sperm and eggs have also been grown in lab dishes (SN Online: 12/24/14), but researchers have yet to report making viable human embryos from such cells.

The technology to reliably and safely edit human germline cells will probably require several more years of development, researchers say.

Germline editing — as altering embryos, eggs and sperm or their precursors is known — probably won’t be the first way CRISPR/Cas9 is used to tackle genetic diseases. Doctors are already planning experiments to edit genes in body cells of patients. Those experiments come with fewer ethical questions but have their own hurdles, researchers say.

“We still have a few years to go,” says MacKenzie, “but I’ve never been so hopeful as I am now of the capacity of this technology to change people’s lives.”

Shock-absorbing spear points kept early North Americans on the hunt

Ancient North Americans hunted with spear points crafted to absorb shock.

Clovis people, who crossed a land bridge from Asia to North America around 13,500 years ago, fashioned stone weapons that slightly crumpled at the base rather than breaking at the tip when thrust into prey, say civil engineer Kaitlyn Thomas of Southern Methodist University in Dallas and colleagues. The Clovis crumple rested on a toolmaking technique called fluting, in which a thin groove was chipped off both sides of a stone point’s base, the researchers report in the May Journal of Archaeological Science.
Computer models and pressure testing of replicas of fluted and unfluted Clovis points support the idea that fluted bases worked like shock absorbers, preventing tip breakage, the scientists conclude. Slight compression and folding of stone at the base of fluted points after an impact did not cause enough damage to prevent the points from being reused, they say.

“Fluted Clovis points have a shock-absorbing property that increases their durability, which fit a population that needed reliable weapons on a new, unknown continent,” says archaeologist and study coauthor Metin Eren of Kent State University in Ohio. While Clovis people weren’t the first New World settlers (SN: 6/11/16, p. 8), they roamed throughout much of North America. Individuals traveled great distances to find food and move among seasonal camps, Eren says.

Computer models run by Thomas, Eren and colleagues indicated that unlike unfluted points, fluted points increasingly divert pressure away from the tip and toward the base as physical stress on the weapon grows. Computerized, 3-D versions of fluted Clovis points exposed to high-impact pressure crumpled at the base, leaving the tip intact. Unfluted replicas, however, frequently broke at the tip.
Comparable results emerged when the researchers tested 60 fluted and unfluted stone replicas of Clovis points in a viselike machine that applied precise pressures. Each replica was the same size and represented the average outline shape of 241 previously excavated Clovis points. Standardized replicas enabled researchers to focus solely on whether fluting affected how Clovis points react to physical stress.
Fluted Clovis points may have been attached to handles or long shafts in ways that also enhanced the resilience of a weapon’s business end, Eren says. But no such handles, or even materials used to bind Clovis points to handles, have been discovered.

The fluted points would have taken patience and experience to produce, says Eren, himself a crafter of the stone tools. Previous finds suggest that as many as one out of five Clovis points broke as fluted sections were prepared. If all goes well for an experienced toolmaker, it takes 40 to 50 minutes to produce a fluted Clovis point, he estimates.

Fluting techniques became increasingly elaborate until the practice was abandoned around 9,500 years ago. At that time, familiarity with North America’s landscapes and stone sources triggered a shift to making unfluted spear points designed to kill more effectively, but not necessarily to last, Eren suspects. Some of those stone points may have been intended to shatter on impact, creating shrapnel-like wounds, he says.

Searching for signs of crumpling and crushing on the bottoms of early and later fluted Clovis points could help researchers see if the tools always worked as shock absorbers, says archaeologist Ashley Smallwood of the University of West Georgia in Carrollton.

Researchers have previously proposed that fluting represented a stylistic twist with no practical impact, or that it was a way for toolmakers to advertise their skills and suitability as mates, or that it was part of prehunt rituals. The new results provide an intriguing practical explanation for the technique’s popularity that deserves further study, says archaeologist Daniel Amick of Loyola University Chicago. Aside from their durability, fluted points may have held symbolic meaning for Clovis people, he adds. For instance, if ancient Americans didn’t fully grasp how fluting strengthens stone points, they could have incorporated the technique into supernatural explanations for the success of hunts, Amick suggests.

Lakes worldwide feel the heat from climate change

About 40 kilometers off Michigan’s Keweenaw Peninsula, in the waters of Lake Superior, rises the stone lighthouse of Stannard Rock. Since 1882, it has warned sailors in Great Lakes shipping lanes away from a dangerous shoal. But today, Stannard Rock also helps scientists monitor another danger: climate change.

Since 2008, a meteorological station at the lighthouse has been measuring evaporation rates at Lake Superior. And while weather patterns can change from year to year, Lake Superior appears to be behaving in ways that, to scientists, indicate long-term climate change: Water temperatures are rising and evaporation is up, which leads to lower water levels in some seasons. That’s bad news for hydropower plants, navigators, property owners, commercial and recreational fishers and anyone who just enjoys the lake.
When most people think of the physical effects of climate change, they picture melting glaciers, shrinking sea ice or flooded coastal towns (SN: 4/16/16, p. 22). But observations like those at Stannard Rock are vaulting lakes into the vanguard of climate science. Year after year, lakes reflect the long-term changes of their environment in their physics, chemistry and biology. “They’re sentinels,” says John Lenters, a limnologist at the University of Wisconsin–Madison.

Globally, observations show that many lakes are heating up — but not all in the same way or with the same ecological consequences. In eastern Africa, Lake Tanganyika is warming relatively slowly, but its fish populations are plummeting, leaving people with less to eat. In the U.S. Upper Midwest, quicker-warming lakes are experiencing shifts in the relative abundance of fish species that support a billion-dollar-plus recreational industry. And at high global latitudes, cold lakes normally covered by ice in the winter are seeing less ice year after year — a change that could affect all parts of the food web, from algae to freshwater seals.

Understanding such changes is crucial for humans to adapt to the changes that are likely to come, limnologists say. Indeed, some northern lakes will probably release more methane into the air as temperatures rise — exacerbating the climate shift that is already under way.
Lake layers
Lakes and ponds cover about 4 percent of the land surface not already covered by glaciers. That may sound like a small fraction, but lakes play a key role in several planetary processes. Lakes cycle carbon between the water’s surface and the atmosphere. They give off heat-trapping gases such as carbon dioxide and methane, while simultaneously tucking away carbon in decaying layers of organic muck at lake bottoms. They bury nearly half as much carbon as the oceans do.

Yet the world’s more than 100 million lakes are often overlooked in climate simulations. That’s surprising, because lakes are far easier to measure than oceans. Because lakes are relatively small, scientists can go out in boats or set out buoys to survey temperature, salinity and other factors at different depths and in different seasons.

A landmark study published in 2015 aimed to synthesize these in-water measurements with satellite observations for 235 lakes worldwide. In theory, lake warming is a simple process: The hotter the air above a lake, the hotter the waters get. But the picture is far more complicated than that, the international team of researchers found.
On average, the 235 lakes in the study warmed at a rate of 0.34 degrees Celsius per decade between 1985 and 2009. Some warmed much faster, like Finland’s Lake Lappajärvi, which soared nearly 0.9 degrees each decade. A few even cooled, such as Blue Cypress Lake in Florida. Puzzlingly, there was no clear trend in which lakes warmed and which cooled. The most rapidly warming lakes were scattered across different latitudes and elevations.

Even some that were nearly side by side warmed at different rates from one another — Lake Superior, by far the largest of the Great Lakes, is warming much more rapidly, at a full degree per decade, than others in the chain, although Huron and Michigan are also warming fast.

“Even though lakes are experiencing the same weather, they are responding in different ways,” says Stephanie Hampton, an aquatic biologist at Washington State University in Pullman.

Such variability makes it hard to pin down what to expect in the future. But researchers are starting to explore factors such as lake depth and lake size (intuitively, it’s less teeth-chattering to swim in a small pond in early summer than in a big lake).

Depth and size play into stratification, the process through which some lakes separate into layers of different temperatures. Freshwater is densest at 4° C, just above freezing. In spring, using the Great Lakes as an example, the cold surface waters begin to warm; when they reach 4°, they become dense enough to sink. The lake’s waters mix freely and become much the same temperature at all depths.
But then, throughout the summer, the upper waters heat up relatively quickly. The lake stops mixing and instead separates into layers, with warm water on top and cold, dense water at the bottom. It stays that way until autumn, when chilly air temperatures cool the surface waters to 4°. The newly dense waters sink again, mixing the lake for the second time of the year.
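The density behavior driving this twice-yearly mixing can be sketched numerically. The quadratic fit below is an illustrative assumption (freshwater density actually follows a more complex empirical curve), but it captures the key fact the article describes: density peaks near 4° C, so water at that temperature sinks through both colder and warmer layers.

```python
def water_density(temp_c):
    """Approximate freshwater density in kg/m^3 near the 4 degC maximum.
    A simple illustrative quadratic fit, not a standard formula."""
    return 999.97 - 0.007 * (temp_c - 4.0) ** 2

# Water at 4 degC is denser than water that is either colder or warmer,
# so 4-degree surface water sinks through winter-chilled and
# summer-warmed layers alike: the engine of spring and fall mixing.
for t in (0.0, 2.0, 4.0, 10.0, 20.0):
    print(f"{t:5.1f} degC -> {water_density(t):8.3f} kg/m^3")
```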

Lake Superior is warming so quickly because it is stratifying earlier and earlier each year. It used to separate into its summer layers during mid- to late July, on average. But rising air temperatures mean that it is now stratifying about a month earlier — giving the shallow surface layers much more time to get toasty each summer. “If you hit that starting point in June, now you’ve got all summer to warm up that top layer,” Lenters says.

Deep lakes warm very slowly in the spring, and small changes in water temperature at the end of winter can lead to large changes in the timing of summer stratification for these lakes. Superior is about 406 meters deep at its greatest point, so it is particularly vulnerable to such shifts.

In contrast, shallow lakes warm much more quickly in the spring, so the timing of their summer stratification is much less variable than for deep lakes. Lake Erie is only 64 meters deep at its maximum, which is why Erie is not experiencing big changes in its stratification start date. Erie is warming one-tenth as fast as Superior, just 0.1 degrees per decade.

Superior is also warming because of a decline in cloud cover over the Great Lakes in recent years; more heat from solar radiation hits the lakes, Lenters said at a limnology meeting in Honolulu in March. Why the cloud cover is changing isn’t known — it could be natural variability. But the increased sunlight means another source of warming for Superior and the other Great Lakes.

On top of that, evaporation, measured from spots like Stannard Rock, also plays into the complexity. High evaporation rates in a warm autumn can actually lead to more ice cover the following winter and slower ice breakup in the spring, because the water is colder after evaporation. “When lakes sweat, they cool off,” Lenters says. All these factors conspire to complicate the picture of why Superior is warming so quickly, and what people in the Great Lakes region can do about it.

A new reality
Warming water — even small changes — can have a big impact on a lake’s ecology. One of the most famous examples is Lake Tanganyika in eastern Africa. It has been warming relatively slowly, about 0.2 degrees per decade. But that is enough to make it more stratified year-round and less likely to mix. With layers stagnating, nutrients that used to rise from the bottom of the lake become trapped down low, Hampton says.

With fewer nutrients reaching the upper waters, lake productivity has plummeted. Since the late 1970s, catches of sardines and sprats have declined by as much as 50 percent, and the hundreds of thousands of people who depend on the lake for food have had to find additional sources of protein. Factors such as overfishing may also play a role, but a study published last August in Proceedings of the National Academy of Sciences found that lake temperatures in the last century were the highest of at least the previous 500 years.

Elsewhere, lake warming seems to be shifting the relative abundances of fish within certain lakes. This is apparent in a study of walleye (Sander vitreus), a popular recreational fishing target in the lakes of the U.S. Upper Midwest.

In Wisconsin, recreational freshwater fishing brings in more than $1.5 billion annually. So officials were worried when, around 2000, anglers and biologists began reporting that walleye numbers seemed to be dropping.

“We’ve seen declines in some of our most valuable fish,” says Jordan Read, a limnologist at the U.S. Geological Survey in Middleton, Wis. Hoping to figure out why, Read and colleagues analyzed water temperatures in 2,148 Wisconsin lakes from 1989 to 2014. Some of these lakes had seen populations of walleye drop as populations of largemouth bass (Micropterus salmoides) increased. Largemouth bass are also popular catches, although not as popular as walleye.
The scientists simulated how lake temperatures would probably rise through the year 2089 and how that might affect walleye survival in the state’s lakes. The team used a measure that describes whether walleye can spawn and their young can survive in a particular environment, compared with the relative abundance of largemouth bass. Up to 75 percent of the lakes studied would no longer be able to support young walleye by 2089, while the number of lakes that could support lots of bass could increase by 60 percent, the researchers estimate in the April Global Change Biology.

“Bass and walleye seem to be responding to the same temperature threshold but in opposite directions,” says Gretchen Hansen, a fisheries scientist at the Minnesota Department of Natural Resources in St. Paul who led the work.

The reason isn’t yet clear. Physiologically, walleye should still be able to survive in the higher temperatures. But something is already causing them to wane — perhaps they have fewer food sources, or they spawn less successfully. Field studies are under way to try to answer that question, Hansen says.

Variability in lake warming offers hope for the walleye. The study identified lakes where the walleye might be able to hold on. Some of these places have warmed less than others, making them more amenable to walleye, even as largemouth bass take over other lakes.

If the researchers can identify lakes that are likely to keep walleye healthy in the future, then officials can foster walleye spawning in those places and keep the state’s fishing industry healthy for decades to come. “While the outlook isn’t great, there are … lakes that are a good target of management action,” Read says. The scientists are now expanding their analysis into Minnesota and other neighboring states.

Less ice
Ecological changes put into motion during a particularly cold or hot time can send ripples during the following seasons, researchers are finding. “What happens in previous seasons sometimes matters more than the current season,” Lenters says. This is especially true for lakes at high latitudes that are covered in ice each winter but may see less ice as temperatures rise. Ice acts as an insulator, protecting the waters from big changes in the air temperature above. When the ice finally melts in spring, the water is exposed to warming from the atmosphere and from sunlight. “It’s a way the temperature can really rapidly increase in those lakes,” Hampton says.
Siberia’s Lake Baikal, for example, sees three to four weeks less ice cover than it did a century ago. That shift could affect Baikal seals (Pusa sibirica), the world’s only freshwater seals, which depend on ice cover to birth and shelter their pups each spring. There are no hard data on seal declines, in part because annual surveys of seal populations ceased in the early 1990s when the Soviet Union broke apart. “But if the ice duration is too short, then the pups may be exposed to predators before they’re ready,” Hampton says.
More broadly, and at other lakes, big questions remain about how winter and summer ecosystems connect. Biologists are assessing what wintertime ecosystems look like now, as a framework for understanding future change.

In a survey of 101 ice-covered lakes, Hampton and colleagues found more plankton under the ice than they had expected; chlorophyll levels were 43 percent of what they were in the summer. “It surprised me it was that high,” she says. “Some of these are snow-covered lakes not getting a lot of light.” The team reported its puzzling findings in January in Ecology Letters.

As winter shortens, fish may find more nutrients available to them earlier in the year than usual. Other algae-grazing creatures may become more abundant as the food web adjusts to what’s available with less ice cover.

More methane
Warming lakes themselves might exacerbate climate change. As temperatures rise, methane from microbes called archaea at the lake’s bottom bubbles up through the water column — particularly in northern lakes — and adds to the atmosphere’s greenhouse gas load.

At the Honolulu meeting, biogeochemist Tonya DelSontro of the University of Quebec in Montreal reported on methane release from boreal lakes, those lying between 50° and 70° N in realms such as Canada and Siberia. The boreal region contains up to half of the world’s carbon, with lakes an important source and sink.
DelSontro simulated how the boreal zone’s 9 million lakes would behave in the future. Just a 1 degree rise in surface temperature would boost methane emissions from boreal lakes by about 10 percent, DelSontro found. That’s not taking into account other factors such as a lengthening of the ice-free season, which would also put more methane into the air.

And at the University of Exeter in England, lake expert Gabriel Yvon-Durocher has been working to measure, on a small scale, how exactly ponds and lakes will respond to rising temperatures. His team built a series of experimental ponds, each of which is warmed by a certain temperature range over a certain period of time.

After heating the ponds by 4 to 5 degrees over seven years, the scientists found the lakes’ methane emissions more than doubled. In the same period, the ability to suck down carbon dioxide was cut almost by half. Such shifts could make climate change even worse, the team wrote in February in Nature Climate Change.

With so much variability among lakes, and so much uncertainty remaining about where they may head in the future, Lenters argues that limnologists need to keep gathering as much information as possible. Just as the Stannard Rock lighthouse is providing key data on Lake Superior, other locations need to be pressed into service to keep an eye on what lakes are doing. “There are aspects of the Pacific Ocean we know better than Lake Superior,” he says. “Lakes are woefully understudied.”

New rules for cellular entry may aid antibiotic development

Like entry to an exclusive nightclub, getting inside a gram-negative bacterial cell is no easy feat for chemical compounds. But now a secret handshake has been revealed: A new study lays out several rules to successfully cross the cells’ fortified exteriors, which could lead to the development of sorely needed antibiotics.

“It’s a breakthrough,” says microbiologist Kim Lewis of Northeastern University in Boston, who was not involved with the work. The traditional way to learn how compounds get across the bacterial barrier is to study the barrier, he says. “They decided to attack the problem from the other end: What are the properties of the molecules that may allow them to penetrate across the barrier?” The work describing these properties is published online in Nature on May 10.

Escherichia coli and other gram-negative bacteria — so described because of how they look when exposed to the violet dye used in Gram staining — have two cellular membranes. The outer membrane is impermeable to most antibiotics, says Paul Hergenrother, a chemical biologist at the University of Illinois at Urbana-Champaign. “Even if a drug might be really good at killing that gram-negative pathogen, it may not be able to get in the bacteria,” he says.

Many antibiotics that have been effective against gram-negative bacteria are becoming unreliable, as the bugs have developed resistance (SN: 10/15/16, p. 11). To encourage drug development, in February the World Health Organization released a list of pathogens that are resistant to multiple drugs and threaten human health. All of the bacteria in the critical priority group are gram-negative.
The outer membrane of gram-negative cells is dotted with proteins called porins. These channel structures allow the bacteria to take up nutrients. Those antibiotics that can get inside gram-negative bacteria typically pass through porins, says Hergenrother.

To uncover the dos and don’ts of porin passage, Hergenrother’s group synthesized 100 compounds that share characteristics with antimicrobials found in nature, such as those from plants. The researchers took each compound, incubated it with E. coli bacteria in a tube for 10 minutes and then measured how much got inside the cells.

One feature stood out among the dozen compounds that accumulated significantly inside the bacterial cells: They all contained an amine group, a chemical group that contains the element nitrogen.

Next, the team collected a larger set of compounds that have amine groups and again measured whether or not the compounds accumulated inside E. coli cells. The researchers used a computer program to predict what other attributes would be necessary to get through porins. This analysis revealed that a compound should be rigid, rather than flexible, and flat, as opposed to spherical. It’s much easier to put a ruler through a narrow opening than a basketball, notes Hergenrother.
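Those three qualitative rules (an amine group, rigidity, flatness) amount to a simple filter. The compound records and numeric cutoffs below are hypothetical stand-ins, not the study's actual predictive model; they only sketch how such rules could be used to screen a compound library:

```python
# Hypothetical compound records; the property values and cutoffs are
# invented for illustration, not taken from the study.
compounds = [
    {"name": "candidate_A", "has_amine": True,  "rotatable_bonds": 2, "globularity": 0.10},
    {"name": "candidate_B", "has_amine": False, "rotatable_bonds": 1, "globularity": 0.10},
    {"name": "candidate_C", "has_amine": True,  "rotatable_bonds": 9, "globularity": 0.60},
]

def likely_porin_permeant(c, max_rotatable=5, max_globularity=0.25):
    """Apply the three qualitative rules for slipping through a porin."""
    return (c["has_amine"]                             # contains an amine group
            and c["rotatable_bonds"] <= max_rotatable  # rigid: few rotatable bonds
            and c["globularity"] <= max_globularity)   # flat rather than spherical

print([c["name"] for c in compounds if likely_porin_permeant(c)])  # ['candidate_A']
```

Only the rigid, flat, amine-bearing candidate passes; the other two fail one rule each, mirroring how most of the synthesized compounds failed to accumulate.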

To test the new rules, the researchers turned to an antimicrobial called deoxynybomycin. This compound is effective only against gram-positive bacteria, which have just one cellular membrane. Deoxynybomycin is flat and rigid, so it already has “the right geometrical parameters,” says Hergenrother. That makes it a good compound “to try to add an amine to, in a place that doesn’t disrupt how it interacts with the biological target.”

The team synthesized a derivative of deoxynybomycin with an amine group and tested the compound against a number of gram-negative pathogens that are resistant to many antibiotics. The altered deoxynybomycin successfully killed all but one of the types of pathogens tested.

It’s possible other antibiotics that specifically target gram-positive bacteria could be converted into drugs that kill gram-negative bugs too by following the new rules, says Hergenrother. And keeping these guidelines in mind when assembling compound collections could make screening for drug candidates more successful.

The research could also “revive the failed effort to rationally design antibiotics,” says Lewis. Knowing the rules, it may be possible to build a compound that both hits its bacterial target and has the features needed to penetrate the target’s barrier.

U.S. will withdraw from climate pact, Trump announces

President Donald Trump announced on June 1 that the United States will pull out of the Paris climate accord.

In signing the 2015 Paris agreement, the United States, along with 194 other countries, pledged to curb greenhouse gas emissions to combat global warming. But Trump — who has called climate change a “hoax” despite scientific evidence to the contrary — promised during his campaign that he would withdraw from the Paris accord.

“The agreement is a massive redistribution of the United States’ wealth to other countries,” Trump said. “As of today, the United States will cease all implementation of the nonbinding Paris accord and the draconian financial and economic burdens [it] imposes on our country.” This includes making further payments to the Green Climate Fund, set up to help developing countries battle and cope with climate change. The United States has already paid $1 billion of the $3 billion it pledged to the fund.

Trump did leave the door open to reentering the Paris accord under revised terms or signing an entirely new climate agreement.

Magma stored under volcanoes is mostly solid

Most of a volcano’s magma probably isn’t the oozing, red-hot molten goo often imagined.

Analyses of zircon crystals, spewed from a volcanic eruption in New Zealand, show that the crystals spent the vast majority of their time underground in solid, not liquid, magma, researchers report in the June 16 Science. The results suggest the magma melted shortly before the volcano erupted.

This finding helps confirm geologists’ emerging picture of magma reservoirs as mostly solid masses, says geologist John Pallister of the U.S. Geological Survey in Vancouver, Wash., who was not involved in the study. And it could help scientists more accurately forecast when volcanoes are poised to erupt.
Studying magma reservoirs directly is difficult because they’re buried kilometers underground. Heat and pressure would destroy any instruments sent down there. So Kari Cooper, a geochemist at the University of California, Davis, and her colleagues probed magma by scrutinizing seven zircon crystals from New Zealand’s Taupo Volcanic Zone. These crystals formed between a few thousand and a few hundred thousand years ago, when molten magma from deeper in Earth’s crust crept up to the Taupo reservoir, cooled and crystallized into zircon and other minerals. Some of these other minerals eventually melted back into liquid magma and carried the zircon up and out during an eruption 700 years ago.
By examining the distribution of lithium in the zircon crystals, the researchers discerned how long the zircon had existed at temperatures hot enough to melt its mineral neighbors — that is, how long the magma had stayed molten. Lithium, which the crystals would have picked up from surrounding magma, spreads through zircon faster when it’s hotter, Cooper explains.

The diffusion of lithium indicated that the crystals spent, at most, about 1,200 years exposed to a temperature range of 650° to 750° Celsius. At those temperatures, solid magma melts into a state that’s a little like a snow cone — mostly crystalline, with a bit of liquid seeping through. And for just 40 years, the crystals were exposed to temperatures above 750° — hot enough for magma to completely melt. Since the magma spent the overwhelming majority of its lifetime in the reservoir as a mostly solid mass, scientists surmise it melted only briefly before eruption.
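The logic of this lithium clock can be sketched with an Arrhenius diffusion model: hotter crystals let lithium spread faster, so a given diffusion profile implies less time spent at high temperature. The prefactor and activation energy below are placeholders, not the study's calibrated values.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def time_at_temperature(profile_length_m, temp_c, d0=1.0e-6, ea=2.75e5):
    """Time for lithium to diffuse a characteristic distance
    L = sqrt(D * t), with Arrhenius diffusivity D = d0 * exp(-Ea/(R*T)).
    d0 (m^2/s) and ea (J/mol) are illustrative placeholders."""
    temp_k = temp_c + 273.15
    d = d0 * math.exp(-ea / (R * temp_k))
    return profile_length_m ** 2 / d

# The same lithium profile forms far faster at 750 C than at 650 C:
hot = time_at_temperature(1e-5, 750)
cool = time_at_temperature(1e-5, 650)
```

Because the hot-side times are so much shorter, even brief excursions above 750° C leave a measurable imprint, while millennia at cooler temperatures produce a comparable profile.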

“The other cool thing that we found is that most of the crystals are more than 50,000 years old,” Cooper says. In the last 50,000 years, this volcanic system underwent many eruptions before belching up the studied zircon crystals 700 years ago. The relatively short time these crystals experienced high heat suggests that they weren’t affected much by the magma in those previous eruptions. “Everything has to be much more compartmentalized down there than we originally thought,” Cooper says.

The study’s findings raise questions about how mostly solid magma melts and mobilizes before an eruption, says George Bergantz, an earth scientist at the University of Washington in Seattle who was not involved in the research. Cooper suspects that molten material from even deeper underground seeps up and melts solid magma. But, she says, “it’s still very much an open question.”

Drowned wildebeests can feed a river ecosystem for years

More than a million wildebeests migrate each year from Tanzania to Kenya and back again, following the rains and abundant grass that springs up afterward. Their path takes them across the Mara River, and some of the crossings are so dangerous that hundreds or thousands of wildebeests drown as they try to traverse the waterway.

Those animals provide a brief, free buffet for crocodiles and vultures. And, a new study finds, they’re feeding an aquatic ecosystem for years.

Ecologist Amanda Subalusky of the Cary Institute of Ecosystem Studies in Millbrook, N.Y., had been studying water quality in the Mara River when she and her colleagues noticed something odd. Commonly used indicators of water quality, such as dissolved oxygen and turbidity, were sometimes poorest where the river flowed through a protected area. They quickly realized that it was because of the animals that flourished there. Hippos, which eat grass at night and defecate in the water during the day, were one contributor. And dead wildebeests were another.

“Wildebeest are especially good at following the rains, and they’re willing to cross barriers to follow it,” says Subalusky. The animals tend to cross at the same spots year after year, and some are more dangerous than others. “Once they’ve started using a site, they continue, even if it’s bad,” she notes. And on average, more than 6,000 wildebeests drown each year. (That may sound like a lot, but it’s only about 0.5 percent of the herd.) Their carcasses add the equivalent of the mass of 10 blue whales into the river annually.

Subalusky and her colleagues set out to see how all that meat and bone affected the river ecosystem. When they heard about drownings, they would go to the river to count carcasses. They retrieved dead wildebeests from the water to test what happened to the various parts over time. And they measured nutrients up and downstream from river crossings to see what the wildebeest carcasses added to the water.

“There are some interesting challenges working in this system,” Subalusky says. For instance, in one experiment, she and her colleagues put pieces of wildebeest carcass into mesh bags that went into the river. The plan was that they would retrieve the bags over time and see how quickly or slowly the pieces decomposed. “We spent a couple of days putting the whole thing together and we came back the next day to collect our first set of samples,” she recalls. “At least half the bags with wildebeest meat were just gone. Crocodiles and Nile monitors had plucked them off the chain.”
The researchers determined that the wildebeests’ soft tissue decomposes in about two to 10 weeks. This provides a pulse of nutrients — carbon, nitrogen and phosphorus — to the aquatic food web as well as the nearby terrestrial system. Subalusky and her colleagues are still working out the succession of scavengers that feast on the wildebeests, but vultures, marabou storks, egg-laying bugs and things that eat bugs are all on the list.
Once the soft tissue is gone, the bones remain, sometimes piling up in bends in the river or other spots downstream. “They take years to decompose,” Subalusky says, slowly leaching out most of the phosphorus that had been in the animal. The bones can also become covered in a biofilm of algae, fungi and bacteria that provides food for fish.

What initially looks like a short-lived event actually provides resources for seven years or more, Subalusky and her colleagues report June 19 in the Proceedings of the National Academy of Sciences.

The wildebeest migration is the largest terrestrial migration on the planet, and others of its kind have largely disappeared as humans have killed off animals or cut off their migration routes.

Only a few hundred years ago, for instance, millions of bison roamed the western United States. There are accounts in which thousands of bison drowned in rivers, similar to what happens with wildebeests. Those rivers may have fundamentally changed after bison were nearly wiped out, Subalusky and her colleagues contend.

We’ll never know if that was the case, but there are still some places where scientists may be able to study the effects of mass drownings on rivers. A large herd of caribou reportedly drowned in Canada in the 1980s, and there are still some huge migrations of animals, such as reindeer. Like the wildebeests, these animals might be feeding an underwater food web that no one has ever noticed.

How humans (maybe) domesticated themselves

Long before humans domesticated other animals, we may have domesticated ourselves.

Over many generations, some scientists propose, humans selected among themselves for tameness. This process resulted in genetic changes, several recent studies suggest, that have shaped people in ways similar to other domesticated species.

Tameness, says evolutionary biologist and primatologist Richard Wrangham of Harvard University, may boil down to a reduction in reactive aggression — the fly-off-the-handle temperament that makes an animal bare its teeth at the slightest challenge. In this sense, he says, humans are fairly tame. We might show great capacity for premeditated aggression, but we don’t attack every stranger we encounter.
Sometime in the last 200,000 years, humans began weeding out people with an overdose of reactive aggression, Wrangham suggests. Increasingly complex social skills would have allowed early humans to gang up against bullies, he proposes, pointing out that hunter-gatherers today have been known to do the same. Those who got along, got ahead.

Once animals have been selected for tameness, other traits tend to follow, including reshaping of the head and face. Humans even look domesticated: Compared with Neandertals’, our faces are shorter with smaller brow ridges, and males’ faces are more similar to those of females.
Selecting for less-aggressive humans could have also helped us flourish as a social species, says Antonio Benítez-Burraco, who studies language evolution at the University of Huelva in Spain. The earliest Homo sapiens were becoming capable of complex thought, he proposes, but not yet language. “We were modern in the sense of having modern brains, but we were not modern in behavior.”
Once humans began to self-domesticate, though, changes to neural crest cells could have nudged us toward a highly communicative species. Something similar happens in songbirds: Domesticated birds have more complex songs than their wild counterparts. What’s more, self-domestication may be more common than once thought. Bonobos, Wrangham notes, live in peaceful groups compared with closely related, but more violent, chimpanzees. If humans took steps to domesticate themselves, perhaps they weren’t the only ones.

GM moth trial gets a green light from USDA

Cabbage-chomping moths genetically modified to be real lady-killers may soon take flight in upstate New York. On July 6, the U.S. Department of Agriculture OK’d a small open-air trial of GM diamondback moths (Plutella xylostella), which the agency says do not pose a threat to human or environmental health.

These male moths carry a gene that kills female offspring before they mature. Having fewer females available for mating cuts overall moth numbers, so releasing modified male moths in crop fields would theoretically nip an outbreak and reduce insecticide use.
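The suppression arithmetic can be sketched in a one-generation toy model. Everything here is illustrative: the brood size is invented, and the model deliberately ignores that surviving sons inherit the lethality gene.

```python
def next_gen_females(females, wild_males, gm_males, eggs_per_female=10.0):
    """One-generation sketch of female-lethal suppression.
    Matings are assumed proportional to male numbers; daughters
    sired by GM males die before maturing, so only wild-fathered
    broods contribute females (half of each brood) to the next
    generation. Numbers are illustrative only."""
    wild_fraction = wild_males / (wild_males + gm_males)
    return females * eggs_per_female * 0.5 * wild_fraction

# Flooding a field with GM males shrinks the next female generation:
baseline = next_gen_females(1000, 1000, 0)        # no releases
suppressed = next_gen_females(1000, 1000, 9000)   # 9:1 GM-to-wild males
```

With a 9:1 release ratio, only a tenth of the matings produce surviving daughters, so the female population drops tenfold per generation in this idealized picture.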
Originally from Europe, diamondback moths have quite the rap sheet: They’re invasive, insecticide-resistant crop pests. The caterpillar form munches through cauliflower, cabbage, broccoli and other Brassica plant species across the Americas, Southeast Asia, Australia and New Zealand.

After completing successful lab and cage trials, Cornell University entomologist Tony Shelton and colleagues now plan to loose the moths on 10 acres of Brassica fields at the New York State Agricultural Experiment Station in Geneva. The team has clearance to release 10,000 moths at a time, and up to 30,000 weekly.

This GM strain comes from Oxitec, the same firm behind controversial GM mosquitoes proposed for release in Florida (SN Online: 8/5/16). Several agricultural and environmental groups oppose the moth trial, too. While these will be the first GM moths released with a so-called female lethality gene, this won’t be the first genetically modified moth released in the United States. In 2009, researchers in Arizona tested transgenic pink bollworm moths, which threaten cotton fields.

The trial’s exact timeline remains up in the air. The scientists need approval from the New York State Department of Environmental Conservation before going forward.

How earthquake scientists eavesdrop on North Korea’s nuclear blasts

On September 9 of last year, in the middle of the morning, seismometers began lighting up around East Asia. From South Korea to Russia to Japan, geophysical instruments recorded squiggles as seismic waves passed through and shook the ground. It looked as if an earthquake with a magnitude of 5.2 had just happened. But the ground shaking had originated at North Korea’s nuclear weapons test site.

It was the fifth confirmed nuclear test in North Korea, and it opened the latest chapter in a long-running geologic detective story. Like a police examiner scrutinizing skid marks to figure out who was at fault in a car crash, researchers analyze seismic waves to determine if they come from a natural earthquake or an artificial explosion. If the latter, then scientists can also tease out details such as whether the blast was nuclear and how big it was. Test after test, seismologists are improving their understanding of North Korea’s nuclear weapons program.
The work feeds into international efforts to monitor the Comprehensive Nuclear-Test-Ban Treaty, which since 1996 has banned nuclear weapons testing. More than 180 countries have signed the treaty. But 44 countries that hold nuclear technology must both sign and ratify the treaty for it to have the force of law. Eight, including the United States and North Korea, have not.

To track potential violations, the treaty calls for a four-pronged international monitoring system, which is currently about 90 percent complete. Hydroacoustic stations can detect sound waves from underwater explosions. Infrasound stations listen for low-frequency sound waves rumbling through the atmosphere. Radionuclide stations sniff the air for the radioactive by-products of an atmospheric test. And seismic stations pick up the ground shaking, which is usually the fastest and most reliable method for confirming an underground explosion.

Seismic waves offer extra information about an explosion, new studies show. One research group is exploring how local topography, like the rugged mountain where the North Korean government conducts its tests, puts its imprint on the seismic signals. Knowing that, scientists can better pinpoint where the explosions are happening within the mountain — thus improving understanding of how deep and powerful the blasts are. A deep explosion is more likely to mask the power of the bomb.
Separately, physicists have conducted an unprecedented set of six explosions at the U.S. nuclear test site in Nevada. The aim was to mimic the physics of a nuclear explosion by detonating chemical explosives and watching how the seismic waves radiate outward. It’s like a miniature, nonnuclear version of a nuclear weapons test. Already, the scientists have made some key discoveries, such as understanding how a deeply buried blast shows up in the seismic detectors.
The more researchers can learn about the seismic calling card of each blast, the more they can understand international developments. That’s particularly true for North Korea, where leaders have been ramping up the pace of military testing since the first nuclear detonation in 2006. On July 4, the country launched its first confirmed ballistic missile — with no nuclear payload — that could reach as far as Alaska.

“There’s this building of knowledge that helps you understand the capabilities of a country like North Korea,” says Delaine Reiter, a geophysicist with Weston Geophysical Corp. in Lexington, Mass. “They’re not shy about broadcasting their testing, but they claim things Western scientists aren’t sure about. Was it as big as they claimed? We’re really interested in understanding that.”

Natural or not
Seismometers detect ground shaking from all sorts of events. In a typical year, anywhere from 1,200 to 2,200 earthquakes of magnitude 5 and greater set off the machines worldwide. On top of that is the unnatural shaking: from quarry blasts, mine collapses and other causes. The art of using seismic waves to tell one type of event from the others is known as forensic seismology.

Forensic seismologists work to distinguish a natural earthquake from what could be a clandestine nuclear test. In March 2003, for instance, seismometers detected a disturbance coming from near Lop Nor, a dried-up lake in western China that the Chinese government, which signed but hasn’t ratified the test ban treaty, has used for nuclear tests. Seismologists needed to figure out immediately what had happened.

One test for telling the difference between an earthquake and an explosion is how deep it is. Anything deeper than about 10 kilometers is almost certain to be natural. In the case of Lop Nor, the source of the waves seemed to be located about six kilometers down — difficult to tunnel to, but not impossible. Researchers also used a second test, which compares the amplitudes of two different kinds of seismic waves.

Earthquakes and explosions generate several types of seismic waves, starting with P, or primary, waves. These waves are the first to arrive at a distant station. Next come S, or secondary, waves, which travel through the ground in a shearing motion, taking longer to arrive. Finally come waves that ripple across the surface, including those called Rayleigh waves.
In an explosion as compared with an earthquake, the amplitudes of Rayleigh waves are smaller than those of the P waves. By looking at those two types of waves, scientists determined the Lop Nor incident was a natural earthquake, not a secretive explosion. (Seismology cannot reveal the entire picture. Had the Lop Nor event actually been an explosion, researchers would have needed data from the radionuclide monitoring network to confirm the blast came from nuclear and not chemical explosives.)
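The two screens described above — source depth, and the relative strength of Rayleigh versus P waves — can be sketched as a simple classifier. The 10-kilometer depth rule comes from the text; the amplitude-ratio cutoff is illustrative, not an operational threshold.

```python
def classify_event(depth_km, p_amplitude, rayleigh_amplitude, ratio_cutoff=0.5):
    """Toy version of the forensic-seismology screens in the text.
    1) Depth: anything much deeper than ~10 km is almost certainly
       a natural earthquake.
    2) Wave mix: explosions excite relatively weak Rayleigh (surface)
       waves compared with their P waves, so a low Rayleigh/P
       amplitude ratio is suspicious. ratio_cutoff is illustrative.
    Returns 'earthquake' or 'possible explosion'."""
    if depth_km > 10:
        return "earthquake"
    if rayleigh_amplitude / p_amplitude < ratio_cutoff:
        return "possible explosion"
    return "earthquake"

# A Lop Nor-like event: ~6 km deep but with healthy Rayleigh waves
verdict = classify_event(6, 1.0, 0.9)
```

A shallow event with strong surface waves, like the 2003 Lop Nor disturbance, still screens as an earthquake; only the combination of shallow depth and weak Rayleigh waves flags a possible explosion.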

For North Korea, the question is not so much whether the government is setting off nuclear tests, but how powerful and destructive those blasts might be. In 2003, the country withdrew from the Treaty on the Nonproliferation of Nuclear Weapons, an international agreement distinct from the testing ban that aims to prevent the spread of nuclear weapons and related technology. Three years later, North Korea announced it had conducted an underground nuclear test in Mount Mantap at a site called Punggye-ri, in the northeastern part of the country. It was the first nuclear weapons test since India and Pakistan each set one off in 1998.

By analyzing seismic wave data from monitoring stations around the region, seismologists concluded the North Korean blast had come from shallow depths, no more than a few kilometers within the mountain. That supported the North Korean government’s claim of an intentional test. Two weeks later, a radionuclide monitoring station in Yellowknife, Canada, detected increases in radioactive xenon, which presumably had leaked out of the underground test site and drifted eastward. The blast was nuclear.

But the 2006 test raised fresh questions for seismologists. The ratio of amplitudes of the Rayleigh and P waves was not as distinctive as it usually is for an explosion. And other aspects of the seismic signature were also not as clear-cut as scientists had expected.

Researchers got some answers as North Korea’s testing continued. In 2009, 2013 and twice in 2016, the government set off more underground nuclear explosions at Punggye-ri. Each time, researchers outside the country compared the seismic data with the record of past nuclear blasts. Automated computer programs “compare the wiggles you see on the screen ripple for ripple,” says Steven Gibbons, a seismologist with the NORSAR monitoring organization in Kjeller, Norway. When the patterns match, scientists know it is another test. “A seismic signal generated by an explosion is like a fingerprint for that particular region,” he says.

With each test, researchers learned more about North Korea’s capabilities. By analyzing the magnitude of the ground shaking, experts could roughly calculate the power of each test. The 2006 explosion was relatively small, releasing energy equivalent to about 1,000 tons of TNT — a fraction of the 15-kiloton bomb dropped by the United States on Hiroshima, Japan, in 1945. But the yield of North Korea’s nuclear tests crept up each time, and the most recent test, in September 2016, may have exceeded the size of the Hiroshima bomb.
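The magnitude-to-yield conversion works through an empirical scaling law of the form mb = a + b·log10(yield). The coefficients below are one published hard-rock calibration and are an assumption here — real estimates for North Korea swing widely with burial depth and local geology, which is exactly why depth matters so much.

```python
def yield_kilotons(mb, a=4.45, b=0.75):
    """Invert the empirical scaling mb = a + b * log10(Y) to estimate
    yield Y (kilotons of TNT equivalent) from body-wave magnitude mb.
    Coefficients are an assumed hard-rock calibration; estimates
    depend strongly on depth and geology."""
    return 10 ** ((mb - a) / b)

# With these assumed coefficients, a magnitude 5.2 event corresponds
# to roughly 10 kilotons -- the same order as the Hiroshima bomb.
estimate = yield_kilotons(5.2)
```

Because the relation is logarithmic, a few tenths of a magnitude unit translate into a severalfold difference in inferred yield, which is why successive North Korean tests creeping up in magnitude imply sharply growing power.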
Digging deep
For an event of a particular seismic magnitude, the deeper the explosion, the more energetic the blast. A shallow, less energetic test can look a lot like a deeply buried, powerful blast. Scientists need to figure out precisely where each explosion occurred.

Mount Mantap is a rugged granite mountain with geology that complicates the physics of how seismic waves spread. Western experts do not know exactly how the nuclear bombs are placed inside the mountain before being detonated. But satellite imagery shows activity that looks like tunnels being dug into the mountainside. The tunnels could be dug two ways: straight into the granite or spiraled around in a fishhook pattern to collapse and seal the site after a test, Frank Pabian, a nonproliferation expert at Los Alamos National Laboratory in New Mexico, said in April in Denver at a meeting of the Seismological Society of America.

Researchers have been trying to figure out the relative locations of each of the five tests. By comparing the amplitudes of the P, S and Rayleigh waves, and calculating how long each would have taken to travel through the ground, researchers can plot the likely sites of the five blasts. That allows them to better tie the explosions to the infrastructure on the surface, like the tunnels spotted in satellite imagery.

One big puzzle arose after the 2009 test. Analyzing the times that seismic waves arrived at various measuring stations, one group calculated that the test occurred 2.2 kilometers west of the first blast. Another scientist found it only 1.8 kilometers away. The difference may not sound like a lot, Gibbons says, but it “is huge if you’re trying to place these relative locations within the terrain.” Move a couple of hundred meters to the east or west, and the explosion could have happened beneath a valley as opposed to a ridge — radically changing the depth estimates, along with estimates of the blast’s power.

Gibbons and colleagues think they may be able to reconcile these different location estimates. The answer lies in which station the seismic data come from. Studies that rely on data from stations within about 1,500 kilometers of Punggye-ri — as in eastern China — tend to estimate bigger distances between the locations of the five tests when compared with studies that use data from more distant seismic stations in Europe and elsewhere. Seismic waves must be leaving the test site in a more complicated way than scientists had thought, or else all the measurements would agree.
When Gibbons’ team corrected for the varying distances of the seismic data, the scientists came up with a distance of 1.9 kilometers between the 2006 and 2009 blasts. The team pinpointed the other explosions as well. The September 2016 test turned out to be almost directly beneath the 2,205-meter summit of Mount Mantap, the group reported in January in Geophysical Journal International. That means the blast was, indeed, deeply buried and hence probably at least as powerful as the Hiroshima bomb for it to register as a magnitude 5.2 earthquake.

Other seismologists have been squeezing information out of the seismic data in a different way — not in how far the signals are from the test blast, but what they traveled through before being detected. Reiter and Seung-Hoon Yoo, also of Weston Geophysical, recently analyzed data from two seismic stations, one 370 kilometers to the north in China and the other 306 kilometers to the south in South Korea.

The scientists scrutinized the moments when the seismic waves arrived at the stations, in the first second of the initial P waves, and found slight differences between the wiggles recorded in China and South Korea, Reiter reported at the Denver conference. Those in the north showed a more energetic pulse rising from the wiggles in the first second; the southern seismic records did not. Reiter and Yoo think this pattern represents an imprint of the topography at Mount Mantap.

“One side of the mountain is much steeper,” Reiter explains. “The station in China was sampling the signal coming through the steep side of the mountain, while the southern station was seeing the more shallowly dipping face.” This difference may also help explain why data from seismic stations spanning the breadth of Japan show a slight difference from north to south. Those differences may reflect the changing topography as the seismic waves exited Mount Mantap during the test.

Learning from simulations
But there is only so much scientists can do to understand explosions they can’t get near. That’s where the test blasts in Nevada come in.

The tests were part of phase one of the Source Physics Experiment, a $40-million project run by the U.S. Department of Energy’s National Nuclear Security Administration. The goal was to set off a series of chemical explosions of different sizes and at different depths in the same borehole and then record the seismic signals on a battery of instruments. The detonations took place at the nuclear test site in southern Nevada, where between 1951 and 1992 the U.S. government set off 828 underground nuclear tests and 100 atmospheric ones, whose mushroom clouds were seen from Las Vegas, 100 kilometers away.

For the Source Physics Experiment, six chemical explosions were set off between 2011 and 2016, ranging up to 5,000 kilograms of TNT equivalent and down to 87 meters deep. The biggest required high-energy–density explosives packed into a cylinder nearly a meter across and 6.7 meters long, says Beth Dzenitis, an engineer at Lawrence Livermore National Laboratory in California who oversaw part of the field campaign. Yet for all that firepower, the detonation barely registered on anything other than the instruments peppering the ground. “I wish I could tell you all these cool fireworks go off, but you don’t even know it’s happening,” she says.

The explosives were set inside granite rock, a material very similar to the granite at Mount Mantap. So the seismic waves racing outward behaved very much as they might at the North Korean nuclear test site, says William Walter, head of geophysical monitoring at Livermore. The underlying physics, describing how seismic energy travels through the ground, is virtually the same for both chemical and nuclear blasts.
The results revealed flaws in the models that researchers have been using for decades to describe how seismic waves travel outward from explosions. These models were developed to describe how the P waves compress rock as they propagate from large nuclear blasts like those set off starting in the 1950s by the United States and the Soviet Union. “That worked very well in the days when the tests were large,” Walter says. But for much smaller blasts, like those North Korea has been detonating, “the models didn’t work that well at all.”
Walter and Livermore colleague Sean Ford have started to develop new models that better capture the physics involved in small explosions. Those models should be able to describe the depth and energy release of North Korea’s tests more accurately, Walter reported at the Denver meeting.

A second phase of the Source Physics Experiment is set to begin next year at the test site, in a much more rubbly type of rock called alluvium. Scientists will use that series of tests to see how seismic waves are affected when they travel through fragmented rock as opposed to more coherent granite. That information could be useful if North Korea begins testing in another location, or if another country detonates an atomic bomb in fragmented rock.

For now, the world’s seismologists continue to watch and wait, to see what the North Korean government might do next. Some experts think the next nuclear test will come at a different location within Mount Mantap, to the south of the most recent tests. If so, that will provide a fresh challenge to the researchers waiting to unravel the story the seismic waves will tell.

“It’s a little creepy what we do,” Reiter admits. “We wait for these explosions to happen, and then we race each other to find the location, see how big it was, that kind of thing. But it has really given us a good look as to how [North Korea’s] nuclear program is progressing.” Useful information as the world’s nations decide what to do about North Korea’s rogue testing.