Patience is a virtue in the hunt for dark matter. Experiment after experiment has come up empty in the search — and the newest crop is no exception.
Observations hint at the presence of an unknown kind of matter sprinkled throughout the cosmos. Several experiments are focused on the search for one likely dark matter candidate: weakly interacting massive particles, or WIMPs (SN: 11/12/16, p. 14). But those particles have yet to be spotted.
Recent results, posted at arXiv.org, continue the trend. The PandaX-II experiment, based in China, found no hint of the particles, scientists reported August 23. The XENON1T experiment in Italy also came up WIMPless, according to a May 18 paper. Scientists with the DEAP-3600 experiment in Sudbury, Canada, reported their first results on July 25. Signs of dark matter? Nada. And the SuperCDMS experiment in the Soudan mine in Minnesota likewise found no hints of WIMPs, scientists reported August 29.
Another experiment, PICO-60, also located in Sudbury, reported its contribution to the smorgasbord of negative results June 23 in Physical Review Letters.
Scientists haven’t given up hope. Researchers are building ever-larger detectors, retooling their experiments and continuing to expand the search beyond WIMPs.
Every day, it seems like there’s a new natural disaster in the headlines. Hurricane Harvey inundates Texas. Hurricane Irma plows through the Caribbean and the U.S. South, and Jose is hot on its heels. A deadly 8.1-magnitude earthquake rocks Mexico. Wildfires blanket the western United States in choking smoke.
While gripping tales of loss and heroism rightly fill the news, another story quietly unfolds. Hurricanes, droughts, oil spills, wildfires and other disasters are natural labs. Data quickly gathered in the midst of such chaos, as well as for years afterward, can lead to discoveries that ultimately make rescue, recovery and resilience to future crises possible.
So when disaster strikes, science surges, says human ecologist Gary Machlis of Clemson University in South Carolina. He has studied and written about the science done during crises and was part of the U.S. Department of the Interior’s Strategic Sciences Group, which helps government officials respond to disasters.
The science done during Hurricane Harvey is an example. Not long after the heavy rains stopped, crews of researchers from the U.S. Geological Survey fanned out across Texas, dropping sensors into streams. The instruments measure how swiftly the water is flowing and determine the severity of the flooding in different regions affected by the hurricane. Knowing where flooding is worst can help the Federal Emergency Management Agency and other government groups direct funds to the areas with the most extreme damage. In the days leading up to Irma’s U.S. landfall, scientists from the same agency also went to the Florida, Georgia and South Carolina coasts to fasten storm-tide sensors to pier pylons and other structures. The sensors measure the depth and duration of the surge in seawater generated by the storm’s winds and pressure changes. The data will help determine damage from the surge and improve flood models, providing a better picture of where future storm waters will go and who needs to be evacuated ahead of hurricanes.
Even as Irma struck Florida, civil engineer Forrest Masters of the University of Florida in Gainesville, his students and collaborators traveled to the southern part of the state to study the intensity and variation of the hurricane’s winds. As winds blew and rain pelted, the team raised minitowers decked with instruments designed to measure ground-level gusts and turbulence. With this data, the researchers will compare winds in coastal areas, near buildings and around other structures, information that can help government agencies assess storm-related damage. The team will also take the data back to the Natural Hazards Engineering Research Infrastructure labs at the University of Florida to study building materials and identify those most resistant to extreme winds. “Scientists want to use their expertise to help society in whatever way they can during a disaster,” says biologist Teresa Stoepler, who was a member of the Strategic Sciences Group when she worked at USGS.
As a former science & technology policy fellow with the American Association for the Advancement of Science, Stoepler studied the science that resulted from the 2010 Deepwater Horizon oil spill. The devastating explosion of the oil rig led to a spill of 210 million gallons of petroleum into the Gulf of Mexico. It also opened the door for scientific research. Biologists, chemists, psychologists and a range of other scientists wanted to study the environmental, economic and mental health consequences of the disaster; local scientists wanted to study the effects of the spill on their communities; and leaders in local and federal government needed guidance on how to respond. There was a need to coordinate all of that effort.
That’s where the Strategic Sciences Group came in. The group, officially organized in 2012, brought together researchers from federal, academic and nongovernmental organizations. The goal was to use data collected from the spill to map out possible long-term environmental and economic consequences of the disaster, determine where research still needed to be done and determine how to allocate money for response and recovery efforts.
Not long after its formation, the group had another disaster to respond to: Superstorm Sandy devastated the U.S. East Coast, even pushing floodwaters into the heart of New York City. Scientific collaborations allowed researchers and policy makers to get a better sense of whether wetlands, sea walls or other types of infrastructure would be best to invest in to prevent future devastation. The work also gave clues as to what types of measurements, such as the height of floodwaters, should be made in the future — say, during storms like Harvey and Irma — to speed recovery efforts afterward.
Moving forward, we’re likely to see this kind of collaboration coming into play time and again. No doubt, more natural disasters loom. And other groups are getting into crisis science. For instance, Stanford University, with its Science Action Network, aims to drive interdisciplinary research during disasters and encourage communication across the many groups responding to those disasters. And the Disaster Research Response program at the National Institutes of Health provides a framework for coordinating research on the medical and public health aspects of disasters and public health emergencies.
Surges in science will stretch from plunging into the chaos of a crisis for in-the-moment data to monitoring the aftermath for years. Retrospective studies of the data collected a year, three years or even five years after a disaster could reveal where there are gaps in the science and how those gaps can be filled during future events.
The more data collected, the more discoveries made and lessons learned, the more likely we’ll be ready to face the next disaster.
For the first time, researchers have disabled a gene in human embryos to learn about its function.
Using molecular scissors called CRISPR/Cas9, researchers made crippling cuts in the OCT4 gene, Kathy Niakan and colleagues report September 20 in Nature. The edits revealed a surprising role for the gene in the development of the placenta.
Researchers commonly delete and disable genes in mice, fruit flies, yeast and other laboratory critters to investigate the genes’ normal roles, but had never before done so in human embryos. Last year, government regulators in the United Kingdom gave permission for Niakan, a developmental biologist at the Francis Crick Institute in London, and colleagues to perform gene editing on human embryos left over from in vitro fertilization treatments (SN Online: 2/1/16). The researchers spent nearly a year optimizing techniques in mouse embryos and human stem cells before conducting the human embryo experiments, Niakan says. This groundbreaking work allows scientists to directly study genes involved in human development, says developmental biologist Dieter Egli of Columbia University. “This is unheard of. It’s not something that has been possible,” he says. “What we know about human development is largely inferred from studies of mice, frogs and other model organisms.”
Other researchers have used CRISPR/Cas9 to repair mutated genes in human embryos (SN: 4/15/17, p. 16; SN: 9/2/17, p. 6). The eventual aim of that research is to prevent genetic diseases, but it has led to concerns that the technology could be abused to produce “designer babies” who are better looking, smarter and more athletic than they otherwise would be.
“There’s nothing irresponsible about the research in this case,” says stem cell researcher Paul Knoepfler of the University of California, Davis, School of Medicine. The researchers focused on basic questions about how one gene affects human embryo development. Such studies may one day lead to better fertility treatments, but the more immediate goal is to gain better insights into human biology.
Niakan’s group focused on a gene called OCT4 (also known as POU5F1), a master regulator of gene activity, which is important in mouse embryo development. This gene is also known to help human embryonic stem cells stay flexible enough to become any type of body cell, a property known as pluripotency. Scientists use OCT4 protein to reprogram adult cells into embryonic-like cells, an indication that it is involved in early development (SN: 11/24/07, p. 323). But researchers didn’t know precisely how the OCT4 gene operates during human development. Niakan already had clues that it works at slightly different times in human embryos than it does in mice (SN: 10/3/15, p. 13).
In the experiment, human embryos lacking OCT4 had difficulty reaching the blastocyst stage: Only 19 percent of edited embryos formed blastocysts, while 47 percent of unedited embryos did. Blastocysts are balls of about 200 cells that form about five or six days after fertilization. The ball’s outer layer of cells gives rise to the placenta. Inside the blastocyst, one type of embryonic stem cells will become the yolk sac. Another kind, about 20 cells known as epiblast progenitor cells, will give rise to all the cells in the body. Niakan and colleagues predicted from earlier work with mice and human embryonic stem cells that the protein OCT4 would be necessary for the epiblast cells to develop correctly. As predicted, “knocking out” the OCT4 gene disrupted epiblasts’ development. What the researchers didn’t expect is that OCT4 also affects the development of the placenta precursor cells on the outside of the blastocyst.
“That’s not predicted anywhere in the literature,” Niakan says. “We’ll be spending quite a lot of time on this in the future to uncover exactly what this role might be.”
At the beginning of 2017, parents and pediatricians got new peanut guidelines that, for most kids, are very pro-peanut. My colleague and fellow mom Meghan Rosen wrote about the recommendations, issued from the National Institute of Allergy and Infectious Diseases.
This “let them eat nuts” advice is based in part on a large and unusually clear dataset from a study that looked at babies at high risk of developing an allergy to peanuts. In the study, some of the children were regularly fed peanut-containing foods until their fifth birthdays. The others avoided any food with peanuts. By the end of the study, the kids who regularly ate peanut-containing food were way less likely to have a peanut allergy than the kids who had avoided the nut, the researchers found. In a nutshell, parents of low-risk babies (infants without an egg allergy or severe eczema) should feel free to put peanut-containing food in the rotation as soon as their babies are ready for solid foods, around 4 to 6 months of age.
Whole peanuts and peanut butter are both choking hazards and shouldn’t be fed to babies. Instead, peanut butter (or peanut flour or peanut butter powder) can be mixed into breast milk, formula, fruit, yogurt or purees.
Babies with severe eczema or who are allergic to eggs ought to be seen by an allergist who can help guide the introduction of peanuts to the diet. Those appointments may reveal that some babies are in fact already allergic to peanuts. For those kids, peanuts may need to be avoided altogether.
When the recommendations were released, health officials were optimistic that the advice would lead to a reduction in peanut allergy in kids, given the study’s finding that early peanut introduction curbed the allergy. But that introduction may not be happening as early or as often as officials had hoped. Bryce Hoffman, an allergist in New York City, suspected that the guidelines weren’t being used. Anecdotally, he and his colleagues hadn’t seen many infants coming into their allergy offices with referrals for peanut allergy testing.
To get an idea of whether the guidelines were being applied (or not), Hoffman and his colleagues surveyed pediatricians about their advice to parents on peanuts. The results, which he presented October 30 at the annual meeting of the American College of Allergy, Asthma and Immunology, were discouraging.
Of the 79 pediatricians who responded, 30, or 38 percent, were not using the new guidelines in their practice. What’s more, 61, or 77 percent, of the pediatricians recommended that high-risk patients eat peanuts later than 4 to 6 months of age, instead of sending those children to an allergist first for testing. Close to half (44 percent) of the pediatricians said they didn’t routinely test high-risk patients for allergy or send them to an allergist for testing.
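As a quick sanity check, the reported percentages follow directly from the survey counts. A minimal sketch (the 79-respondent total and the counts of 30 and 61 come from the survey results above):

```python
# Double-check the survey percentages reported above.
total = 79        # pediatricians who responded to the survey

not_using = 30    # reported not using the new guidelines
late_intro = 61   # recommended later-than-guideline peanut introduction

pct_not_using = round(100 * not_using / total)    # rounds to 38
pct_late_intro = round(100 * late_intro / total)  # rounds to 77

print(f"{pct_not_using}% not using guidelines, "
      f"{pct_late_intro}% recommending late introduction")
```

Rounding to the nearest whole percent reproduces the 38 percent and 77 percent figures in the survey report.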
This reluctance to test poses a problem for infants who “may have a dangerous anaphylactic reaction if given peanut,” Hoffman says. Testing can help clear up whether these infants should strictly avoid peanuts or be exposed to the nut first in an allergist’s office.
Peanuts got a bad rap for a long time, so it’s not surprising that parents and pediatricians might not jump at the chance to get peanut-containing food on the menu. The survey suggested that pediatricians aren’t familiar enough with the guidelines and, what’s more, are often too rushed to delve into the details during checkups. But for the new guidelines to do any good, they need to be used.
NASA’s New Horizons mission needs a catchier nickname for its next destination. The bar isn’t exactly high.
On New Year’s Day 2019, the spacecraft will fly by the tiny Kuiper Belt world that bears the official designation of (486958) 2014 MU69. NASA announced Monday that it is asking the public for an easier-to-remember nickname. The SETI Institute is hosting the contest.
As with similar crowdsourced naming campaigns, the name options vary widely. Current candidates range from Mjölnir (the hammer of the Norse god Thor) to Z’ha’dum (a planet from Babylon 5) to Peanut, Almond and Cashew — multiple name options may be necessary if the object is a binary pair. Whatever the object is named, it will be the most distant solar system body ever visited. NASA will submit a formal name (or names) to the International Astronomical Union after the flyby, based on whether MU69 turns out to be a single body, binary pair or other system.
While anyone is welcome to submit a name or vote on existing options, the SETI Institute must approve any options before they appear on the ballot. So the odds don’t look good for Planet McPlanetface.
The naming campaign will close at 3 p.m. EST on December 1. The winner will be announced in early January.