Clownfish and anemones depend on one another. The stinging tentacles of the anemones protect clownfish from predators. In return, the fish keep the anemone clean and provide nutrients, in the form of poop. Usually, several individual clownfish occupy a single anemone — a large and dominant female, an adult male and several subordinates — all from the same species. But with 28 species of clownfish and 10 species of host anemone, there can be a lot of competition over which fish get to occupy which anemone.
In the highly diverse waters of the Coral Triangle of Southeast Asia, however, clownfish have figured out how to share, researchers report March 30 in the Proceedings of the Royal Society B. Anemones in these waters are often home to multiple species of clownfish that live together peacefully.
From 2005 to 2014, Emma Camp of the University of Technology Sydney and colleagues gathered data on clownfish and their anemone homes from 20 locations that hosted more than one clownfish species. In 981 underwater survey transects, they encountered 1,508 clownfish, 377 of which lived in groups of two or more fish species sharing a single anemone.
Most of those cohabiting clownfish could be found in the waters of the Coral Triangle, the team found, with the highest levels of species cohabitation occurring off Hoga Island in Indonesia. There, the researchers found 437 clownfish from six species living among 114 anemones of five species. Every anemone was occupied by clownfish, and half had two species of the fish.
In general, “when the number of clownfish species exceeded the number of host anemone species, cohabitation was almost always documented,” the researchers write.
The multiple-species groups divvied up space in an anemone much as a single-species group does, with subordinate fish sticking to the periphery. That way, those subordinate fish can avoid fights — and the risk of getting kicked off the anemone or even dying. “Living on the periphery of an anemone, despite the higher risk of predation, is a better option than having no host anemone,” the team writes.
These multi-species groups might even benefit both clownfish species, since the fish wouldn’t have to compete as much over mates and, if the species had different diets, perhaps less over food as well.
This isn’t the first time that scientists have found cohabitation to be an effective strategy in an area of high biodiversity. This has also been demonstrated with scorpions in the Amazon. But it does show how important it is to conserve species in regions such as this, the researchers say — because losing one species can easily wipe out several more.
Everyone ages. Growing old is a fundamental feature of human existence.
Though we might not always be aware of aging, it looms in all of our futures. As Science News editor in chief Eva Emerson writes, “Aging happens to each of us, everywhere, all the time. It is so ever-present and slow that we tend to take little notice of it. Until we do.”
But our scientific understanding of aging pales in comparison to its significance in our lives. New studies reveal exciting prospects for slowing the effects of aging, yet its causes and far-reaching effects remain enigmatic. Scientists are still divided on some fundamentals, which makes aging research a rich source of questions: How does aging change the brain? How did different life histories evolve? How old is the oldest blue whale? This special report addresses those questions and more.
Standoffish electrons typically keep one another at arm’s length, repelling their neighbors. But surprisingly, under certain circumstances, this repulsion can cause pairs of electrons to soften their stance toward one another and attract instead, new research shows. The effect may be the key to someday producing a new type of high-temperature superconductor, scientists report in the July 21 Nature.
Though the effect was first predicted over 50 years ago, previous attempts to coerce electrons to behave in this chummy way have failed. Like charges repel, so negatively charged electrons ordinarily rebuff one another. But now researchers have validated the counterintuitive idea that an attraction between electrons can emerge. “Somehow, you have [this] magic that out of all this repulsion you can create attraction,” says study coauthor Shahal Ilani, a physicist at the Weizmann Institute of Science in Rehovot, Israel. Ilani and colleagues produced the effect in a bare-bones system of electrons in carbon nanotubes. Operating at temperatures just above absolute zero, the system is made up of two perpendicular carbon nanotubes — hollow cylinders of carbon atoms — about 1 nanometer in diameter.
Two electrons sit at sites inside the first nanotube. Left to their own devices, those two electrons repel one another. A second nanotube, known as the “polarizer,” acts as the “glue” that allows the two electrons to attract. When the scientists brought the two nanotubes close together, says Ilani, “the electrons in the first nanotube changed their nature; they became attractive instead of repulsive.”
This flip is due to the nature of the polarizer. It contains one electron, which is located at one of two sites in the carbon nanotube — either between the first nanotube’s pair of electrons or farther away. The pair of electrons in the first nanotube repels the polarizer’s electron, kicking it from the near to the far site. And the electron’s absence leaves behind a positively charged vacancy, which attracts the pair of electrons toward it — and toward each other.

It’s a “tour de force,” says Takis Kontos, a physicist at the École Normale Supérieure in Paris, who wrote a commentary on the paper in the same issue of Nature. Although the system the scientists created is very simple, he says, “the whole experiment built around it is extremely complex.”

Electrons are known to attract in certain situations. In conventional superconductors, electrons pair up due to their interactions with ions in the material. This buddy system allows superconductors to conduct electricity without resistance. But such superconductors must be cooled to very low temperatures for this effect to occur.
But in 1964, physicist William Little of Stanford University theorized that electrons could likewise pair up through interactions with other electrons, rather than with ions. Such pairs should stay linked at higher temperatures. This realization sparked hopes that a material with these attracting electrons could be a room-temperature superconductor, which would open up a wealth of technological possibilities for efficiently transmitting and storing energy.
It’s yet to be seen whether the effect can produce a superconductor, and whether such a superconductor might work at higher temperatures — the new discovery shows only that the attraction can occur due to electrons’ repulsion. It’s “the first important step,” says Ilani. Now, scientists can start thinking of how to build “interesting new materials that are very different than what you can find in nature.”
As our first beach vacation with two little kids loomed, I had to do one of those chores that sounds easy but turns out to be anything but. I had to buy sunscreen. It seems like the task should take five minutes on Amazon. But as any parent with an internet connection knows, the choice is fraught.
You can pick from formulas that block rays with chemicals or minerals such as zinc oxide. SPFs can surpass 100. There are lotions and sprays, “organic” and “natural,” “sensitive” and “sport.” Some are marketed specifically for kids. Some are endorsed by various interest groups. And then there’s the cost. Highly rated sunscreens on Amazon vary in cost by 3,000 percent. Amid the chaos, I ended up picking an SPF 30 and being done with it.
With the beach vacation behind us, I’ve had the presence of mind to take a clear-eyed look at the sunscreen boondoggle. It seems that my trouble was in some ways a mess of my own making. I was distracted by zippy marketing words that obscured the core attributes of a good sunscreen. It turns out that the task can be pretty simple, if you keep a few key things in mind.
Look for SPF of 30 or higher. The sun protection factor is a measurement that tells you how long the sunscreen will protect your skin from sunburn-causing ultraviolet-B rays, as compared to no sunscreen at all. Sunscreens with an SPF of 30 will protect you from UV-B rays for 30 times longer than normal. Say you’d normally burn after 20 minutes in the sun without any sunscreen. After you correctly apply a sunscreen with SPF 30, you’d be able to go 10 hours. Sunscreens with an SPF of 30 will block 97 percent of UV-B rays. There may be diminishing returns as the SPF number goes up, though. There’s little evidence that SPFs over 50 offer increasing protection.
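The arithmetic behind those numbers is easy to check. Here’s a minimal sketch in Python, assuming ideal, full-coverage application, which real-world slathering rarely achieves:

```python
# Back-of-the-envelope SPF arithmetic, assuming ideal, even application;
# real-world protection is usually lower.

def protected_minutes(spf: float, unprotected_burn_minutes: float) -> float:
    """Time to sunburn with sunscreen = SPF x time to sunburn without it."""
    return spf * unprotected_burn_minutes

def fraction_uvb_blocked(spf: float) -> float:
    """An SPF-N sunscreen lets through roughly 1/N of sunburn-causing UV-B."""
    return 1 - 1 / spf

print(protected_minutes(30, 20) / 60)            # 10.0 hours, as in the example
print(round(fraction_uvb_blocked(30) * 100, 1))  # 96.7 -- the "97 percent" figure
print(round(fraction_uvb_blocked(50) * 100, 1))  # 98.0 -- diminishing returns past 50
```

The jump from SPF 30 to SPF 50 buys barely a percentage point of extra UV-B blocking, which is why the higher numbers offer so little added protection.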
Make sure the sunscreen says “broad spectrum” on the label. That means the sunscreen will thwart both UV-A and UV-B rays. UV-A rays are thought to be responsible for deep skin damage, while UV-Bs are the ones that cause sunburns. Both flavors of the sun’s rays can increase the risk of skin cancer.
Choose one that’s water resistant. That doesn’t mean that water won’t wash it off. No sunscreen is completely waterproof, which is why sunscreens are no longer allowed to make that claim.
Don’t necessarily trust the ratings. Of the sunscreens in the top 1 percent on Amazon, 40 percent failed to meet the criteria set out by the American Academy of Dermatology — SPF of 30 or higher, broad spectrum and water resistant — a recent JAMA Dermatology paper found.
Use it. As study coauthor Steve Xu of Northwestern University says, “Picking the right product is only the first step. Using it correctly is just as important.” Put sunscreen on your kids before they go outside, so the sunscreen can soak in. Make sure to cover all exposed skin, including the tops of ears and toes. And reapply every two hours, or immediately after your kids get out of the water.
Those are the main points. Of course, you could choose to wade through a lot more weeds in the decision. Sunscreens based on physical blockers, such as zinc oxide or titanium dioxide, reflect the sun’s rays. Chemical blockers such as oxybenzone absorb the rays and get rid of the extra energy in harmless ways. Mineral-based sunscreens may be less likely to irritate babies’ and children’s skin than chemical blockers, for instance.
The jury is still out on spray sunscreen, which aerosolizes the particles and promises fewer child chase-downs. The FDA has called for more data to evaluate those. And then there’s the question of sunscreen for babies younger than six months. Most sources say that when possible, opt for shade and hats instead of sunscreen for the littlest babies.
So as you prepare for your time in the sun, stick to a few basic facts to help you choose a sunscreen, and apply the time you save toward forcing cute sunhats on your squirmy kids.
Figuring out the nuts and bolts of the cell’s recycling machinery has earned the 2016 Nobel Prize in physiology or medicine. Cell biologist Yoshinori Ohsumi of the Tokyo Institute of Technology has received the prize for his work on autophagy, a method for breaking down and recycling large pieces of cellular junk, such as clusters of damaged proteins or worn-out organelles.
Keeping this recycling machinery in good working condition is crucial for cells’ health (SN: 3/26/11, p. 18). Not enough recycling can cause cellular trash to build up and lead to neurological diseases such as Alzheimer’s and Parkinson’s. Too much recycling, on the other hand, has been linked to cancer. “It’s so exciting that Ohsumi has received the Nobel Prize, which he no question deserved,” says biologist Jennifer Lippincott-Schwartz of Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Va. “He set the framework for an entire new field in cell biology.”
Ohsumi’s discoveries helped reveal the mechanism and significance of a fundamental physiological process, biologist Maria Masucci of the Karolinska Institute in Sweden said in a news briefing October 3. “There is growing hope that this knowledge will lead to the development of new strategies for the treatment of many human diseases.”
Scientists got their first glimpse of autophagy in the 1960s, not long after the discovery of the lysosome, a pouch within cells that acts as a garbage disposal, grinding fats and proteins and sugars into their basic building blocks. (That discovery won Belgian scientist Christian de Duve a share of the Nobel Prize in 1974.) Researchers had observed lysosomes stuffed with big chunks of cellular material — like the bulk waste of the cellular world — as well as another, mysterious pouch that carried the waste to the lysosome.
Somehow, the cell had devised a way to consume large parts of itself. De Duve dubbed the process autophagy, from the Greek words for “self” and “to eat.” But over the next 30 years, little more became known about the process. “The machinery was unknown, and how the system was working was unknown, and whether or not it was involved in disease was also unknown,” said physiologist Juleen Zierath, also of the Karolinska Institute, in an interview after the prize’s announcement.
That all changed in the 1990s when Ohsumi decided to study autophagy in baker’s yeast, a single-celled microbe known for making bread rise. The process was tricky to catch in action, partly because it happened so fast. So Ohsumi bred special strains of yeast that couldn’t break down proteins in their cellular garbage disposals (called vacuoles in yeast).
“He reasoned that if he could stop the degradation process, he could see an accumulation of the autophagy machinery in these cells,” Zierath said.
And that’s just what Ohsumi saw. When he starved the yeast cells, the “self-eating” machinery kicked into gear (presumably to scrounge up food for the cells). But because the garbage disposals were defective, the machinery piled up in the vacuoles, which swelled like balloons stuffed with sand. Ohsumi could see the bulging, packed bags clearly under a light microscope. He published the work in a 1992 paper in the Journal of Cell Biology.

Finding the autophagy machinery let Ohsumi study it in detail. A year later, he discovered as many as 15 genes needed for the machinery to work. In the following years, Ohsumi and other scientists examined the proteins encoded by these genes and began to figure out how the components of the “bulk waste” bag, or autophagosome, came together, and then fused with the lysosome.
The work revealed something new about the cell’s garbage centers, Zierath said. “Before Ohsumi came on the scene, people understood that the waste dump was in the cell,” she said. “But what he showed was that it wasn’t a waste dump. It was a recycling plant.”
Later, Ohsumi and his colleagues studied autophagy in mammalian cells and realized that the process played a key maintenance role in all kinds of cells, breaking down materials for reuse. Ohsumi “found a pathway that has its counterparts in all cells that have a nucleus,” says 2013 Nobel laureate Randy Schekman, a cell biologist at the University of California, Berkeley. “Virtually every corner of the cell is touched by the autophagic process.”
Since Ohsumi’s discoveries, research on autophagy has exploded, says Lippincott-Schwartz. “It’s an amazing system that every year becomes more and more fascinating.”
Ohsumi, 71, remains an active researcher today. He received the call from the Nobel committee at his lab in Japan. The prize includes an award of 8 million Swedish kronor (equivalent to about $934,000). About his work, he said: “It was lucky. Yeast was a very good system, and autophagy was a very good topic.”
Still, he added in an interview with a Nobel representative, “we have so many questions. Even now we have more questions than when I started.”
SAN DIEGO — A small number of people maintain razor-sharp memories into their 90s, despite having brains chock-full of the plaques and tangles linked to Alzheimer’s disease. Researchers suspect that these people’s brains are somehow impervious to the usual devastation thought to be caused by those plaques and tangles.
Researchers studied the brains of people 90 years old or older who had excellent memories, performing as well as people in their 50s and 60s on some tests. Postmortem brain tissue from eight such people revealed a range of Alzheimer’s features. Two participants had remarkably clean brains with few signs of amyloid-beta plaques and tangles of tau protein. Four participants had middling levels. Surprisingly, the other two samples were packed with plaques and tangles, enough to qualify those people for an Alzheimer’s diagnosis based on their brains. “These people, for all practical purposes, should be demented,” study coauthor Changiz Geula of Northwestern University’s medical school said November 15 in a news briefing at the annual meeting of the Society for Neuroscience.
Further tests revealed that even in the midst of these Alzheimer’s hallmarks, nerve cells survived in people with strong memories. Those people had more healthy-looking nerve cells than people with dementia and similar plaque and tangle levels. The researchers don’t know how these mentally sharp people avoid the ravages thought to be caused by plaques and tangles. “What’s surprising is this segment of people does exist,” Geula says. “We have to find out why.”
When Christian Agrillo runs number-related experiments in his lab, he wishes his undergraduate subjects good luck. For certain tests, that’s about all he says. Giving instructions to the people would be unfair to the fish.
Agrillo, of the University of Padua in Italy, is finishing up several years of pitting humans against fish in trials of their abilities to compare quantities. He can’t, of course, tell his angelfish or his guppies to choose, say, the larger array of dots. So in recent tests he made the bemused students use trial and error too. “At the end, they start laughing when they find they are compared with fish,” he says. Yet the fish versus humans face-offs are eye-opening comparisons in his search for the deep evolutionary basis of what has blossomed into human mathematics. If it turns out that fish and people share some idiosyncrasies of their number sense (like spidey sense, except focused on quantities rather than danger), those elements might in theory date from a common ancestor more than 400 million years old. Comparisons of animals’ mental powers are “the paleontology of cognition,” Agrillo says.
No one seriously argues that animals other than people have some kind of symbolic numeral system, but nonhuman animals — a lot of them — can manage almost-math without numbers.
“There’s been an explosion of studies,” Agrillo says. Reports of a quantity-related ability come from chickens, horses, dogs, honeybees, spiders, salamanders, guppies, chimps, macaques, bears, lions, carrion crows and many more. And nonverbal number sensing, studies now suggest, allows much fancier operations than just pointing to the computer screen that shows more dots.
News stories on this diversity often nod to the idea that such a broad sweep of numberlike savvy across the animal tree of life could mean that animals all inherited rudiments of quantification smarts from a shared ancestor. Some scientists think that idea is too simple. Instead of inheriting the same mental machinery, animals could have just happened upon similar solutions when confronting the same challenge. (Birds and bats both fly, but their wings arose independently.)
Chasing down those deep origins means figuring out how animals, including humans too young or too rushed to count, manage quantitative feats without counting. It’s not easy. Putting together what should be a rich and remarkable story of the evolution of nonverbal number sense is just beginning.

Who’s (sort of) counting?

Symbolic numbers do marvels for humankind, but for millions of years, other animals without full powers to count have managed life-and-death decisions about magnitude (which fruit pile to grab, which fish school to join, whether there are so many wolves that it’s time to run).

Counting dog treats

For a sense of the issues, consider the old and the new in dog science. Familiar as dogs are, they’re still mostly wet-nosed conundrums when it comes to their number sense.
When food is at stake, dogs can tell more from less, according to a string of laboratory studies over more than a decade. And dogs may be able to spot cheating when people count out treats. Dog owners may not be amazed at such food smarts, but the interesting question is whether dogs solve the problem by paying attention to the actual number of goodies they see, or some other qualities.
An experiment in England in 2002, for instance, let 11 pet dogs settle down in front of a barrier that researchers then moved so the dogs could get a peek at a row of bowls. One bowl held a Pedigree Chum Trek treat. The barrier went up again, and researchers lowered a second treat into a bowl behind the screen, or sometimes just pretended to. When the barrier dropped again, the dogs overall stared a bit longer if only one treat was visible than if 1 + 1 had indeed equaled 2. Five of the dogs, in an extra test, also stared longer on average after a researcher covertly sneaked an extra treat into a bowl and then lowered the barrier on the unexpected 1 + 1 = 3.
Dogs could in theory recognize funny business by paying attention to the number of treats — or the treats’ “numerosity,” as researchers often call a quantity recognized nonverbally. But, depending on the design of a test, dogs might also get the right answers by judging the total surface area of treats instead of their numerosity. A multitude of other clues — density of objects in a cluster, a cluster’s total perimeter or darkness and so on — would also work. Researchers lump those giveaways under the term “continuous” qualities, because they change in a smooth continuum of increments instead of in the discrete 1, 2, 3.
The continuous qualities present a real staring-at-the-ceiling, heavy-sigh challenge for anyone inventing a numerosity test. By definition, nonverbal tests don’t use symbols such as numbers, so an experimenter has to show something, and those somethings inevitably have qualities that intensify or dwindle as the numerosity does. To at least see whether dogs evaluate total area to choose more food, Krista Macpherson of the University of Western Ontario in Canada devised a task for her rough collie Sedona. The dog had already served as an experimental subject in Macpherson’s earlier test of whether real dogs would try to seek help for their owners in danger, as TV’s trusty Lassie did. Sedona hadn’t tried to seek help for Macpherson (no dog in the test aided its owner), but she had proved amenable to doing lab work, especially for bits of hot dog or cheese.
Sedona was put to work to select whichever of two magnet boards had a greater number of geometric shapes fastened to it. Macpherson varied the dimensions of black triangles, squares and rectangles so that their total surface area wasn’t a reliable clue to the right answer.
The idea came from an experiment involving monkeys that reacted to a computer touch screen. But “I’m all cardboard and tape,” Macpherson says. Sedona was perfectly happy to look at two magnet boards fastened to cardboard boxes on the ground and then indicate her choice by knocking over a box.
Sedona in the end triumphed at picking the box with more geometric thingies regardless of area, though the project took considerable effort from both woman and beast. The dog worked through more than 700 trials, starting as simply as 0 versus 1 and eventually scoring better than chance scrutinizing bigger magnitudes, such as 6 versus 9, Macpherson and William A. Roberts reported in Learning and Motivation in 2013. (Eight versus nine finally stumped the collie, but more on patterns in accuracy later.) In a 2016 paper in Behavioural Processes, another lab hailed the Sedona research as the “only evidence of dogs’ ability to use numerical information.”
More is better

Dogs might have number sense, but when, or how much, they use it is another matter, notes Clive Wynne of Arizona State University in Tempe, a coauthor of that 2016 paper. To see what dogs do in more natural situations, he and Maria Elena Miletto Petrazzini of the University of Padua designed a test offering pets at a doggie daycare a choice of two plates of cut-up treat strips. A mix of breeds considered such options as a few big treat strips versus a smaller total amount of treats cut up into numerous small pieces. The dogs, without Sedona’s arduous training, went for the greater total amount of food, regardless of the number of pieces. Of course they did; it’s food — more is better. Without controls, food tests may not be measuring numerosity at all.
It’s not just edibility that affects whether an animal pays attention to numerosity. Experience with similarity or differences in objects can matter. Rosa Rugani, also at Padua, has pioneered studying number sense in recently hatched chicks, which can learn experimental procedures fast if she gets them motivated. “One of the more fascinating challenges of my job is to come up with ‘games’ the chicks like to play,” she says.

Newly hatched chicks can develop a strong social attachment to objects, as if little plastic balls or ragged crosses of colored bars were pals to huddle near in a flock. Taking advantage of this tendency, Rugani let day-old chicks imprint on either two or three objects. Then she watched them choose between two little flocks of novel pals to toddle over to. If the potential buddy-objects in a flock looked identical to each other, the chicks in the test typically just moved near the larger cluster or largest object. But if the buddies in each group had individual quirks, mixing colors, shapes and sizes, the chicks paid attention to numerosity. Those imprinted on three pals were a bit more likely to join the group of three different kinds of pals; those imprinted on pairs more often went with the twos.
Some animals can deal with what people would call numerical order, or ordinality. Rats have learned to choose a particular tunnel entrance, such as the fourth or 10th from the end, even when researchers fiddled with distances between entrances. Five-day-old chicks rewarded for pecking at an item in a sequence, the fourth hole or the third jar, still showed a preference for position when researchers lengthened the distances between options or even moved the whole array.
Rhesus monkeys react if researchers violate rules of addition and subtraction, as dogs seemed to do in the Chums experiment. Chicks can track additions and subtractions too, well enough to pick the card hiding the bigger result. The chicks can also go one better. Rugani and colleagues have shown that chicks have some sense of ratios, for example choosing between mixes of red and green dots to match a ratio they learned from such mixes as 18 greens mingling with 9 reds.
A sense of numerosity itself, regardless of volume or surface area, may not be limited to fancy vertebrate brains. One recently published test takes advantage of overkill among golden orb-web spiders (Nephila clavipes). When they have a crazy run of luck catching insects faster than they can eat them, the spiders wrap each catch in silk and fasten it with a single strand to dangle from the center of the web. Turning this hoarding tendency into a test, Rafael Rodríguez of the University of Wisconsin–Milwaukee tossed bits of mealworms of different sizes into the web as spiders created a dangling treasure trove. Then, after shooing the spider off the web, he snipped the strands and watched how long the spiders searched for their stolen meals. Losing a greater volume of food inspired more strumming of the web and searching about. But losing four items instead of just one or two increased the search time even more, Rodríguez and his colleagues reported in 2015 in Animal Cognition. It’s not just volume of food in a hoard, they argue. Numerosity has its own effects.
At a glance

Nonhuman animals don’t have human language for counting, so researchers studying behavior talk about an “approximate number system” that allows for good-enough estimates of quantities with no real counting. One of the features of this still mysterious system is its declining accuracy in comparing bigger numbers that are very close together, the trend that made Sedona the collie’s struggles as noteworthy as her successes.
As the ratio of the two quantities Sedona had to compare drew closer to 1, she was more prone to make mistakes. Her scores worsened as the ratio rose from 0.11 (comparing 1 to 9) to 0.2 (1 to 5) and beyond. She never conquered the fiendish 8 versus 9. That same trend, described by what’s called Weber’s law, shows up in humans’ nonverbal approximate number system as well as in those of other animals. When Agrillo tested guppies against humans, both fell behind in accuracy for such difficult comparisons as 6 versus 8. But for small quantities, both fish and people performed well, he and colleagues reported in 2012. People and fish could tell 3 dots from 4 about as reliably as 1 dot from 4. Researchers have long recognized this instant human ease of dealing with very small quantities, calling it subitizing: suddenly just seeing that there are three dots or ducks or daffodils without having to count them. Agrillo suspects the underlying mechanism will prove different from the approximate number system, though he describes this as a minority view.
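To make the ratio arithmetic concrete, here is a small illustrative snippet using the comparisons from Sedona’s trials; it computes only the ratios, and implies no particular accuracy model:

```python
# Weber's law in miniature: telling two quantities apart gets harder as the
# ratio of the smaller to the larger approaches 1. Pairs are from Sedona's
# trials, described above.

pairs = [(1, 9), (1, 5), (6, 9), (8, 9)]

for small, large in pairs:
    print(f"{small} vs {large}: ratio = {small / large:.2f}")

# 1 vs 9: ratio = 0.11   (easy)
# 1 vs 5: ratio = 0.20
# 6 vs 9: ratio = 0.67   (harder)
# 8 vs 9: ratio = 0.89   (the comparison that stumped the collie)
```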
The similarity between guppies and people in subitizing skill doesn’t prove it’s a shared inheritance from that ancient common ancestor several hundred million years ago, Agrillo says. Yet the similarity does raise the possibility. Struggling to separate some pure response to numerosity from all the confounding surface areas and other continuous qualities may not even be the most important question, says Lisa Cantrell, now at the University of California, Davis. Human babies, as an example of noncounting animals, might start figuring out the world by relying on these other confounders and grow into their numerical abilities, she and Linda Smith of Indiana University, Bloomington, suggested in 2013. The hypothesized approximate number system might be part of some more general way of perceiving the world, which can draw on multiple clues to get a clearer sense of quantity. Cantrell and Smith called their version of the idea the “signal clarity hypothesis.”
Into their heads

Studying behavior alone isn’t enough to trace the inheritance of any part of number savvy, says Andreas Nieder of the University of Tübingen in Germany. “At the behavioral level, it may look as if number estimation follows the same laws, but the underlying neural code could actually look quite different.”
He’s not going as far afield as fish yet, but Nieder and colleagues have looked at how monkey and bird brains handle quantity. The researchers described neurons (nerve cells) in the brains of carrion crows (Corvus corone corone) that function much like those in rhesus macaques.
Research in monkeys over the last 15 years has identified what Nieder calls “number neurons.” They could have multiple functions, but each responds to a specific number of whatevers, be it six crows or six crowbars. Some number neurons respond to sight, some to sound, and amazingly, some to either. The neurons could be responding to increasing total surface area or density or darkness. But researchers have varied one aspect at a time, and used multiple imaging and pharmacological techniques, to argue that as far as strenuous efforts can tell, these neurons detect the actual numerosity.
Individual neurons in parts of a monkey brain have their own preferred number and respond most strongly to it and less so to neighboring numbers. The neurons for three get less excited for two and four, while others light up at four. In 2015, Nieder and colleagues started untangling how monkey neurons handle zero, suggesting the beginnings of an ability to treat “nothing there” as an abstract numerosity of zero.
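A common way to model such cells is a bell-shaped tuning curve centered on each neuron’s preferred number; that Gaussian shape is a modeling convention, not a result reported here. A minimal sketch:

```python
# A bell-shaped "number neuron" tuning curve: the cell fires most for its
# preferred numerosity and progressively less for neighbors. The Gaussian
# form is a common modeling convention, not a claim from this research.

from math import exp

def response(numerosity: int, preferred: int, width: float = 1.0) -> float:
    """Relative firing rate of a neuron tuned to `preferred`."""
    return exp(-((numerosity - preferred) ** 2) / (2 * width ** 2))

neuron_for_three = [round(response(n, preferred=3), 2) for n in range(1, 7)]
print(neuron_for_three)   # [0.14, 0.61, 1.0, 0.61, 0.14, 0.01]
# Peaks at 3 and tapers for 2 and 4, the pattern described above.
```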
These neurons lie in notable places: the six-layered neocortex of the parietal and frontal lobes of the brain. That’s territory that primates boast about, a feature of mammalian brain structure credited with allowing human mental capacities to reach such heights. Nonmammalian vertebrates, including birds, don’t have a multilayered neocortex. Yet Nieder and colleagues have, for the first time, detected individual neurons in the bird brain that fire in response to numerosities much as primate number neurons do.
[Figure: Neurons for numbers — recordings from four nerve cells in monkeys suggest each cell responds most to a particular number of dots and the same number of musical tones.]

The bird versions of number neurons lie in a relatively newfangled area of the avian brain called the nidopallium caudolaterale, or NCL. It didn’t exist as such, nor did the primate’s precious neocortex, in the reptile-ish ancestors that mammals and birds last shared some 300 million years ago. Both the bird NCL and the primate number neuron zones arose from the same tissue, the pallium. In mammals, that ancient pallium morphed into layers of neocortex tissue; in birds, the transformation went a different way.
For the number sense tingling through specialized neurons in birds and primates alike, similarity does not strictly mean shared inheritance, Nieder wrote in the June Nature Reviews Neuroscience. The systems of number neurons probably specialized independently.
Finding some brain structures to compare across deep time is a promising step in fathoming the evolution of animal number sense, but it’s just a beginning. There are many questions about how the neurons work, not to mention what’s going on in all those other brains that contemplate quantity. For now, looking across the tree of life at the crazy abundance of number smarts, which may or may not be related but are certainly numerous, the clearest thing to say may be just: Wow.
Scientists investigating what keeps lungs from overinflating can quit holding their breath.
Experiments in mice have identified a protein that senses when the lungs are full of air. This protein helps regulate breathing in adult mice and gets breathing going in newborn mice, researchers report online December 21 in Nature.
If the protein plays a similar role in people — and a few studies suggest that it does — exploring its activity could help explain disorders such as sleep apnea or chronic obstructive pulmonary disease. “These are extremely well done, very elegant studies,” says neonatologist Shabih Hasan of the University of Calgary in Canada, a specialist in breathing disorders in newborns. Researchers knew that feedback between the lungs and brain maintains normal breathing. But “this research gives us an understanding at the cellular level,” says Hasan. “It’s a major advance.”
Called Piezo2, the protein forms channels in the membranes of nerve cells in the lungs. When the lungs stretch, the Piezo2 channels detect the distortion caused by the mechanical force of breathing and spring open, triggering the nerves to send a signal. Led by neuroscientist Ardem Patapoutian, researchers discovered that the channels send signals along three different pathways.

Mice bred to lack Piezo2 in a cluster of nerve cells that send messages to the spinal cord had trouble breathing and died within 24 hours. Similarly, newborn mice missing Piezo2 channels in nerves that communicate with the brain stem via a structure called the jugular ganglion also died. Mice lacking Piezo2 in the nodose ganglion, a structure that also links to the brain stem, lived to adulthood. But their breathing was abnormal, and an important safety mechanism in the lungs of these mice didn’t work.

Called the Hering-Breuer reflex, it kicks in when the lungs are in danger of overinflating. When functioning properly, Piezo2’s signal prevents potentially harmful overinflation by temporarily halting breathing. Known as apnea, this cessation of breathing can be dangerous in other instances but prevents damage in this case.
“Breathing is a mechanical process,” says Patapoutian, a Howard Hughes Medical Institute investigator at the Scripps Research Institute in La Jolla, Calif. “Intuitively, you could imagine that lung stretch sensors could play an important role in regulating breathing pattern. Amazingly, however, no definitive proof for such a feedback mechanism existed.”
Previous work in mice by Patapoutian and colleagues found that Piezo2 channels play a major role in sensing touch. The channels also function in proprioception, the sense of where body parts are in relation to each other, Patapoutian and colleagues reported last year.
Two recent studies by different research teams have found that people with mutations in a Piezo2 gene have problems with touch, proprioception and, in one study, breathing. Although small, the studies suggested that investigating Piezo2 in people could shed light on breathing disorders or other problems. The protein channels might play a role in sensing the “fullness” of the stomach and bladder and perhaps other mechanical processes such as heart rate control, Patapoutian says.
Investigating Piezo2 could also help explain how newborn lungs transition from being fluid-filled to breathing air, says neuroscientist Christo Goridis of the École des Neurosciences Paris.
Everybody wants more juice from their batteries. Smartphones and laptops always need recharging. Electric car drivers must carefully plan their routes to avoid being stranded far from a charging station. Anyone who struggles with a tangle of chargers every night would prefer a battery that can last for weeks or months.
For researchers who specialize in batteries, though, the drive for a better battery is less about the luxury of an always-charged iPad (though that would be nice) and more about kicking our fossil fuel habit. Given the right battery, smog-belching cars and trucks could be replaced with vehicles that run whisper-quiet on electricity alone. No gasoline engine, no emissions. Even airplanes could go electric. And the power grid could be modernized to use cheaper, greener fuels such as sunlight or wind even on days when the sun doesn’t shine bright enough or the wind doesn’t blow hard enough to meet electricity demand.
A better battery has the potential to jolt people into the future, just like the lithium-ion battery did. When they became popular in the early 1990s, lithium-ion batteries offered twice as much energy as the next best alternative. They changed the way people communicate.
“What the lithium-ion battery did to personal electronics was transformational,” says materials scientist George Crabtree, director of the Joint Center for Energy Storage Research at Argonne National Laboratory in Illinois. “The cell phone not only made landlines obsolete for many, but [the lithium-ion battery] put cameras and the internet into the hands of millions.” That huge leap didn’t happen overnight. “It was the sum of many incremental steps forward, and decades of work,” says Crabtree, who coordinates battery research in dozens of U.S. labs.

Lithium-ion batteries have their limits, however, especially for use in the power grid and in electric vehicles. Fortunately, like their Energizer mascot, battery researchers never rest. Over the last 10 years, universities, tech companies and car manufacturers have explored hundreds of new battery technologies, reaching for an elusive and technically difficult goal: next-generation batteries that hold more energy, last longer and are cheaper, safer and easier to recharge.
A decade of incremental steps is beginning to pay off. In late 2017, scientists will introduce a handful of prototype batteries for manufacturers to develop toward potential commercialization. Some contain new ingredients — sulfur and magnesium — that help store energy more efficiently, delivering power for longer periods. Others employ new designs. “These prototypes are proof-of-principle batteries, miniature working versions,” Crabtree says. Getting the batteries into consumers’ hands will take five to 10 years. Making leaps in battery technology, he says, is surprisingly hard to do.
Power struggle

Batteries operate like small chemical plants. Technically, a battery is a combination of two or more “electrochemical cells” in which energy released by chemical reactions produces a flow of electrons. The greater the energy produced by the chemical reactions, the greater the electron flow. Those electrons provide a current to whatever the battery is powering — kitchen clock, smoke alarm, car engine.
To power any such device, the electrons must flow through a circuit connecting two electrodes, known as an anode and a cathode, separated by a substance called an electrolyte. At the anode, chemical oxidation reactions release electrons. At the cathode, electrons are taken up in reduction reactions. The electrolyte enables ions created by the oxidation and reduction reactions to pass back and forth between the two electrodes, completing the circuit.
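For readers who want the bookkeeping spelled out, here is a minimal sketch built on textbook relations (a cell’s voltage is the gap between its electrodes’ half-reaction potentials; delivered energy is the charge moved times that voltage). The numbers are illustrative placeholders, not specs from any battery in this story:

```python
# Minimal electrochemical bookkeeping, using textbook relations only.

def cell_voltage(cathode_potential_v: float, anode_potential_v: float) -> float:
    """Cell voltage = cathode half-reaction potential minus anode's."""
    return cathode_potential_v - anode_potential_v

def energy_watt_hours(capacity_amp_hours: float, voltage_v: float) -> float:
    """Energy delivered = charge moved (amp-hours) x cell voltage (volts)."""
    return capacity_amp_hours * voltage_v

# Illustrative placeholder numbers:
v = cell_voltage(0.8, -3.0)                  # 3.8 V, in the lithium-ion ballpark
print(round(energy_watt_hours(3.0, v), 1))   # 11.4 Wh for a 3 amp-hour cell
```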
Depending on the materials used for the electrodes and the electrolyte, a battery may be recharged by supplying current that drives the chemical reactions in reverse. In creating new recipes for a rechargeable electrochemical soup, though, battery researchers must beware of side reactions that can spoil everything. “There’s the chemical reaction you want — the one that stores energy and releases it,” Crabtree says. “But there are dozens of other … reactions that also take place.” Those side reactions can disable a battery, or worse, lead to a risk of catastrophic discharge. (Consider the recent fires in Samsung’s Galaxy Note 7 smartphones.)
Early versions of the lithium-ion battery from the 1970s carried an anode made of pure lithium metal. Through repeated use, lithium ions were stripped off and replated onto the anode, creating fingerlike extensions that reached across to the cathode, shorting out the battery. Today’s lithium-ion batteries have an anode made of graphite (a form of carbon) so that loose lithium ions can snuggle in between sheets of carbon atoms.
Lithium-ion batteries were originally developed with small electronics in mind; they weren’t designed for storing electricity on the grid or powering electric vehicles. Electric cars need lots of power, a quick burst of energy to move from stop to start. Electric car manufacturers now bundle thousands of such batteries together to provide power for up to 200 miles before recharging, but that range still falls far short of what a tank of gas can offer. And lithium-ion batteries drain too quickly to feed long hours of demand on the grid.
Simply popping more batteries into a car or the grid isn’t the answer, Crabtree says. Stockpiling doesn’t improve the charging time or the lifetime of the battery. It’s also bulky. Carmakers have to leave drivers room for their passengers plus some trunk space. To make electric vehicles competitive with, or better than, vehicles run by internal-combustion engines, manufacturers will need low-cost, high-energy batteries that last up to 15 years. Likewise, grid batteries need to store energy for later use at low cost, and stand up to decades of use.
“There’s no one battery that’s going to meet all our needs,” says MIT materials scientist Yet-Ming Chiang. Batteries needed for portable devices are very different from those needed for transportation or grid-scale storage. Expect to see a variety of new battery types, each designed for a specific application.

Switching to sulfur

For electric vehicles, lithium-sulfur batteries are the next great hope. The cathode is made mainly of sulfur, an industrial waste product that is cheap, abundant and environmentally friendly. The anode is made of lithium metal.
During discharge, lithium ions break away from the anode and swim through the liquid electrolyte to reach the sulfur cathode. There, the ions form a covalent bond with the sulfur atoms. Each sulfur atom bonds to two lithium ions, rather than just one, doubling the number of bonds in the cathode of the battery. More chemical bonds means more stored energy, so a lithium-sulfur battery creates more juice than a lithium-ion one. That, combined with sulfur’s light weight, means that, in principle, manufacturers can pack more punch for a given amount of weight, storing four or five times as much energy per gram.
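A back-of-the-envelope calculation, using standard physical constants, shows where that claim comes from; the comparison figure for conventional lithium-ion cathodes is a rough, widely quoted ballpark, not a number from this research:

```python
# Why sulfur stores so much charge per gram: each sulfur atom accepts two
# electrons (two Li+ ions, forming Li2S) on discharge.

FARADAY = 96485            # coulombs per mole of electrons
S_MOLAR_MASS = 32.06       # grams per mole of sulfur
ELECTRONS_PER_SULFUR = 2

coulombs_per_gram = ELECTRONS_PER_SULFUR * FARADAY / S_MOLAR_MASS
mah_per_gram = coulombs_per_gram / 3.6    # 1 mAh = 3.6 coulombs

print(round(mah_per_gram))  # ~1672 mAh/g theoretical capacity for sulfur,
                            # versus roughly 150-200 mAh/g for typical
                            # lithium-ion cathode materials
```

That order-of-magnitude gap in theoretical capacity, tempered by sulfur’s lower voltage and practical losses, is what lies behind the “four or five times as much energy per gram” figure.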
Ultimately, that upgrade could boost an electric vehicle’s range up to as much as 500 miles on a single charge. But first, researchers have to get past the short lifetime of existing lithium-sulfur batteries, which, Crabtree says, is due to a loss of lithium and sulfur in each charge-discharge cycle.
When lithium combines with sulfur, it also forms compounds called polysulfides, which quickly gum up the battery’s insides. Polysulfides form within the cathode during battery discharge, when stored energy is released. Once they dissolve in the battery’s liquid electrolyte, the polysulfides shuttle to the anode and react with it, forming a film that renders the battery useless within a few dozen cycles — or one to two months of use.
At Sandia National Laboratories in Albuquerque, N.M., a team led by Kevin Zavadil is trying to block the formation of polysulfides in the electrolyte. The electrolyte consists of salt and a solvent, and current lithium-sulfur batteries require a large amount of electrolyte to achieve a moderate life span. Zavadil and his team are developing “lean” electrolyte mixtures less likely to dissolve the sulfur molecules that create polysulfides.
Described September 9 in ACS Energy Letters, the new electrolyte mix contains a higher-than-usual salt concentration and a “sparing” amount of solvent. The researchers also reduced the overall amount of electrolyte in the batteries. In test runs, the tweaks dropped the concentration of polysulfides by several orders of magnitude, Zavadil says. “We [also] have some ideas on how to use membranes to protect the lithium surface to prevent polysulfides from happening in the first place,” Zavadil says. The goal is to produce a working prototype of the new battery — one that can last through thousands of cycles — by the end of 2017.
At the University of Texas at Austin, materials engineer Guihua Yu, along with colleagues at Zhejiang University of Technology in Hangzhou, China, is investigating another work-around for this battery type: replacing the solid sulfur cathode with an intricate structure that encapsulates the sulfur in an array of nanotubes. Reported in the November issue of Nano Letters, the nanotubes that encase the sulfur are fashioned from manganese dioxide, a material that can attract and hold on to polysulfides. The nanotubes are coated with polypyrrole, a conductive polymer that helps boost the flow of electrons.
This approach reduces buildup and boosts overall conductivity and efficiency, Yu says. So far, with the group’s new architecture, the battery loses less than 0.07 percent of its capacity per charge and discharge cycle. After 500 cycles, the battery maintained about 65 percent of its original capacity, a great improvement over the short lifetime of current lithium-sulfur batteries. Still, for use in electric vehicles, scientists want batteries that can last through thousands of cycles, or 10 to 15 years.

Scientists at Argonne are constructing another battery type: one that replaces lithium ions at the anode with magnesium. This switch could instantly boost the electrical energy released for the same volume, says Argonne materials engineer Brian Ingram, noting that a magnesium ion has a charge of +2, double that of lithium’s +1. Magnesium’s ability to produce twice the electrical current of lithium ions could allow for smaller, more energy-dense batteries, Ingram says.
Magnesium comes with its own challenge, however. Whereas lithium ions zip through a battery’s electrolyte, magnesium ions slowly trudge. A team of researchers at Argonne, Northwestern University and Oak Ridge National Laboratory shot high-energy X-rays at magnesium in various batteries and learned that the drag is due to interactions with molecules that the magnesium attracts within the electrolyte. Ingram and his group are experimenting with new materials to find a molecular recipe that reduces such drag.
Ingram’s team is trying to nudge its “highly functioning, long-lasting” prototype to 3 volts by December. Today’s typical lithium-ion battery has 3.8 to 4 volts. At 3 volts, Ingram says, the magnesium battery would pack more power than a 4-volt lithium-ion battery and “create a tremendous amount of excitement within the field.”
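The per-ion arithmetic behind that comparison is simple; this sketch assumes only the voltages quoted above:

```python
# Energy moved per ion = (ion's charge) x (cell voltage).

ION_CHARGE = {"Li+": 1, "Mg2+": 2}   # in units of the elementary charge

def ev_per_ion(ion: str, cell_voltage_v: float) -> float:
    """Electron-volts of work done per ion shuttled across the cell."""
    return ION_CHARGE[ion] * cell_voltage_v

print(ev_per_ion("Li+", 4.0))    # 4.0 eV per lithium ion
print(ev_per_ion("Mg2+", 3.0))   # 6.0 eV per magnesium ion -- 50 percent more
```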
Going with the flow

Together, transportation and the electricity grid account for about two-thirds of U.S. energy use. But today, only 10 percent of the electricity on the grid comes from renewable sources, according to the U.S. Energy Information Administration. If wind and solar power are ever to wrest energy production away from fossil fuels, big changes must happen in energy storage. What is needed, Crabtree says, is a battery that can store energy, and lots of it, for later use. “Though the sun shines in the middle of the afternoon, peak demand comes at sunset when people go home, turn on lights and cook dinner,” he says.
To reliably supply electricity at night or on cloudy, windless days requires a different type of battery. By design, flow batteries fit the bill. Instead of having solid electrodes, flow batteries store energy in two separate tanks filled with chemicals — one positively charged, the other negatively charged. Pumps move the liquids from the tanks into a central chamber, or “stack,” where dissolved molecules in the liquids undergo chemical reactions that store and give up energy. A membrane located in the stack keeps the positive and negative ions separated. Flow batteries can store energy for a long time and provide power as needed. Because the energy-storing liquids are kept in external tanks, the batteries are unlikely to catch fire, and can be built large or small depending on need. To store more energy, use a larger tank.
So far, however, flow batteries are expensive to make and maintain, and have had limited use for providing backup power on the grid. Today’s flow batteries are packed with rare and toxic metal components, usually vanadium. With many moving parts — tanks, pumps, seals and sensors — breakdowns and leakage are common.
At MIT, Chiang and colleagues are developing flow batteries that can bypass those drawbacks. One, an hourglass flow battery, does away with the need for costly and troublesome pumps. The stack where the chemical reactions occur is in the constricted middle, with tanks at either end. Gravity allows the liquids to flow through the stack, like sand in an hourglass. A motor adjusts the battery’s angle to speed or slow the flow.
The hourglass design is like a “concept car,” Chiang says. Though the final product is likely to take a slightly different shape, the design could serve as a model for future flow batteries. Simply changing the tilt of the device could add a short infusion of power to the grid during periods of peak demand, or slowly release energy over a period of many hours to keep air conditioners and heaters running when the sun is down.
In another design, the group has replaced vanadium with sulfur, which is inexpensive and abundant. Dissolved in water (also cheap and plentiful), sulfur is circulated into and out of the battery’s stack, creating a reaction that stores or gives up energy, similar to commercial flow batteries. The group is now refining the battery, first described in 2014 in Nano Letters, aiming for higher levels of energy.
Another challenge in developing flow batteries is finding ways to keep active materials confined to their tanks. That’s the job of the battery membrane, but because the organic molecules under consideration for battery use are almost always small, they too easily slip through the membrane, reducing the battery’s lifetime and performance. Rather than change the membrane, a group led by chemist Joaquín Rodríguez-López of the University of Illinois at Urbana-Champaign devised ways to bulk up the battery’s active materials by changing their size or configuration. The scientists linked tens to millions of active molecules together to create large, ringed structures, long strings of molecules hooked onto a polymer backbone, or suspensions of polymers containing up to a billion molecules, they reported in the Journal of the American Chemical Society in 2014.
With the oversized molecules, even “simple, inexpensive porous membranes are effective at preventing crossover,” Crabtree says. A prototype flow battery that provides low-cost power and lasts 20 to 30 years is expected to be completed in the coming year.
Getting air

Looking beyond 2017, scientists envision a new generation of batteries made of low-cost, or even no-cost, materials. The lithium-air battery, still in early development, uses oxygen sucked in from the atmosphere to drive the chemical reaction that produces electricity. In the process, oxygen combines with lithium ions to form a solid compound (lithium peroxide). During charging, the oxygen bound in that solid reverts to its gaseous form.
“Lithium-air potentially offers the highest energy density possible,” says MIT materials engineer Ju Li. “You basically react lithium metal with oxygen in air, and in principle you get as much useful energy as gasoline.”
Lithium-air has problems, though. The batteries are hard to recharge, losing much of their power during the process. And the chemical reaction that powers the battery generates heat, cutting the battery’s energy-storage capacity and life span.
Using electron microscopy to study the reaction products of a lithium-air prototype, Li and his group came up with a possible solution: Keep oxygen in a solid form sealed within the battery to prevent the oxygen from forming a gas. By encasing oxygen and lithium in tiny glasslike particles, the scientists created a fully sealed battery. The new strategy, published July 25 online in Nature Energy, curbed energy loss during recharging and prevented heat buildup.
“If it works on a large scale, we would have an electrical vehicle that’s competitive with gasoline-driven cars,” Li says. Reaching that goal would be a big step toward a greener planet.
Sometimes, the improbable happens. The stock market crashes. A big earthquake shakes a city. A nuclear power plant has a meltdown. These seemingly unpredictable, rare incidents — dubbed black swan events — may be unlikely to happen on any specific day, but they do occur. And even though they may be rare, we take precautions. A smart investor balances their portfolio. A California homeowner stores an earthquake preparedness kit in the closet. A power plant designer builds in layers of safeguards.
Conservation managers should be doing the same thing, scientists warn. Black swan events happen among animals, too, and they rarely have positive effects, a new study finds.
How often do black swan events impact animals? To find out, Sean Anderson of the University of Washington in Seattle and colleagues looked at data for 609 populations of birds, mammals and insects. Often, the data were noisy; there could be lots of ups and downs in population sizes, not always with good explanations for what happened. But, Anderson notes, “it turns out that there are plenty of black swan events that are so extreme that we can easily detect them with available data.”
The researchers looked for upswings or declines that were so big they would be observed only once every 10,000 years. For example, the team found a population of gray herons in England that experienced large die-offs in the 1920s, ’40s and ’60s. Those declines were well outside the normal ups and downs found in the population. Harsh winters meant limited food availability for the birds, and the population crashed several times. “The last event actually involved population crashes two years in a row, and it took three times longer for the population to recover than expected,” Anderson notes.
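The detection logic can be sketched in a few lines. This toy version flags year-to-year changes so extreme they would be expected less than once in 10,000 observations; the actual study fit more careful heavy-tailed time-series models, so treat this purely as an illustration:

```python
# A toy version of the detection idea: model a population's year-to-year
# growth rates, then flag changes so extreme they'd be expected less than
# once in 10,000 observations under the fitted distribution.

import statistics
from math import erf, sqrt

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """Cumulative probability of a normal distribution at x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def flag_black_swans(growth_rates, threshold=1e-4):
    """Return (year, rate) pairs far out in either tail of the fitted normal."""
    mu = statistics.mean(growth_rates)
    sigma = statistics.stdev(growth_rates)
    flagged = []
    for year, rate in enumerate(growth_rates):
        p = normal_cdf(rate, mu, sigma)
        if min(p, 1 - p) < threshold:   # extreme in either direction
            flagged.append((year, rate))
    return flagged

# Caveat: in practice the baseline should be fit robustly (or with a
# heavy-tailed distribution, as in the study), since the very crashes
# you are hunting will otherwise inflate sigma and hide themselves.
```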
About 4 percent of the populations in the study experienced a black swan event, the team reports in the March 21 Proceedings of the National Academy of Sciences. And this was usually a sharp decline in population size. That’s because there are limits to how fast a population can grow — organisms can only have so many babies per year. But there are no limits to how fast the members of a group can die.
Whether that 4 percent figure holds for the entire animal world is hard to tell. But it does confirm that these events do happen and that they are rarely good. And, given the nature of the events that disrupted populations, it’s possible they may become more common due to climate change.
“We found that most black swan events were caused by things like extreme climate or disease, and often an unexpected combination of factors,” Anderson says. Climate change is expected to increase the frequency and magnitude of events such as heat waves and drought. “We may observe more black swan events in animal populations in the future because of these climate extremes,” he says. Conservationists can’t predict when such events will happen, but there are ways to minimize their impact when they do, Anderson and colleagues suggest. For animals, this could mean making sure a population doesn’t get so small that something like a disease outbreak — such as the one that happened with saiga antelope in 2015 — or a really bad winter results in extinction.