Buried within the bleak news about the Covid-19 pandemic have been some bright spots. Social distancing, lockdowns, and masking, which reduced viral transmission, also seem to have quenched some of the other respiratory diseases that circulate in the winter. Influenza, respiratory syncytial virus (RSV), enterovirus D68—this year, the surveillance networks that keep track of those diseases could barely find them. The national FluView map maintained by the Centers for Disease Control and Prevention, which visualizes the season’s intensity in traffic-light colors, has barely budged out of green since the fall.
This is good: One pandemic was more than enough, and preventing 38 million illnesses and 22,000 deaths—the flu’s toll in the United States during the 2019–2020 season—counts as a win. And yet, some researchers are worried. The downward trends in flu and other respiratory diseases show us how good we were at avoiding any contact we could and sanitizing like mad when we couldn’t. But they may also be a warning of unintended consequences to come. It is accepted doctrine in immunology that exposure to routine infections and common microbes early in life helps our immune systems learn what they should target and what to leave alone. Failure to get those exposures at the right time leaves the immune system overreacting to every minor insult.
It’s possible—though this is still speculation—that one of the long-term effects of this past year could be an increase in allergies and related diseases such as eczema and asthma, particularly in children who were babies and toddlers as the pandemic began. “The group that’s going to have suffered most during this whole issue of confinement is going to be the young children—toddlers, children from the first year, two years, three years of life,” says Graham Rook, a physician and emeritus professor of medical microbiology at University College London, whose work supports this possible effect. “That’s the critical period for having the right microbiota in place.”
While acting out of best intentions—the need to reduce the spread of a novel, lethal virus—we may have created a worldwide natural experiment in reducing exposure to microbes of all kinds. “Every other example in our history in which we disrupt exposure to good microbes has had unintended consequences,” says B. Brett Finlay, a microbiologist and professor at the University of British Columbia and author of the book Let Them Eat Dirt. “A child born by C-section has a 25 to 30 percent higher chance of getting obesity and diabetes because they don’t encounter the vaginal and fecal microbes of a normal birth. When you treat kids with antibiotics, they have much higher rates of obesity and asthma later in life.”
Finlay is one of 23 prominent researchers from six countries who warned in February in the Proceedings of the National Academy of Sciences about the long-term consequences to children of a hyper-hygienic, locked-down world. “They’re not going to day care, they’re not playing with the neighbors’ kids, if they were born during this time they were kicked out of the hospital early,” he says. “My guess is that five years from now we are going to see a bolus of kids with asthma and obesity who were the Covid kids.”
To understand these predictions, it’s necessary to dip into immunology, and particularly into the “hygiene hypothesis,” a concept that’s been evolving for more than 30 years. It aims to explain how our immune systems learn what they should try to combat. In its original version, the hypothesis started as a short letter to The British Medical Journal in 1989 suggesting an explanation for a well-documented rise in allergies after 1950—a period when increasing industrialization and intensified food production were supposed to be making the world more healthy, not less.
The key finding in the study, though, wasn’t that more people had allergies; that was an accepted observation already. It was who had them and who didn’t. The author, immunologist David Strachan, reported that people then in their twenties, who had been part of a huge, lifelong study of British children born in 1958, seemed less likely to have hay fever if they had grown up with older siblings. The implication was that the older sibs—who would have been leaving the house, going to school, and running around outdoors with friends when the toddlers stayed home—were exposing younger kids to something they brought home. It was a phenomenon that wouldn’t be available to an eldest or only child—people who, in this original research, had higher rates of hay fever than younger siblings did.
The possibility that early exposure to something prevented later trouble was intuitively appealing, and it led to a cascade of research associating allergies, eczema, and asthma with hygienic modern living. Many observational studies reported that allergies and asthma were less likely in people whose childhoods were spent outside cities, who were put in day care as infants, or who grew up with pets or were raised on farms—leading overall to a conclusion that messy, dirty premodern life was healthier for a growing child.
This led to a backlash—a sense that parents desperate to avoid allergies were neglecting basic cleanliness—and to a reframing of the hygiene problem. Version 2.0, formulated by Rook in 2003, proposes that the source of allergies isn’t a lack of infections, but rather deprivation of contact with environmental organisms that were our evolutionary companions over millennia. Rook called this the “old friends” hypothesis, suggesting that exposure to those organisms allowed our immune systems to learn the difference between pathogens and inoffensive fellow-travelers.
While this rethink was occurring, lab science was achieving the tools to characterize the microbiome, the films of bacteria and fungi that occupy the external and internal surfaces of everything in the world, including us. That helped recast the exposures that kids received in those observational studies—to animals, other children, dung, dander, and dust—not as infectious threats, but as opportunities to stock their microbiomes with a diverse array of organisms.
And that recognition in turn led to Version 3.0, the hygiene hypothesis as it exists now. Renamed the “disappearing microbiota” hypothesis and reformulated 10 years ago by microbiologist Stanley Falkow (who died in 2018) and physician-researcher Martin J. Blaser, this iteration proposes that our microbiomes mediate our immune systems. It also warns that our microbial diversity is becoming depleted, and thus less protective, because of the impact of antibiotics, antiseptics, and poor diets, among other threats.
That’s a quick tour of the contention that a lack of exposure—to childhood infections, environmental bacteria, and other opportunities to recharge microbial diversity—lets immune systems fall out of balance with their surroundings. It’s an idea that today is broadly accepted in pediatrics and immunology, though the surviving proponents of the various versions might disagree over details. But what does it mean for our immune systems as we emerge from combatting Covid-19? The hypothesis can’t say exactly what will happen, because so far researchers only have data on the prevalence of viral infections, not on other types of exposures. But that data is provocative.
In the southern hemisphere, where flu season overlaps the northern hemisphere’s summer, there was “virtually no influenza circulation” in 2020, according to a CDC report in September. The agency hasn’t yet published its final report on the US experience with the flu this winter, but the World Health Organization reported last month that it remained “below baseline” throughout the northern hemisphere.
There was a similar—though not so pronounced—dip for related infections. Researchers from Princeton University and the National Institutes of Health estimated in December that transmission of RSV, which typically affects babies, declined in the US by 20 percent. The US also experienced lower rates of enterovirus D68 (a respiratory infection linked in rare cases to acute flaccid myelitis, a floppy paralysis), according to a paper published in March. Similar low incidence shows up, according to researchers, in Taiwan’s national data for enterovirus and also pneumonia and flu. And according to a preprint posted in March, the winter epidemic of RSV in France began four months late—in December, although in years past it started in autumn.
France is experiencing an out-of-season RSV outbreak now, according to Jean-Sébastien Casalegno, a physician and virologist at the Institut des Agents Infectieux of the Hospices Civils in Lyon and first author on that preprint. So are Australia and South Africa. Plotting the RSV cases temporally, Casalegno says, shows that they match to schools reopening for the southern hemisphere’s current semester. In other words, school closures, masking, and distancing—broadly, what epidemiologists call “nonpharmaceutical interventions,” or NPIs—worked to keep infections down.
What that means for transmission of these diseases in the future is unclear. The Princeton group predicts that RSV, which circulates on a two-year cycle, will come roaring back. “We can be pretty sure that this next respiratory season won’t be a classical one,” says Casalegno, who leads an independent research group studying the spread of RSV. “This is such an unprecedented situation, there are not straightforward answers.”
So this is complex. Children might be at risk from losing exposure to microbes. They might also be at risk from diseases that occur out of season. And counterintuitively, they might equally be placed at some long-term risk from not having had those childhood infections at the right time. Broadly speaking, our immune system has two arms: innate and adaptive. The first is preprogrammed by inheritance, and the second learns its task after birth. Both get “tuned,” as Rook puts it, as a child encounters the world, and NPIs will have deprived both of their expected lessons. “It may have consequences,” he says. “It remains to be seen how big the effect is.”
Denise Daley, a genetic epidemiologist and associate professor at the University of British Columbia whose work has explored how virus exposure modulates immune responses, points out that our first broad impulse at the start of the pandemic—not to mention recommendations by just about every health authority—was to wash hands and sanitize surfaces. That may not have made much difference for Covid-19 transmission, which has now been shown to be primarily airborne, and not via surface spread. But it may have eliminated other exposures we need in order to build up immunity to more common diseases or to acquire helpful microbiota. “Hand sanitizer was the first thing that disappeared off the shelves,” she points out, “when previously there had been recommendations to reduce the amount of hand sanitizers that were being used because we were reducing our exposure to potentially beneficial microbes.”
Now, emphasizing again: This is speculative. The association between microbial exposures, antibiotic use, and later emergence of allergy and associated diseases is strong, demonstrated not just in observational studies that documented human lives, but in animal research done by microbiome researchers including Blaser. But none of those studies were conducted in conditions that could match the most widespread pandemic in 100 years. And in all of them, there was a time lag between when an exposure occurred and when the result—any allergy, asthma, eczema—began to show up. So it’s too soon to look at allergy data and assume that any changes in case numbers are an aftereffect of combatting Covid; if that occurs, it will happen some years down the road.
It’s also not to say that lockdowns, masking, and the other NPIs were a mistake. They were a necessary tool to slow the spread of a brand-new virus when few other tools were available. They had an array of unintended consequences: lost jobs, a devastated hospitality industry. If allergy and reactive diseases rise as a result, those would be an unintended consequence, too.
And the consequences may not only be for children. Our microbiomes are continually restocked and remodeled by the foods we eat, the conditions we live in, and the other microbes we challenge them with. Everyone who locked down or lathered up, not just young children, was deprived of some of that microbial exchange. “I think we’ll see the biggest effects in the bookends of life, early and late,” Finlay says. “I think of my parents, who have been locked down for a year in an elder-care place where even the health care workers they see are in full gowns, full masks, gloves. They don’t have grandkids and dogs running through; they’re not at family reunions kissing everyone. They’re not picking microbes up anywhere.”
As with so much in the pandemic, this microbial deprivation could fall hardest on those who were already racially or economically marginalized—and whose microbiomes may already have been depleted, leaving them with fragile immune systems even before Covid hit.
“There are parts of the world—and it’s not just true for sub-Saharan Africa, it’s true for parts of the Paris region and the United States—where people live in very crowded housing, in various kinds of food deserts, who may have lost their jobs in confinement,” says Tamara Giles-Vernick, a medical anthropologist who directs the anthropology and ecology of disease emergence unit at Institut Pasteur and was one of the PNAS authors. (The 23-person group is known as CIFAR Humans and the Microbiome from its funding source in the Canadian government.) Those stressors, she said, could not only produce the kind of effects that Finlay and Rook predict down the line; they also could pose difficulties for preventing Covid right now, by undermining the immune response those city residents ought to make to vaccines.
Paradoxically, there may have been an unintended benefit from our lack of exposure to microbes in 2020. Some of the respiratory infections that occur each winter result in bronchitis and middle ear infections. Both of those can be caused by viruses or by bacteria—but because doctors often diagnose them based on past experience rather than test results, they may prescribe antibiotics that are not needed. As a result, both diseases account for some of the vast overuse of antibiotics against viral illnesses, which those drugs cannot combat.
Whenever we use antibiotics, we court the possibility that the bacteria will adapt to protect themselves; antibiotic resistance, the sum of those adaptations, is believed to take 700,000 lives every year. So experiencing a drop in viral illnesses that might cause antibiotics to be misprescribed was significant. And the early evidence is that antibiotic prescriptions did drop. In February, CDC physician and researcher Arjun Srinivasan presented the agency’s first accounting of antibiotic use during the pandemic to a federal panel known as PACCARB, the Presidential Advisory Council on Combating Antibiotic-Resistant Bacteria.
He reported that, by pulling from multiple health care databases, the CDC discovered that antibiotic use during Covid-19 rose in hospitals, where severely ill patients might have been given the drugs to prevent them from developing pneumonia while on ventilators. But antibiotic prescribing shrank in outpatient medicine, in the doctors’ offices and urgent care centers where people seek help for conditions such as bronchitis: There were 32 percent fewer antibiotic prescriptions written last December than a year earlier. That was not a one-time artifact. Normally, antibiotic prescriptions climb month by month during the winter; there were 14 percent more prescriptions written in December 2019 than in the month before. But in December 2020, there were 7 percent fewer. Across 2020, Srinivasan said, that predictable seasonal rise was simply missing. (Similar results were published last week on dips in antibiotic use during Covid in British Columbia and South Korea.)
The benefit of doctors writing fewer antibiotic prescriptions isn’t only that there is less selective pressure on bacteria, which would encourage them to evolve toward resistance. It also makes it less likely that antibiotic use, in children and in the mothers of newborns, will damage developing microbiomes. “There was less inappropriate treatment with antibiotics,” says Blaser, the developer of the disappearing-microbiota hypothesis, who directs the Center for Advanced Biotechnology and Medicine at Rutgers University—and who happens to chair PACCARB, where he heard Srinivasan’s testimony in real time. “So it could be that the children of the world are better off, because they’re not getting those inappropriate antibiotics that will affect their microbiome at a critical early point in their life. And that actually might be true for adults, too.”
If the nonpharmaceutical interventions that controlled Covid-19 did have an effect on developing immune systems, that impact might not be visible for several years. Finlay estimates that asthma arising from a lack of microbial exposure might take five years to manifest. Rook thinks doctors should be watching for more allergies and asthma as the children of the pandemic reach school age. Monitoring for those events might require a new surveillance system, something that funnels reports from pediatricians and immunologists to national health authorities. Building something like that could provide insights into the after-effects of the pandemic. It would represent one less way in which Covid took us by surprise.