Between the mid-1990s and the mid-2000s alone, the likelihood of having a classmate with a food allergy increased by 20 per cent in the United States. In fact, over the past five decades, the incidence of all allergies and autoimmune diseases – caused by your body attacking itself – has skyrocketed. What could explain our sudden hypersensitivity to our surroundings and ourselves? Since evolution operates on the timescale of millennia, the culprits lie not in our genes but somewhere within our environment.

One thing that has changed in public health is our awareness of germs and how they spread. In response to that insight, over the past half-century our implementation of hygiene practices has spared us from debilitating infections and enormous human misery. But the new vigilance might have altered the development of our immune system, the network of cells and organs that fights infections and internal threats to our health.

The idea that too clean an environment might be harmful has been dubbed “the hygiene hypothesis”. The concept has been perverted by some to suggest that the less clean the environment, the better. But its meaning is different: it is not dirt that we are missing but exposure to certain microbes that normally contribute to the development of our immune system. “It’s not that we aren’t exposed enough to microbes but that we’re not exposed to the right types of microbes,” says the immunobiologist Ruslan Medzhitov at the Yale School of Medicine, also head of the Food Allergy Science Initiative at the Broad Institute.

So what has changed? In short, it’s the standard for what constitutes a good microbe versus a bad one. “Take bacterial species that increase nutrient absorption from food,” Medzhitov says. These were immensely beneficial at a time when people had to go days without eating. Today, in parts of the world with an overabundance of food, having such bacteria in your intestine contributes to obesity. “Microbes that cause intestinal inflammation are another example of what we call bad microbes because they induce [detrimental immune] responses. But in the past, these microbes could have protected you from intestinal pathogens,” he adds.

The most common such foes are helminths, better known as intestinal worms. While they have been eradicated across most of the economic West, their absence from our environment might be contributing to the rise in allergic and autoimmune diseases by diverting our immune system’s attention towards our food and ourselves. This is what Ken Cadwell, a microbiologist at New York University, contends. He is interested in better understanding Crohn’s disease, an autoimmune condition in which the immune system attacks the gastrointestinal tract.

Worm treatment

Cadwell’s team discovered that the bacterium Bacteroides vulgatus reduces mucus secretion in mice, a change that brings on the condition. In collaboration with the parasitologist P’ng Loke, also at NYU, Cadwell then decided to treat mice that had Crohn’s disease with helminths, which are known to trigger mucus secretion in the intestine. And it worked! The two researchers found that the ensuing establishment of Clostridiales bacteria in the gut cured the mice.

But would it work in humans? Loke collected stool samples from populations living in helminth-endemic areas in Malaysia before and after deworming treatment, and he noticed something intriguing. People from these regions had abundant Clostridiales in their gut before treatment and a lower incidence of autoimmune bowel diseases. After successful deworming, the protective bacteria disappeared, setting the stage for autoimmune illness to rear its head.

Yet treating humans with worms will not work as a public-health measure. Worm treatment benefits only those Crohn’s patients whose genetic make-up reduces mucus secretion and whose guts lack Clostridiales bacteria. For everybody else, it is just a harmful infection. Now scientists are trying to better understand how to reap the benefits of worm therapy while avoiding its downsides. “I am more in favour of an alternative that gives the same [health] effect without having to give somebody a live parasite,” says Cadwell. “That would be ideal.”

Besides the change in our exposure to microbes, our modern lives include diets much higher in sugar, salt and carbs. Now a team from Harvard and Yale has discovered that even moderate increases in salt consumption trigger activation of immune cells known as Th17 cells, which are especially prevalent in autoimmune diseases such as rheumatoid arthritis and multiple sclerosis. “Recent studies found that if you have the right [genes], your likelihood of developing rheumatoid arthritis goes up 10-fold from the average risk in the population. But if you combine that with smoking, it leads to an increased risk of 100-fold. Now a recent study also showed that if you have the right [genes], you smoke, but you also consume a high-salt diet, the risk goes up 500-fold,” says the immunologist Vijay Kuchroo at Harvard, who did the work with his colleague, the neurologist David Hafler of Yale.
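The figures Kuchroo quotes suggest that these risk factors compound multiplicatively rather than additively. A minimal sketch makes the arithmetic explicit; note that the per-factor multipliers below are back-calculated from the quoted totals (10-fold, 100-fold, 500-fold) purely for illustration, not measured values from the study:

```python
def combined_relative_risk(multipliers):
    """Multiply per-factor relative risks to get the combined risk
    versus the population baseline (risk = 1.0)."""
    risk = 1.0
    for m in multipliers:
        risk *= m
    return risk

# Illustrative multipliers implied by the quoted totals:
genes = 10.0      # susceptibility genes alone: 10-fold (quoted)
smoking = 10.0    # implied extra factor: 100 / 10
high_salt = 5.0   # implied extra factor: 500 / 100

print(combined_relative_risk([genes]))                      # 10.0
print(combined_relative_risk([genes, smoking]))             # 100.0
print(combined_relative_risk([genes, smoking, high_salt]))  # 500.0
```

The point of the multiplicative picture is that each added exposure scales the whole accumulated risk, which is why the jump from genes alone to genes plus smoking plus salt is so dramatic.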

We are forced to conclude that the explosion of allergies and autoimmune diseases results from a mismatch between genes selected by pressures of our evolutionary past and the reality of modern life. While we have adapted in the past, we might not be able to adapt again by relying on biology alone. There is no going back – the old world is gone. To curtail the increase in allergies and autoimmunity, we have to evolve our social, political and scientific regimes, providing healthier food options at lower cost and improving our understanding of the underlying immunological mechanisms at play.

The writer is a graduate student in immunology at the Iwasaki Lab at the Yale School of Medicine.

This article was first published on Aeon.