Excerpted from “An Elegant Defense: The Extraordinary New Science of the Immune System,” published on Tuesday by William Morrow.
Should you pick your nose?
Don’t laugh. Scientifically, it’s an interesting question.
Should your children pick their noses? Should your children eat dirt? Maybe: Your body needs to know what immune challenges lurk in the immediate environment.
Should you use antibacterial soap or hand sanitizers? No. Are we taking too many antibiotics? Yes.
“I tell people, when they drop food on the floor, please pick it up and eat it,” said Dr. Meg Lemon, a dermatologist in Denver who treats people with allergies and autoimmune disorders.
“Get rid of the antibacterial soap. Immunize! If a new vaccine comes out, run and get it. I immunized the living hell out of my children. And it’s O.K. if they eat dirt.”
Dr. Lemon’s prescription for a better immune system doesn’t end there. “You should not only pick your nose, you should eat it,” she said.
She’s referring, with a facetious touch, to the fact that our immune system can become disrupted if it doesn’t have regular interactions with the natural world.
“Our immune system needs a job,” Dr. Lemon said. “We evolved over millions of years to have our immune systems under constant assault. Now they don’t have anything to do.”
She isn’t alone. Leading physicians and immunologists are reconsidering the antiseptic, at times hysterical, ways in which we interact with our environment.
Why? Let us turn to 19th-century London.
The British Journal of Homeopathy, volume 29, published in 1872, included a startlingly prescient observation: “Hay fever is said to be an aristocratic disease, and there can be no doubt that, if it is not almost wholly confined to the upper classes of society, it is rarely, if ever, met with but among the educated.”
Hay fever is a catchall term for seasonal allergies to pollen and other airborne irritants. In calling it an aristocratic disease, those British scientists were on to something.
More than a century later, in November 1989, another highly influential paper on hay fever appeared, this one in BMJ. It ran less than two pages and was titled “Hay Fever, Hygiene, and Household Size.”
The author looked at the prevalence of hay fever among 17,414 children born in March 1958. Of 16 variables the scientist explored, he described as “most striking” an association between the likelihood that a child would develop hay fever and the number of his or her siblings.
It was an inverse relationship, meaning the more siblings the child had, the less likely it was that he or she would get the allergy. Not just that, but the children least likely to get allergies were ones who had older siblings.
The paper hypothesized that “allergic diseases were prevented by infection in early childhood, transmitted by unhygienic contact with older siblings, or acquired prenatally from a mother infected by contact with her older children.
“Over the past century declining family size, improvements in household amenities, and higher standards of personal cleanliness have reduced the opportunity for cross infection in young families,” the paper continued. “This may have resulted in more widespread clinical expression of atopic disease, emerging in wealthier people, as seems to have occurred for hay fever.”
This is the birth of the hygiene hypothesis. The ideas behind it have since evolved and expanded, but it provides profound insight into a challenge that human beings face in our relationship with the modern world.
Our ancestors evolved over millions of years to survive in their environments. For most of human existence, that environment was characterized by extreme challenges, like scarcity of food, or food that could carry disease, as well as unsanitary conditions and unclean water, withering weather, and so on. It was a dangerous environment, a heck of a thing to survive.
At the center of our defenses was our immune system, our most elegant defense. The system is the product of eons of evolution, as a river stone is shaped by water rushing over it and the tumbles it experiences on its journey downstream.
Late in the process, humans learned to take steps to bolster our defenses, developing all manner of customs and habits to support our survival. In this way, think of the brain — the organ that helps us develop habits and customs — as another facet of the immune system.
We used our collective brains to figure out effective behaviors. We started washing our hands and took care to avoid certain foods that experience showed could be dangerous or deadly. In some cultures, people came to avoid pork, which we now know is highly susceptible to trichinosis; in others, people banned meats that, we later learned, may carry toxic loads of E. coli and other bacteria.
Ritual washing is mentioned in Exodus, one of the earliest books in the Bible: “So they shall wash their hands and their feet, that they die not.”
Our ideas evolved, but for the most part, the immune system did not. This is not to say that it didn’t change. The immune system responds to our environment. When we encounter various threats, our defenses learn and then are much more able to deal with that threat in the future. In that way, we adapt to our environment.
We survived over tens of thousands of years. Eventually, we washed our hands, swept our floors, cooked our food, avoided certain foods altogether. We improved the hygiene of the animals we raised and slaughtered for food.
Particularly in the wealthier areas of the world, we purified our water, and developed plumbing and waste treatment plants; we isolated and killed bacteria and other germs.
The immune system’s enemies list was attenuated, largely for the good. Now, though, our bodies are proving that they cannot keep up with this change. We have created a mismatch between the immune system — one of the longest surviving and most refined balancing acts in the world — and our environment.
Thanks to all the powerful learning we’ve done as a species, we have minimized our regular interactions not just with dangerous parasites but even with the friendly bacteria and parasites that helped to teach and hone the immune system — that “trained” it. The system doesn’t encounter as many bugs when we are babies. This is not just because our homes are cleaner, but also because our families are smaller (fewer older children are bringing home the germs), our food and water are cleaner, and our milk is sterilized. Some refer to this lost interaction with all the kinds of microbes we used to meet in nature as the “old friends mechanism.”
What does the immune system do when it’s not properly trained?
It can overreact. It becomes aggrieved by things like dust mites or pollen. It develops what we call allergies, chronic immune system attacks — inflammation — in a way that is counterproductive, irritating, even dangerous.
The percentage of children in the United States with a food allergy rose 50 percent between 1997–1999 and 2009–2011, according to the Centers for Disease Control and Prevention. The jump in skin allergies was 69 percent during that period, leaving 12.5 percent of American children with eczema and other irritations.
Food and respiratory allergies rose in tandem with income level. More money, which typically correlates with higher education, has meant more risk of allergy. This may reflect differences in who reports such allergies, but it also springs from differences in environment.
These trends are seen internationally, too. Skin allergies “doubled or tripled in industrialized countries during the past three decades, affecting 15–30 percent of children and 2–10 percent of adults,” according to a paper citing research from the Journal of Allergy and Clinical Immunology.
By 2011, one in four children in Europe had an allergy, and the figure was on the rise, according to a report by the World Allergy Organization. Reinforcing the hygiene hypothesis, the report noted that migration studies have shown that children born overseas have lower levels of some types of both allergy and autoimmunity than migrants’ children born in the United States.
There are related trends in inflammatory bowel disease, lupus, rheumatic conditions and, in particular, celiac disease. The last results from the immune system’s overreaction to gluten, a protein in wheat, rye and barley. This attack, in turn, damages the walls of the small intestine.
This might sound like a food allergy, but it is different in part because of the symptoms. In an autoimmune disorder like this one, the immune system attacks not just the protein but the body’s own surrounding tissue.
Allergies can generate a more generalized response. A peanut allergy, for instance, can set off anaphylaxis, a severe reaction in which the airway swells and can close, causing suffocation.
In the case of both allergy and autoimmune disorders, though, the immune system reacts more strongly than it otherwise might, or than is healthy for the host (yeah, I’m talking about you).
This is not to say that all of these increases are due to better hygiene, a drop in childhood infection, and its association with wealth and education. There have been many changes to our environment, including new pollutants. There are absolutely genetic factors as well.
But the hygiene hypothesis — and, when it comes to allergy, the inverse relationship between industrialization and health — has held up remarkably well.
As our bodies strive for balance, Madison Avenue has made a full-court press for greater hygiene, sometimes to our detriment.
We’re fed a steady diet of hygiene-related marketing that began in the late 1800s, according to a novel study published in 2001 by the Association for Professionals in Infection Control and Epidemiology. The Columbia University scientists who did the research were trying to understand how we became so enamored of soap products.
Some highlights:
- The Sears catalog in the early 1900s heavily advertised “ammonia, Borax, and laundry and toilet soap.”
- “During the early to mid-1900s, soap manufacturing in the United States increased by 44 percent,” coinciding with “major improvements in water supply, refuse disposal and sewage systems.”
- The marketing trailed off in the 1960s and 1970s as antibiotics and vaccines were understood to be the answer to infectious agents, with less emphasis on “personal responsibility.”
- But then, starting in the late 1980s, the market for such hygiene products — home and personal — surged 81 percent. The authors cite a “return of public concern for protection against infectious disease,” and it’s hard not to think of AIDS as part of that attention. If you’re in marketing, never waste a crisis, and the messages had an impact.
- The study cites a Gallup poll from 1998 that found that 66 percent of adults worried about viruses and bacteria, and 40 percent “believed these microorganisms were becoming more widespread.” Gallup also reported that 33 percent of adults “expressed the need for antibacterial cleansers to protect the home environment,” and 26 percent believed they were needed to protect the body and skin.
They were wrong. And even doctors have been wrong.
They have vastly overprescribed antibiotics. These may be a huge boon to an immune system faced with an otherwise deadly infection. But when used without good reason, the drugs can wipe out healthy microbes in our gut and cause bacteria to develop defenses that make them even more lethal.
A scientist who led efforts at the World Health Organization to develop global policy to limit use of antibiotics told me that, philosophically, this is a lesson that runs counter to a century of marketing: We’re not safer when we try to eliminate every risk from our environment.
“We have to get away from the idea of annihilating these things in our local environment. It just plays upon a certain fear,” said the scientist, Dr. Keiji Fukuda.
Has much of our hygiene been practical, valuable, life-preserving? Yes.
Have we overcorrected? At times. Should you pick your nose? Or put another way: Might that urge to pick be part of a primitive strategy to inform your immune system about the range of microbes in your environment, give this vigilant force activity, and train your most elegant defense?
Yes. Perhaps.
In short, from a cultural standpoint, you still probably shouldn’t pick — not in public. But it is a surprisingly fair scientific question.