
Health Hack #43 - Stay Clean, But Not Too Clean

The Peanut Story

Surprisingly, peanuts belong to the legume family; they’re technically not nuts at all. Even so, they’ve managed to rise to the top of the nut allergy spectrum, particularly among small children. Those with peanut allergies tend to avoid nuts altogether, and peanuts are quite often mentioned in the same category as tree nuts. There’s certainly logic here, since peanuts are often processed in facilities that also handle tree nuts, creating a real possibility of cross-contamination.

Studies showed that in the mid-’90s, approximately 4 in 1,000 children under 8 years of age had a peanut allergy. That boils down to 1 in 250 children. Most preschools have far fewer than 250 children, and many elementary schools do as well, particularly private schools. In 1995, chances were good that a school for young children had no students with a peanut or other nut allergy at all.

By 2008, the same research group, using the same methods as in the mid-’90s, found that rates had more than tripled: an average of 14 in 1,000 children were allergic to peanuts. That boils down to 3.5 in 250 children, or more than 1 in 100. At this rate, the average elementary school was likely home to at least one child with the allergy.
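To make those school-level odds concrete, here’s a minimal sketch in Python (the school sizes are hypothetical, and it assumes allergy cases occur independently) of the chance that a school of a given size contains at least one peanut-allergic child at each era’s rate:

```python
# Chance that a school of n children includes at least one
# peanut-allergic child, assuming cases occur independently at rate p.
def p_at_least_one(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

rates = [("mid-'90s (4 per 1,000)", 0.004), ("2008 (14 per 1,000)", 0.014)]
for label, p in rates:
    for n in (50, 250, 500):  # hypothetical preschool/elementary sizes
        print(f"{label}, school of {n}: {p_at_least_one(p, n):.0%}")
```

Run as written, a 50-child preschool at the mid-’90s rate has roughly an 82% chance of having no allergic children at all, while a 250-child school at the 2008 rate has about a 97% chance of having at least one. That’s the shift described above.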

During this 10-to-15-year period in which peanut allergy rates in children more than tripled, measures had visibly been taken (and publicized) to shield allergic children from peanuts. As these purported protective strategies evolved, a revised narrative emerged, suggesting that peanuts were dangerous to children in general. Parents began shielding their children from peanuts in the name of keeping them safe, despite no detected allergy or increased susceptibility.

A team of researchers assembled to test the hypothesis that the very avoidance of peanuts that took hold after the mid-’90s was responsible for the increase in allergies. The theory? That by depriving a child of exposure to peanuts early in life, their immune system misses the opportunity to develop an adaptive strategy for handling peanut proteins, and instead comes to treat them as a pathogen, particularly in children already predisposed by other factors.

The study was named LEAP: Learning Early About Peanut Allergy. The parents of 640 children (all between 4 and 11 months old at the outset) participated. A key prerequisite for a child’s admittance into the study was the presence of severe eczema and/or a confirmed egg allergy, as both factors dramatically heighten the risk of developing a peanut allergy later in childhood. Half of the parents were instructed to comply with the modern wisdom and avoid exposure to peanuts altogether. The other half were required to feed their child a peanut-containing snack an average of three times per week. The research team tracked the participants closely for years, ultimately putting the hypothesis to the test once the children turned 5 years old. Drumroll...

The children permitted their routine peanut snacks showed an aggregate allergy rate of 3%. Not a particularly bad number, considering their high predispositions. The group subjected to an all-out peanut ban had a very different outcome: 17% of those children had a confirmed peanut allergy when tested at age 5. The hypothesis behind the study held up.
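For a rough sense of how stark that gap is, here’s a small Python sketch (using the rounded 3% and 17% figures quoted above rather than the study’s raw counts) computing the relative risk and absolute risk difference between the two groups:

```python
# Rounded group-level allergy rates at age 5, as quoted above.
avoidance_rate = 0.17    # children kept away from peanuts entirely
consumption_rate = 0.03  # children fed peanut snacks ~3x per week

relative_risk = avoidance_rate / consumption_rate
risk_gap = avoidance_rate - consumption_rate

print(f"Relative risk under avoidance: {relative_risk:.1f}x")  # ~5.7x
print(f"Absolute risk gap: {risk_gap:.0%}")                    # ~14%
```

In other words, by these rounded figures, the avoidance group developed peanut allergies at nearly six times the rate of the consumption group.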

What to Make of This

In the context of this article, the story behind the rise in peanut allergies has less to do with the dangers of peanuts than with the perils of a cultural shift we’ve undergone these past few decades. We previously (Be Careful What You Wash With) delved into issues with the modern-day soap industry, revealing how noxious chemicals had infiltrated our showers in the name of extending shelf life and lowering costs. That’s merely one facet of the paradigm shift surrounding modern sanitation practices. The other facet, less economically driven, is our revulsion toward germs and microbes, and our desire to protect ourselves so entirely from the pathogen world that we wind up doing more harm than good.

To pause a moment, it’s pertinent to note that COVID-19 represents a glaring exception to some of the excessive sanitation practices described in the paragraphs below. It poses a clear and present danger, actively protecting oneself from the virus is paramount to one’s health, and protective strategies often require the recurring use of sanitizers and disinfectants to maintain a proper shield. This article is not intended to dismiss CDC recommendations on sanitation with respect to COVID. Rather, it speaks primarily to life before and after the pandemic.

The Microbial World

It’s important to recognize that in many ways, we humans are a highly evolved collection of microbes interacting inside a physical framework we refer to as our bodies. Residing in our relatively small guts alone are tens of trillions of organisms, spanning more than 1,000 different species of microbes. It’s our continually evolving interaction with this microbial world that made us who we are, physiologically. Evolution has granted us the ability to use these microbes to our advantage, increasing our fitness for survival. Our bodies interact with outside microbes such as “germs” (bacteria, viruses, fungi), and in most cases that interaction increases our resilience.

Bringing COVID back into the conversation as an apt example, consider the mechanism behind the vaccine we’re told is what’s truly needed to bring this pandemic to a halt. People often think of vaccines as cures. In many ways they are, but connecting those two dots directly can easily obscure the process by which a vaccine becomes useful.

A vaccine doesn’t eradicate an existing strain of virus in our system. It’s not throwing water on a fire. Instead, it deliberately introduces a weakened or partial form of the virus it’s designed to protect us against (or, in newer approaches, instructions for building one of its proteins). The dose is mild enough to avoid sickness, while giving our immune system the chance to develop a strategy for neutralizing the intrusive microbes. Over the long term, it lets us build a learned response so that we’re adequately prepared to ward off a much greater viral dose in future encounters. This, to routine readers, will ring true as an apt use of the term hormesis: a heavy dose may wipe us out, but an appropriate dose can actually be beneficial.

When we overly deprive our inner microbial worlds of exposure, we effectively let our guard down, and our immune system takes a hit. Worse yet, when we actively kill part of our inner defense structure without valid reason, we’re fighting a war with fewer troops. These two circumstances stem from trends within this new paradigm. The first is the peanut strategy of aggressively avoiding anything that could be harmful; in doing so, we’re likely to lose some resilience. The second is the overuse of antibiotics. Antibiotics are a wonderful creation, a major contributor to the rise in average human lifespan since their advent. However, when used unnecessarily or too frequently, our long-term resilience is slashed again. While antibiotics can target and destroy a bodily disturbance like a bacterial infection, they do so at the expense of killing a slew of beneficial bacteria in the process. They should be used carefully, and at the discretion of one’s doctor. But asking your doctor friend to phone in a Z-Pack because you’ve had a faint cough for a few days is likely to cost you more than it benefits you in the long run.

Interesting Tidbits to Consider

Children born via C-section have an increased risk of developing asthma and various other allergies compared to those born naturally. Downstream, the likelihood of obesity and of developing Type 2 diabetes is also substantially higher for children born via C-section. For those scratching their heads: a natural birth exposes the baby to a whole new world of bacteria, which is partly lost during the sterile process of a C-section. In the context of evolution, we can infer that this bacterial exposure is a vital aspect of early life.

Children raised in a home with dogs have a notably lower likelihood of developing allergies. If the intuition here isn’t clear right off the bat, consider the microbial contents that enter our homes each time our dog goes outside. We’d be skeeved to the high heavens to bear-crawl down urban sidewalks on bare hands and feet, then run around in a muddy field before heading home for dinner. Most of us wouldn’t have the stomach for it, and if we did, we’d certainly scrub every inch of our bodies squeaky clean before touching a morsel of food. Otherwise we’d get sick, right? But we don’t seem to have any problem letting that cute little dog back inside without a shower. Nor do we pause at the irony of scratching our dog’s head with the same hand we’re using to shovel pistachios into our mouths.

Taking this a step further, children growing up on farms show even greater immune resilience. The more exposure at an early age, the stronger the inner armor seems to become. And for those (like myself) who grew up without a dog and without a farm, a late start doesn’t disqualify us from the race.

Stay clean, but not too clean.