Most of us grew up scared of dirt. As kids, we were told to always wash our hands with soap and water after touching soil. And as adults, we’re always armed with hand sanitizer or rubbing alcohol in our purses (I’m guilty of this!). But a growing body of scientific research suggests that our dirt-free lifestyles may be a little over-sanitized, and that contact with soil may actually be good for our health.
In particular, scientists are finding that contact with soil could boost our immune systems, strengthen our gut health, and even improve our complexion! Surprised? It turns out the secret lies in reintroducing our bodies to friendly bacteria.
Let’s dig into how this works.