Friday, December 11, 2009

Small Game in Primitive Living, Part 1: The Paiute Deadfall

Some of you may recall the post I wrote on Matt Graham, called "Talking Nutrition with a Wild Man," in which I describe his 6-month primitive living experience and explain how he attained adequate nutritional support in the wilderness of southern Utah. Most of what he ate consisted of small game: mice, squirrels, rabbits. What I didn't get into was the methods Matt used to capture these animals and the specific nutrients and calories they supplied him with. In this post I'll describe one of those methods -- my friend and yours, the figure-four Paiute deadfall. In a later post, we'll dive deeper into the nutritional content and caloric contribution of small game commonly caught by such a trap.

Mouse Pancakes and Squirrel Flapjacks

When I was 19 years old, I learned how to make my first trap from my long-time mentor and friend, Vince Pinto (who now owns and operates Raven's Way Wild Journeys). It was a simple little contraption. Two sticks, each around six inches long, and some agave fiber cordage formed the basis of a trap that I would use to procure many rodents on many primitive living trips in the years that followed. Named the figure-four Paiute deadfall after the very crafty, very omnivorous Northern Paiute natives who apparently pioneered it, this ingenious trap is hailed by modern-day abos everywhere as a reliable way to provide food in the bush. Matt Graham certainly endorses it. In fact, on his primitive trip, Matt used only two traps to supply his meat quarry: the Paiute deadfall and a spring snare. Knowing how to set a few traps really well, in his opinion, is better than knowing how to set many types of traps poorly. Quality, not quantity.

Taking into consideration all the Paiute deadfalls I've set over the years, I'd say that this trap has squashed a mouse or a squirrel 50% of the time. Set six traps, get three little critters in my stomach. Matt and other masters of this trap with plenty of "dirt time" in the wilderness probably have a much higher success rate due to intimate knowledge of their regions as well as years of refinement and experimentation.
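That 50% figure implies some simple expectations worth spelling out. Here's a quick sketch of the arithmetic, assuming (a real simplification on my part) that each trap is an independent 50/50 shot:

```python
# Expected catch and odds of a totally empty trapline, assuming each
# trap independently succeeds 50% of the time -- a simplification,
# since real trap sets are not independent coin flips.
P_SUCCESS = 0.5
N_TRAPS = 6

expected_catch = N_TRAPS * P_SUCCESS
p_nothing = (1 - P_SUCCESS) ** N_TRAPS

print(expected_catch)       # 3.0 critters on average, matching "set six, get three"
print(round(p_nothing, 3))  # 0.016 -- under a 2% chance of catching nothing at all
```

Even at a modest per-trap rate, setting several traps makes an empty night unlikely, which is part of why deadfalls are usually set in numbers.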

The Paiute deadfall's components, in all their glory (aside from the rock needed to set it), are pictured below:

Yup, that's it. A few sticks and some cordage with a "toggle" piece attached. The sticks, here made from pine, can be fashioned out of any sturdy type of wood. The uppermost stick in the photo -- the top-piece -- has cordage tied on and a notch carved into it for reasons you'll understand in a little bit. The other stick -- the foundation piece -- is carved to a flat point to fit nicely into the top-piece's notch; its two stubby "legs" give it more stability when setting the trap. The cordage is made from artificial sinew, but any strong natural fibers will do. When I lived in the Sonoran desert, I was a big fan of yucca and agave fibers. I'll leave natural cordage-making skills for another time, though.

The dimensions pictured are by no means the only way to go about making this trap, but they seem to work well for me. All of the fancy carvings on the sticks aren't really necessary, but I find them helpful in allowing me to adjust the length of cord on-the-fly by wrapping it around the top-piece, as well as securing the cord better on the bottom foundation piece. Below is what the trap looks like once it's set (here, you can see more clearly how I used the carvings):

You'll notice that, to complete the trap, I had to (1) find a rock with a nice flat bottom and wide base and (2) obtain a long, thin bait stick to thread underneath the rock. Obviously, if I were actually setting this trap to procure an animal, I'd have speared some bait onto the bait stick before setting, such as local wild plants that a small animal might like -- maybe pinyon pine nuts, seed heads of various grasses -- or food I might have with me as trail snacks, such as raisins or peanut butter. Mice seem to really love raisins. Although no rodents came along after I set it, to my good fortune (and perhaps to the benefit of the more squeamish readers of this blog), this particular trap attracted a wild desert tangerine!

Notice the rocks I stacked on top of the main rock after setting the trap. This increases the weight and, thus, the crushing force and speed of the deadfall. It also allows me to evaluate the stability of the trap. Now let's trigger the trap, lift the rock, and see what happens to the unsuspecting tangerine:

Ouch! As you can see, when the bait stick is tugged at or nudged in any way, the trap is triggered and the rock falls abruptly, leaving very little time for the animal (or, in this case, fruit) to escape. The main components of the Paiute deadfall spring neatly into the air and out of the way of the rock, allowing the rock to lie perfectly flush with the ground (or, ideally, a hard, flat rock underneath) -- only the thin bait stick, along with the bait, actually receives the impact. The unlucky creature who happened to trigger the trap is instantly crushed to death. Squish! Mouse pancakes!

More Resources For the Paiute Deadfall

For brevity's sake, I'm going to leave a more detailed discussion of this trap to Jim Riggs, one of the great influences of the primitive skills movement and a man with a lot of "dirt time." His article "Rocking On with the Paiute Deadfall" is by far the most thorough and well-written piece I have seen on this subject. Those of you out there in cyberland who want to experiment with a Paiute deadfall will benefit greatly from Mr. Riggs' description of the trap.

Also, a very good explanation of the Paiute deadfall is given in this video. I've never met the man who made this tutorial (Mark Lummio of Bushcraft Northwest), but he seems to really know what he's talking about. Absolutely fantastic video.

In the next post, I'll get into the nutritional and caloric details of small game animals that a person might find underneath his or her deadfall rocks day to day; and I'll evaluate the realities inherent in living off of such fare as Matt Graham did: processing, cooking, and eating trapped animals to thrive -- not just survive -- in the wilderness primitively.

Thanks to my good pal, Jeff Macdonald, for helping with the pictures.

Monday, November 16, 2009

Roadkill Mule Deer: $417

Out here in rural Utah there really isn't much traffic. More than five cars driving through town at the same time feels like rush hour, and fender-benders are rare. There isn't a stop light for at least 50 miles. Yet, safe as this sounds for drivers on these country roads, there are still dangers lurking behind every juniper tree, potential disasters waiting in the sagebrush. Deer, elk -- even rabbits -- are hazardous highway threats and can leap out into the road at any mile marker. The damage they can cause to a vehicle -- and the person inside that vehicle -- can be monumental. Anything from a smashed grill or broken headlights to a complete totaling is possible, along with injuries to driver and passengers if the deer or elk happens to blast through the windshield.

So I was glad to receive a phone call and hear that my friends, Danny and Gretchen, were safe and sound after hitting and killing a two-point mule deer buck near the town cemetery. I was also quite excited to hear that they intended to call the game warden and take the deer home with them to butcher -- and that they needed my help to load the animal into their pick-up truck. "We'll give you some of the meat, if you're interested," Gretchen said. Hmm. Fresh venison, bones for soup, maybe some liver -- sounds good to me!

So off I went with my girlfriend at 9:30 p.m. to do a little "gathering." I can't really say that there was any hunting involved, as all we did was pick up a very dead deer. After loading it into Gretchen and Danny's truck, we agreed to meet up the next day to butcher.

On a beautiful Sunday afternoon in the front yard of a friend's house, we began the fascinating process that human beings have engaged in for hundreds of thousands of years before us -- except we used modern technology to get the job done quicker. Danny tied one end of a rope to the hitch of his pick-up truck and the other end over a sturdy tree branch and around the buck's neck, almost like a noose readied for a hanging. Hopping in his truck and pulling forward, the 200+ lb. animal magically rose into the air and was suspended at the perfect butchering height. Now that's country.

Danny then quickly and efficiently removed the hide of the deer by pulling it back and cutting the fine, sheath-like material just underneath. Once it was freed, Gretchen decided to work the hide and prepare it for tanning. Next, Danny hacksawed off the legs above the knee joints and began quartering the animal, procuring all the choice cuts: brisket, loins, backstrap, and plenty of meat for roasts. He had already thrown out the liver, heart, and other organ meats as they were all damaged and bloody. This particular buck was fairly lean, which makes sense given that it was a lower elevation-dwelling mule deer in October. Deer and elk in higher elevations would certainly be fatter this time of year as they pile on extra stores for the cold winter months.

Danny explained that much of the meat on the buck was questionable for human consumption due to the prevalence of CWD -- Chronic Wasting Disease -- in Utah deer populations. The risk of transmission of this prion disease (similar to the infamous "mad cow disease") to humans is thought to be very low, if it occurs at all. But we weren't willing to take the risk, so any meat stained with blood from the spine was discarded (CWD is a central nervous system disease). And we sure weren't about to eat deer brains either, so the head was also tossed.

In the end, after quartering and butchering the buck, we attained 70-80 lbs. of roasts, steaks, and tough meat for burger grind. My friend at the local meat locker was kind enough to blend the grind meat with some grass-fed beef fat he had on hand, which made for some amazing hamburgers. We also hacked up the bones, rich with marrow, for broth and soup.

Total processing costs: $17. Estimated damage to Danny and Gretchen's vehicle: $400.

Roadkill Mule Deer: $417. A true country delicacy.

Wednesday, October 28, 2009

Tribal Fattening Practices

While in America "thin is in," in some cultures around the world "fat is where it's at." One such culture can be found in the desert-streaked country of Mauritania, located in West Africa. Here, the true marker of beauty and health in a woman is the number of rolls she has. But there's one problem: human beings eating normal amounts of natural foods don't get obese and overweight, and the common foods available in West Africa include raw goat's milk, meat, millet, couscous, dates, peanuts, and other whole foods. While most of us in the Western world can easily become fat through years of eating fattening, unnatural, metabolism-altering foods like high-fructose corn syrup, trans-fats, and high-gluten white flour, the Mauritanian people don't have such "luxuries" -- so they resort to good old-fashioned force-feeding to accomplish the task.

The Mauritanian fattening practice, called leblouh, takes place when young women enter a tiny sandstone hut. Inside resides an old woman, the "fattener," whose primary job in the community is to make sure these young women (sometimes beginning as young as 5 years old) become plump and, thus, attractive and suitable for marriage.

Obvious moral and ethical implications of this practice aside, I thought it would be interesting to find out just how much food is utilized to accomplish the fattening. I was surprised to find out that these women typically are force-fed -- to the point of nausea and vomiting at times -- a whole-foods diet of up to 16,000 calories per day. This includes four meals per day of:

...crushed dates and peanuts with couscous and oil ... cloying, egg-size balls of around 300 calories apiece. Each girl eats about 40 per day, along with 12 pints of goat's milk and gruel ... (Source)
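The figures quoted actually add up to the 16,000-calorie mark. Here's a back-of-the-envelope tally; the calorie value for goat's milk is my own rough estimate, not from the article:

```python
# Rough tally of the reported leblouh daily intake.
# Ball count, per-ball calories, and pints of milk are from the article;
# the 340 kcal/pint figure for goat's milk is my own estimate
# (~170 kcal per cup, 2 cups per pint).
BALL_KCAL = 300
BALLS_PER_DAY = 40
MILK_PINTS = 12
MILK_KCAL_PER_PINT = 340  # assumed, not from the source

total = BALLS_PER_DAY * BALL_KCAL + MILK_PINTS * MILK_KCAL_PER_PINT
print(total)  # 16080 -- right around the 16,000 calories cited
```

The date-and-peanut balls alone account for 12,000 of those calories; the milk plausibly supplies most of the rest.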

To bolster the fattening process, the women also must not get any exercise whatsoever, remaining in the huts for several years until they are married off. Additionally, because the task of eating such inordinate amounts of food is so physically challenging to the young women, the old woman "fattener" threatens to beat them if they refuse to eat.

Because the Mauritanian women are limited to traditional foods, which lead to satiety rather quickly due to high nutrient content and are difficult to overeat, some have sought out methods to increase their appetite unnaturally to be able to gain those extra pounds of beauty. One such method is the purchase of certain pharmaceuticals:

Sold secretly at city markets, they include hormones used to fatten camels and chickens, and steroids for asthma and cancer ... (Source)

The difficulty inherent in these traditional peoples' ability to gain weight while eating whole foods challenges the notion, once again, that carbohydrates lead to obesity. Here we have a culture whose only way of fattening young women is by force-feeding them massive amounts of proteins, carbohydrates, and fats. If to fatten the young women it was only necessary to emphasize carbohydrates in the diet, as Gary Taubes and other low-carb proponents might suggest, then why must the women be forced to eat excessive amounts food to become overweight? Why not just eat millet and couscous and dates? Many modernized folks seem to have no trouble at all gaining unneeded weight while eating far less than 16,000 calories. Yet these Mauritanian women must resort to appetite increasing drugs or the threat of a beating while eating about that much food to do the same:

Although hardly skeletal at 5'6" and 180 pounds, Hawer [a 26-year-old Mauritanian woman] says she has trouble piling on weight, and was teased by plumper girls as a teenager. Recently, her husband told her that he "didn't like sleeping with a bag of bones." Desperate to be bigger, Hawer uses drugs to aid weight gain. (Source)

It's the quality of food that's the difference: traditional versus modern food. High-fructose corn syrup, one of the great fatteners of the industrialized nations, would be a prized commodity in Mauritania.

One other interesting observation is that the older women in the culture, who have already gone through the fattening process during their younger years and have resumed eating a normal amount of traditional foods, appear to be at a healthy weight. Did they diet to lose their weight? I doubt it. Below is a picture of women who are campaigning against the practice of leblouh. All have gone through the leblouh in their youth, and none of them remain overweight:

Sunday, October 4, 2009

The People of the Deer

Lately I've been enjoying an anthropological narrative called People of the Deer by Farley Mowat -- the famed and tenacious environmentalist, humanitarian, and defender of true scientific inquiry. In this colorful true-story adventure, first published in 1952, Mowat finds himself drawn, as he so often does, to a place far away in the middle of nowhere in the deep northern territories of Canada. It is here that he befriends and lives among an Eskimo group called the Ilhalmiut and begins to understand how modern encroachment -- namely fur-trapping and government policy -- is negatively affecting the native peoples' ability to live in a place where their ancestors had thrived for thousands of years before them. Mowat writes, quite bluntly, in the foreword of the 1975 edition of the book: "Genocide can be practiced in a variety of ways." Similar to Weston Price, he is not hesitant to place blame on Western culture for the decimation and struggle of the traditional peoples with whom he became intimately acquainted. From the foreword:

We have long prided ourselves on being a democratic nation, dedicated to the altar of freedom. Freedom for whom? If it is only freedom for ourselves to do as we please at the expense of others, then our pious stance is even more abhorrent than that of any overt tyrant -- for ours is based on a vile hypocrisy.

Fat and Deer Hairs

While the book contains many fascinating tidbits, among the most intriguing are Mowat's detailed descriptions of the traditional Ilhalmiut diet and their shifting health as a result of Western influence. When he first arrives at the small settlement of Ilhalmiut, the author is welcomed with a tray of meat that might make any Westerner's stomach churn:

Half a dozen parboiled legs of deer were spread out in a thick gravy which seemed to be composed of equal parts of fat and deer hairs. Bobbing about in the debris were a dozen tongues and, like a cage holding the lesser cuts of meats, there was an entire rib basket of a deer.

Still hungry? There's more!

There were side dishes too ... a skin sack, full of flakes of dry meat ... a smoking bundle of marrow bones ... neatly cracked so that we would have no trouble extracting the succulent marrow. (p. 82)


The cooking varied somewhat, but the food did not. The rule was meat at every meal and nothing else but meat, unless you could count a few well-rotted duck eggs which served as appetizers. To satisfy my curiosity I tried to estimate the quantity of meat Hekwaw [a member of the tribe] put away each day. I discovered he could handle ten to fifteen pounds when he was really hungry... (p. 85)

It doesn't take Mowat long to identify the key ingredient of the Ilhalmiut diet: fat. From his own experience subsisting on lean meat for an extended period of time, he describes the vast importance of fat in an all-meat diet through his battle with an affliction which he names, for want of a better term, mal de caribou, also known by a great many arctic explorers, prisoners of war, and human carnivores as rabbit starvation:

... persistent diarrhea was only part of the effect of mal de caribou. I was [also] filled with a sick lassitude, an increasing loss of will to work that made me quite useless ...

Mowat's guide -- a half-Eskimo, half-white man named Franz -- prepared and administered a peculiar remedy:

... he took out a half-pound of precious lard, melted it in a frying pan, and, when it was lukewarm and not yet congealed, he ordered me to drink it. Strangely, I was greedy for it ... I drank a lot of it, then went to bed; and by morning I was completely recovered ... I was suffering from a deficiency of fat and did not realize it. (p. 88)

Death and Disease Among the Ilhalmiut

Concerning the health of the Ilhalmiut people, Mowat goes into extensive historic, anecdotal, and statistical detail while attempting to get at the root of the Northern natives' plight of disease and illness following the arrival of Western culture. It's no secret to those who have studied the writings and theories of nutritional heroes such as Weston Price, Sir Robert McCarrison, T.L. Cleave, and others that when modern foods such as white flour and sugar are introduced to a traditional culture, ill health follows, worsening from generation to generation. Farley Mowat joins the ranks of these great independent thinkers when he waxes sensible, explaining his own theory as to why the people of the far North and other native peoples in history have succumbed to tuberculosis, measles, and smallpox:

Perhaps you have heard of the decimation of the forest Indians brought about by disease, by lack of adaptability, by inherent laziness and indolence or by other causes ... you have never heard the truth, for all of these apparent causes are manifestations of the real destroyer, which is -- starvation. If you ask about the thousands of Indians and Eskimos who die each year of tuberculosis, if you ask about the measles and smallpox epidemics which ... have destroyed over one-tenth of the Northern natives ... these people too die of starvation ... (p. 91)

Is it just me, or is Mr. Mowat on to something here? He goes on to tell the story of a people he lived with in the winter of 1948, the Idthen Eldeli -- literally meaning "Eaters of the Deer":

In 1860 ... there were about 2000 members of the Idthen ... when the deer moved ... the Idthen people followed after ... [they] annually traveled over a thousand miles through the Barrens.

In the eighteenth century the famous explorer Samuel Hearne journeyed ... with a band of these Indians and he speaks, as do many others, of the almost superhuman endurance and physical capacity of the Idthen people.

In the winter of 1948 when I lived with the Idthen ... they numbered a little over 150 men, women, and children who spent the winters on their scanty trap lines, starving through the cold months until they could fish for life along the opening rivers ... They are a passive, beaten, hopeless people who wait miserably for death. (p. 92)

What could be the instigator of such an unfortunate circumstance? Ol' Farley doesn't mince words:

Starvation first came to them when they began to subsist on a winter diet which now consists of 80 percent white flour, with a very little lard and baking powder, and in summer almost nothing but straight fish. The Idthen people now get little of the red meat and white fat of the deer, once their sole food. Three generations have been born and lived -- or died -- upon a diet of flour bannocks and fish eaten three times a day and washed down with tea. Each of these generations has been weaker and had less "immunity" to disease than the last. (p. 93)

Government aid: giving natives the short end of the stick in America since 1492. It's interesting how what Mowat refers to as starvation can also be seen as a displacement of native foods, as Weston Price pointed out in the 1930s. Either way, the result is lowered immunity and degeneration. Mowat's solution for the dilemma of this "starvation"? Here it is, in characteristic common sense:

Surely there is but one way to cure a man of the diseases which are the products of three generations of starvation, and that is to feed him. (p. 95)

Let them eat meat and fat!

Thursday, September 17, 2009

The Ambiguity of Scientific Research

It always boggles my mind how, in so much of the health research focusing on diet that we see today, there is little to no emphasis on the types of foods eaten by participants in the studies. One case in point is a study that Stephan Guyenet recently blogged about, called "Effect of a High-Protein, Low-Carbohydrate Diet on Blood Glucose Control in People With Type 2 Diabetes." This study is a typical example of diabetics who go on a low-carbohydrate diet and experience positive results in moderating blood glucose levels. The participants were split into two groups -- one high-carb, the other low-carb. The authors describe the diets as follows (emphasis mine):

The control (15% protein) diet was designed according to the recommendations of the American Heart Association and the U.S. Department of Agriculture. The diet consisted of 55% carbohydrate, with an emphasis on starch-containing foods, 15% protein, and 30% fat (10% monounsaturated, 10% polyunsaturated, and 10% saturated fatty acid). A second diet was designed to consist of 20% carbohydrate, 30% protein, and 50% fat. The saturated fatty acid content of the test diet was ∼10% of total food energy; thus, the majority of the fat was mono- and polyunsaturated.

The authors also describe their low-carb diet as a diet "in which readily digestible starch-containing foods have been de-emphasized."

Okay. Is it just me, or are studies like this almost completely worthless? Yes, it's fascinating that when carbohydrates are lowered in the diet, blood sugar levels normalize -- this is a very consistent finding throughout the scientific world. But why? Why does this happen? What is the mechanism behind it? Is it simply a reduction in carbohydrates that makes the difference? Or could it be a reduction in harmful foods like sugar that does the trick?

In other words, the study is setting itself up for ambiguity. Do the results mean that everyone who is near-diabetic should immediately "cut the carbs"? Or should they "de-emphasize" starch-containing foods? Or should they reduce sugar consumption? What is the real problem here?

Well, let's see. Maybe -- just maybe -- we can determine the composition of the diet and dive deeper into these questions. Let's take a look at a nifty table from the study:

Not real helpful, is it? Now we know that the participants on the high-carb diet ate 274 grams as "starch" and the rest (114 grams) as sugars of some kind. That tells us absolutely nothing about what foods the participants actually ate. For all we know, they could be eating nothing but waffles and sodas for carbs!
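For what it's worth, those grams do let us back-calculate roughly how much the high-carb participants were eating overall. Percent-of-energy targets convert to grams via the standard 4 kcal/g factor for carbohydrate; this little sketch is my own arithmetic, not the study authors':

```python
# Back-calculate implied total daily energy intake from the study's
# reported carbohydrate grams for the high-carb (55% carbohydrate) group.
# Uses the standard Atwater factor of 4 kcal per gram of carbohydrate.
# This arithmetic is mine, not the study authors'.
STARCH_G = 274        # grams reported as "starch"
SUGARS_G = 114        # grams reported as sugars
CARB_FRACTION = 0.55  # carbohydrate as fraction of total energy
KCAL_PER_G_CARB = 4

carb_kcal = (STARCH_G + SUGARS_G) * KCAL_PER_G_CARB
total_kcal = carb_kcal / CARB_FRACTION
print(round(total_kcal))  # 2822 -- implied total kcal/day
```

So the macronutrient table pins down roughly how many calories were eaten, but still says nothing about which foods supplied them, which is exactly the problem.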

Now, the reason this frustrates me is because certain foods -- namely wheat, trans fats, and sugar/high-fructose corn syrup -- can have profound effects on human metabolism in and of themselves. For example, Dr. William Davis of The Heart Scan Blog recently had this to say about the metabolic effects of wheat (emphasis mine):

A patient would come to the office ... with a blood sugar of 118 mg/dl (in the pre-diabetic range) and the other phenomena of pre-diabetes or metabolic syndrome (high blood pressure, high inflammation/c-reactive protein, low HDL, high triglycerides, small LDL), and the characteristic wheat belly. Eliminate wheat and, within three months, they lose 30 lbs, blood sugar drops to normal, blood pressure drops, triglycerides drop by several hundred milligrams, HDL goes up, small LDL plummets, c-reactive protein drops.

As for trans fats, check out what the authors of this study (done on rats) conclude:

In this study, we observed profound metabolic responses to a low-fat diet enriched with trans-fatty acids that were associated with hyperphagia, increased hepatic and visceral fatness, and diminished whole-body glucose disposal, all hallmarks of metabolic syndrome.

(This particular study, it should be noted, has its own flaws in terms of isolating factors, but the stuff on trans fats seems pretty solid.)

And then there's refined fructose-containing foods, such as high-fructose corn syrup and sugar -- don't get me started!

So would somebody please explain to me why these harmful foods aren't taken into consideration in so much of the mainstream dietary research out there (such as the low-carb study that I began this rant with)? It would seem to me that, due to the significant metabolic effects of these foods (vegetable oil, I haven't forgotten about you!), any dietary study not detailing complete food logs is not even worth a glance. It's great that reducing carbohydrates has a positive effect on the health of diabetics, but would simply reducing wheat or sugar or trans fats have the same effect? If there's a study out there that dives into this Molotov cocktail of franken-foods, I'd love to see it.

Monday, September 14, 2009

My Health Profile (part 3): The Turnaround

At a friend's potluck in Tucson, I said "what-the-heck" and ate a palm-sized portion of New Zealand grass-fed lamb. After all, the meat seemed ethical, and my friend -- whom I respected as a morally responsible, spiritually-savvy person -- was enjoying the meat, too. After a few hours, I found myself asking an attractive woman for her phone number. Something was definitely different. My 2.5-year vegetarian streak was over.

The next morning, I woke up with muscles where I hadn't felt muscles in years. My head felt crisp and clear. It was the first time in years that I felt genuinely excited about the day ahead. A gratifying, "Ahhhhh ... " came out of my mouth. That's when I decided that I had found an answer.

A few days later, I was visiting that same friend who fed me my first tasty morsel of meat in over two-and-a-half years, and it just so happened that he had a very intriguing and pertinent book on his bookshelf that I was drawn to: Nourishing Traditions by Sally Fallon. I borrowed the book and devoured the information whole like a wolf scarfing down a fresh post-famine kill. I had an intuitive hunch before diving into this book that animal foods were a necessary part of the diet -- after all, I'd felt much better after eating some meat and butter -- but Ms. Fallon, bless her heart, provided me with the reason behind this vague feeling and assisted me in further understanding the whys and wherefores.

Now I was on a mission to rebuild my body and my life with nutrient-dense foods. I tried eating meat more often and didn't deny myself Thanksgiving turkey or Christmas ham. My first "meat-fest" trials ended in pain and agony, as my body had forgotten just how to digest the rich proteins and fats. For a few weeks I had horrible indigestion, headaches, and a heavy feeling that permeated my entire body. But I was determined to feed myself and get through the adjustment period. Researching information on the internet, I found that the body can take weeks to months to rev up its digestive juices for meat after going without it for a long time. This is probably why vegetarians often say, "I tried eating meat again -- I felt horrible!" After about a month, I was beginning to feel stronger and lighter in my body. After a few more months I was back to my ideal weight and body composition, my facial hair grew in thicker and more evenly, and my libido was definitely back. And I was genuinely happy and outgoing -- a big change from my low-energy, slightly-depressed vegetarian days.

Nowadays, I feel grateful and blessed to have pulled myself out of the vegetarian abyss that seems to suck so many people in. Many intelligent, environmentally sensitive, and/or health-driven individuals fall far into this black hole of nutrition and can't get out. My hope is that by sharing my story and disseminating nutrition and health information based on evolution, history, traditional cultures, personal experience, and modern-day science, I can influence others to change their bodies -- and their lives -- for the better.

Tuesday, September 8, 2009

My Health Profile (part 2): Seeking Wellness

After summer was over and my muscles had shrunken significantly, I decided college wasn't for me and rejected a scholarship to the University of Arizona, resolving to fulfill philosophical fancies I'd had since age sixteen to live in the wilderness and learn how to survive with nothing and need nobody.

I ended up in central Arizona as a farm intern at the Reevis Mountain School of Self-Reliance, a living, working homestead eight miles deep into the Superstition Wilderness. It was here that my foray into alternative health and healing began (and my muscles continued to shrink). Despite the fact that the founder of the school, Peter Bigfoot, was a former vegetarian of 30+ years -- fully fruitarian for one of those years -- and was unabashedly eating plenty of meat when I arrived, somehow (possibly from the media and word of mouth) I got the bright idea that vegetarianism was the healthiest diet to consume. Now for the downward spiral.

As I got deeper and deeper into wilderness survival following my time at Reevis, I also got deeper and deeper into simple vegetarian staples: amaranth, quinoa, oatmeal, peanut butter, raisins, beans, huge salads with olive oil, and other "healthy" whole foods. I also got more and more interested in restricting my food intake -- maybe someday I would have to eat so little that I could survive in the mountains all by myself and be a hermit! Wouldn't that be nice? Oh, to be 19 again.

The skinnier I got, the healthier I thought I was becoming. Anyone that ate the typical American diet became a glutton and destroyer of the earth in my eyes. After all, it was the problem of over-consumption that was bringing the planet to an early demise, and food was one of those products that was almost certainly abused and taken for granted. So I was going to be better than that. Yes, I was going to be a low-calorie vegetarian, save myself and save the planet.

Now, not only was I a vegetarian for health and survivalist reasons, but I also had the entire world's suffering behind me to rationalize my choice. I ate less and less. I fasted. I dumpster-dived. I ate wild edibles and garden veggies. I harvested citrus in Tucson over the walls of neighbors' yards. I learned to survive in a brutal, unforgiving, and unethical world. I felt empowered, independent, free. Yeah, I weighed 155 pounds and looked gaunt and sickly -- so what? I was healthy! Wasn't I?

It took about 3 years of that behavior -- that way of relating to myself and the world -- for me to finally give meat a try again. I was at my body's breaking point. I felt dizzy when I stood up, fatigued and weak. Yoga two or three times a day was all that seemed to keep my limbs, joints, and muscles feeling relatively pain-free. My lower back was worn and aching constantly. Anything physical became a chore. My libido was completely shot -- I hadn't thought about being with a woman for years. Then came the miracle.

Friday, September 4, 2009

My Health Profile (part 1): The Formative Years

Health awareness -- and a desire to be in the best health I could be -- began at an early age for me. I can recall being six or seven years old and eating the crust on my bread -- not because I liked it, but because I was told it was good for me. I would choke down green peas or iceberg lettuce in order to satisfy the arbitrary requirement for something "green" with dinner. Last at the dinner table, I sat slowly chewing gristly, lean meat until every morsel was gone. Then, and only then, could I indulge in some ice cream.

A craving for real food seemed to permeate my childhood. Lean, well-cooked meat, cereal, 2% milk, enriched wheat bread and pasta, and the occasional cookie (or two or three) didn't seem to satisfy this craving. I often found myself nibbling on margarine when clearing the dinner table, or spreading some other butter substitute on bread so thick that it would leave teeth marks. Naturally, I desired something fatty and rich in nutrients, but since no such thing was available (besides cheese), I went for the trans-fat-laden, unreal goop that was as close to real butter as I could find in the refrigerator.

All that being said, I'd like to believe that I ate better than most kids growing up in America in the 80s and 90s. Or maybe I just ate less junk food than most kids. It seemed to be rare in my friends' households to limit soda and candy as my family did, or to only have dessert when dinner was finished. My family also emphasized exercise, and my brother and I were engaged in sports by age 4. Also, as much as I hated it as a kid, I have to give lots of credit to my dad for insisting that I play outside during the day and only watch a maximum of 2 hours of television daily. This certainly kept me active and physically fit growing up.

At the tail end of elementary school, it was time for that orthodontist-assisted rite of passage we call "braces." Pictures of me from before the procedure reveal that I was your typical crooked-teeth, pinched-nostrils, narrow-faced kid.

When middle school approached and I began to have more responsibility for my health, I would frequently spend some of my lunch money at the soda machines on school grounds. I was up to a three-soda-a-day habit, and I felt guilty because I knew soft drinks were "bad" for me in anything beyond moderation. Fast forward to freshman year of high school, when I began abstaining from sodas completely after making a deal with my mom that if I stopped, she would stop, too. From then on, it was mostly water, orange juice, and occasionally Gatorade as my beverages of choice. To this day, I haven't taken up drinking sodas again.

High school was a time of pumping iron, playing sports, building muscle, and trying my best to eat "right" according to what the bodybuilders at the gym were recommending: egg whites, protein powders, and lean meat -- essentially an emphasis on protein as the ultimate food and keeping fat as low as possible. Yes, I was attempting to adhere to a low-fat diet. That's probably why I ate so many fructose-fueled Power Bars. I was compensating for the lack of fat in my diet. Looking back, it's astonishing to see how "puffy" my face and overall musculature were. It was also during this time that I had my wisdom teeth removed, a "necessary" procedure (according to the orthodontist) if I was to prevent future dental disasters.

Following high school graduation, I thought, "Time to start being realistic." The expensive protein shakes and Power Bars were not economically viable options if I was to survive in the real world. Nor was a gym membership. I drastically changed my diet and lifestyle to appeal to my economic sensibilities. I stopped lifting weights and pounding protein shakes and began experimenting with hiking for exercise and eating cheap staple foods like beans and rice, pasta and tortillas. A month later, my muscles deflated.

Tuesday, July 28, 2009

The Darwin of Nutrition

Here is an excerpt from an article I wrote called "Weston A. Price: A Search For True Health," recently published in The Bulletin of Primitive Technology, Spring 2009.

Following the realization that food was the major contributing factor in human health and disease, Weston Price kept a keen eye out for what specific foods seemed to keep the primitives in good health. It was already obvious that industrialized foodstuffs weren't supportive of optimal health, so now it was Price's mission to determine what particular “nutritional programs” contributed to the well-being of the primitive groups. What food traditions had thousands of years of trial and error resulted in? Dr. Price noted every culture's dietary habits, including special foods utilized during times of child-rearing for the man and woman. It impressed him that the primitives seemed to be aware of preventative measures beginning with the health of the parents:

A very important phase of my investigations has been the obtaining of information from these various primitive racial groups indicating that they were conscious that [physical degeneration] would occur if the parents were not in excellent condition and nourishment. Indeed, in many groups I found that the girls were not allowed to be married until after they had had a period of special feeding. In some tribes a six month period of special nutrition was required before marriage. (Nutrition & Physical Degeneration, p. 3)

Dr. Price was convinced by the ubiquitous nature of this practice that many ailments of modern civilization were caused by prenatal undernourishment and that many of these problems could be prevented by the proper nutritional reinforcement of the parents to be.

Curious about the nutritional content of the primitive diets – particularly those that were emphasized for child-rearing – he took several samples of foods from each locale in order to test them at his laboratory in the United States. Armed with such information, Dr. Price believed that he could then determine what all the varied diets of each culture had in common and further understand the nutritional wisdom of the primitives. When he analyzed the traditional foods, he was excited to find that, on the whole, foods in the native diets were four times richer in water-soluble vitamins and minerals and ten times richer in fat-soluble vitamins than the industrialized American diet of his day. Of the native foods studied, Price realized that the foods which the primitives most emphasized and oftentimes considered sacred (especially for child-rearing) were rich in “fat-soluble activators”: vitamins A, D, and what he referred to as “activator X” – now understood to be vitamin K2. This included foods such as fish eggs, liver, certain insects, and other cholesterol-rich, fat-rich foods (see table below; foods high in fat-soluble vitamins are in bold).

The qualities of the foods, Price came to realize through testing native foods as well as conducting experiments in his laboratory, depended greatly on the quality of the soil and the feed given to the animals. For example, grain and hay-fed dairy products in the United States had far less vitamin and mineral content when compared with dairy products from the Swiss in Loetschental Valley, which was produced from cows grazing on “rapidly growing green grass” in the spring and summer and chlorophyll-rich hay in the fall and winter. Price determined that the color of the butterfat from such dairy products could accurately predict the nutrient-density: a deep yellow or orange color reliably indicated high vitamin content. The laboratory tests of traditional foods further bolstered his confidence in the “wisdom of the primitives.”

Thursday, July 16, 2009

The Reality of Primitive People's Lifespan

Human lifespan is one of the topics that frequently comes up in my discussions with others about primitive nutrition and health. In our day and age, this subject has become a trendy factor in gauging the overall health of any given population or individual. If a person lives a long time -- say 100 years -- they are considered long-lived and must have lived a healthful life to reach such an impressive age. Yet, there are always anomalies to this assumption. Comedian George Burns lived to be 100 while smoking 10 to 15 cigars a day. At 98, he joked, "If I'd taken my doctor's advice and quit smoking when he advised me to, I wouldn't have lived to go to his funeral."

Western cultures' obsession with lifespan has existed for a very long time. The Bible cites people living for hundreds and thousands of years in ancient times. More recently, researchers were fascinated by claims of the Hunzakats commonly reaching ages of 120 and beyond (this myth is dispelled quite well by this website). On the other end of this spectrum, many experts and laymen agree that primitive humans' lifespan was nothing to be impressed about: old age during those times was thought to be around forty years old.

Recently, I came across a study that blows all these distortions, assumptions, and obsessions out of the water. The study is a meta-analysis -- meaning it draws on the research of many other related studies -- and is titled "Longevity Among Hunter-Gatherers: A Cross-Cultural Examination." I suggest you give the full study a read, as there are many fascinating tidbits in it. The authors, Gurven and Kaplan, assembled lifespan and mortality data from around the world that included isolated hunter-gatherers (the closest living proxies we have for our paleolithic ancestors), acculturated hunter-gatherers, isolated neolithic cultures, Western modern civilizations, and even chimpanzees for comparison. The authors focused solely on reliable demographic data from a handful of cultures. The table below sums up the results of the study well:

This data may come as a surprise both to romanticists of the ancients' supposed longevity and to those who claim primitive human beings lived a life that was "nasty, brutish, and short." Here we have numbers that secure a middle ground between these two extremes. The authors of the study sum up their compiled information as follows:

The average modal age of adult death for hunter-gatherers is 72 with a range of 68-78 years. This range appears to be the closest functional equivalent of an "adaptive" human lifespan.

So there you have it. Convincing research suggesting that our hunter-gatherer ancestors are not at all far-removed from modern civilized human beings in terms of lifespan.

Monday, June 15, 2009

Health Profile: Geronimo

A well-known Chiricahua Apache and leader of his people, Geronimo is most recognized for his bouts with -- and escapes from -- Mexican and U.S. military troops in the mid-to-late 1800s. Among the Apache, Geronimo was thought to have great powers, including the ability to see into the future and leave no tracks when moving through the mountains and deserts of his tribe's territory. His band of Apache warriors was among the last Native American peoples to surrender to the U.S. government and live on reservations.

Tales of Geronimo's cunning retreats from his military pursuers abound. One story holds that Geronimo and his band disappeared without explanation when trapped in a cave that had no second entrance. On horseback, he and his warriors were able to keep ahead of the U.S. cavalry -- with its horses and loads of supplies -- at a pace of 70 miles a day while carrying very little and living on wild plants and animals, even resorting to killing their own horses for sustenance. During battles, Geronimo was shot and wounded several times yet never succumbed to death from a bullet wound.

In short, the man was -- like most traditional native peoples of his time -- quite a specimen.

Looking at his photos, Geronimo's beautiful facial structure -- round face, square jaw, prominent cheek bones, wide flaring nostrils -- is readily apparent. This indicates a full and proper development during his formative years as an infant, young boy, and teenager. (We can't comment on his teeth as he never smiled in photos, but he probably had all of them.) His broad shoulders and upright posture suggest agile movement and strength. Like a wild animal, Geronimo was optimally built for his rugged environment of high mountain sky islands and vast seas of low desert. Having lived near, and backpacked through, the Chiricahua Mountains in southeastern Arizona (Geronimo's former stomping grounds) for several months, I can attest to the ruggedness of this landscape.

Lifestyle played a major role in Geronimo's fitness. Traveling on foot or horseback for up to 70 miles a day, stalking wild game, and crafting tools and shelters from his surroundings, he spent his life using his body. This lifetime of "use" was certainly a major contributing factor to his physical capabilities. Yet, perhaps he wouldn't have been as capable -- his body not as supported, his build not as solid, his immunity and ability to recover from bullet wounds diminished -- if he hadn't also eaten the natural, primitive diet of his people. What kind of diet was that? Here's a list of some of the staple foods that the Apaches ate and the nutritional qualities that make them supportive:
  • Wild game: deer, elk, quail, rabbit, etc. --> utilizable proteins and fats, which provide amino acids, B vitamins, fat-soluble vitamins, and, when the whole animal is used (as was common in Geronimo's day), every nutrient the human body needs. (Interestingly, the Apaches had taboos against eating snakes, frogs, fish, and bears.)
  • Corn, beans, and squash--> starchy carbohydrates traditionally processed to eliminate anti-nutrients (fermented, roasted, soaked, leached, etc.) providing supplemental energy and sparing fat loss; additional vitamins and minerals (for an interesting account of how Apaches prepared a fermented corn drink called tizwin, see bottom of this post)
  • Agave--> heart of the plant pit-roasted, young stalks eaten; provides supplemental starch and sugars in the diet; spares fat loss ... but gives horrible gas (I can attest to this myself after eating a pit-roasted agave -- yeesh!)
  • Acorns & Pine nuts--> roasted, soaked, leached, pounded, or eaten fresh (some species); beneficial proteins and fats; particularly rich in monounsaturated fatty acids
  • Prickly Pear Cactus--> fruit cooked into syrup or eaten fresh and young pads boiled or roasted (high in oxalic acid raw); fruits rich in electrolytes for a hot, dry climate; pads rich in calcium and beta-carotene (a precursor to vitamin A)
Really, if we break it down, we find that the Apaches were quite omnivorous, much like other hunter-gatherer tribes across the world (Australian Aborigines and Bushmen of the Kalahari come to mind). Geronimo's very supportive, nutrient-dense Apache diet of meat and properly prepared plant foods allowed for the full facial and skeletal development -- as well as the mental sharpness and alertness -- common to traditional peoples eating a traditional diet (see Weston Price's studies for more on this).

So, it seems that the famous Apache leader lived healthfully with vigor and "fierceness" (as many accounts report) throughout his life. But what of his lifespan? Does it fit the description "nasty, brutish, and short"? Not in the least. Geronimo lived from 1829-1909, dying at age 79 from pneumonia after drunkenly falling off his horse and contracting a severe cold. Had his life not been cut short by this accident, perhaps he would have lived well into his 80s or 90s.

"I cannot think that we are useless or God would not have created us. There is one God looking down on us all. We are all the children of one God. The sun, the darkness, the winds are all listening to what we have to say."

Making "Tizwin" -- An Apache Fermented Corn Drink

"First, they soaked the corn overnight in water. They dug a long trench and lined it with grass, placed the soaked corn in the trench, and covered it with another layer of grass. Sometimes they covered the whole with earth or a blanket. After sprinkling the corn with water morning and evening for ten days, during which it sprouted, they took it out, ground it with their grinding stones (mano and metate), and then boiled it for five hours. Finally, they strained off the liquid and set it aside. After about twenty-four hours, when it stopped bubbling, it was ready to drink." (From Geronimo: The Man, His Time, His Place by Angie Debo, p. 22)

Thursday, June 11, 2009

Back From the Wild

I recently returned from my first 8-day shift (and hiatus from blogging) as a field guide with a local wilderness therapy organization here in Utah. For those unfamiliar, the wilderness therapy industry is made up of organizations -- private, non-profit, and corporate -- which treat clients with behavioral and substance-abuse issues by removing them from civilization and plopping them in the desert or woods or mountains for several weeks of backpacking and therapeutic work. Field guides (like me) in these programs backpack with a group of 2-10 clients (both teens and adults) for 8 days at a time in the wilderness, with 6 days off between shifts. If you're a guide like me, you do it all in homemade tire sandals (see picture). The particular program that I now work for specializes in addictions of all kinds, incorporating a 12-Step model (i.e., Alcoholics Anonymous) as the centerpiece of its approach.

In addition to this, primitive living skills are utilized and encouraged as important metaphors. A fire-by-friction bow-drill, for example, provides clients with an opportunity to interact with their surroundings in a practical and creative way to make fire for cooking, warmth, and comfort. When a client has met such a challenge, the accomplishment can be a significant confidence-builder, supporting the difficult recovery from addiction as the client says, "Hey! I just made fire with sticks! Maybe I do have the ability to stop using drugs." Other skills include general backpacking know-how (tarps, sleeping gear, cooking, etc.), caring for pack llamas (yes, each group has a few of these disgustingly lovable creatures), and wayfinding in the wilderness.

Of course, as someone who has a keen eye for nutrition and how it relates to health, I observed the foods being eaten by both the field guides and clients in the program. To my surprise, the foods weren't all that bad. Aside from the typical wheat products (and the potentially problematic gluten therein), I was pleased to see that each client was given a pound of cheese every four days, tuna, fresh meat once a week, and mostly starchy carbohydrates (the sole exceptions being dried fruit, sweetened granola, and "gookinaid" -- a powdered, sugary electrolyte drink). The group foods included a pound of butter. Questionable foods that one might lump under the "good-not-great" category included: peanut butter with hydrogenated palm oil, Spam, and "instant" refried beans.

While this wilderness therapy program isn't tailored to incorporate nutritional therapy, they do so without knowing it by providing the clients with a low-fructose diet. This in and of itself can go a long way towards restoring health, in my opinion. With such a diet, as well as the daily physical activity of backpacking and camping, I found the clients to be quite stable, even those coming off of hard drugs like heroin.

That's not to say that things couldn't be better. I'm a big believer in the power of nutritional therapy and would love to see some use of vitamins, minerals, and amino acids. Particularly for a population like addicts and alcoholics, who are in a physically depleted and/or unbalanced state, it would be great to refuel their bodies and alter their addictive brain chemistry with the help of supplements. As for food, it would be ideal if clients had access to pemmican (which I personally made and brought out for myself), more fresh meat, and perhaps some fresh raw milk, cheese, cream, and butter -- all preferably from grass-fed animals. Supplying digestible, low-toxin foods (such as white rice) and eliminating many of the canned meats and commonly allergenic foods (such as wheat) might help immensely as well. A wilderness therapy program that incorporates these things could be far more successful in terms of its graduates' continued sobriety. With such results, the program might also be more financially stable, attracting publicity and recommendations due to its higher success rates.

One such program -- the only one in existence that I know of -- is Open Sky Wilderness Therapy based in Durango, Colorado. These folks have a constant flow of clients. Why? A big reason is their use of all organic and grass-fed foods -- something that people look for nowadays with all the media attention and rising popularity of such products. (*cough* Michael Pollan *cough*) To me, this attention to quality nutrition is the wave of the future in wilderness therapy, and I am hearing more and more talk about it. However, from what I gather from others who have worked with Open Sky, my only criticism is their minimal use of animal products (little to no meat, and group stews of rice, quinoa, or beans with little added butter or coconut oil are common -- check out their food menu) and their belief in unprocessed "whole foods," which means the nutrient-robbing toxins bound up in whole grains, nuts, seeds, and beans aren't eliminated during cooking. Phytates for breakfast, anyone?

Saturday, May 23, 2009

So What's For Dinner?

Part 5 from my paper, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price."

"Meat, potatoes, and gravy. I don't like vegetables; I can't hardly eat any of them. The potatoes take care of all the vegetables."

-- Lena Stanley, Centenarian (Edelman 1999, p. 378)

The world of human health and nutrition is a bewildering labyrinth at times. Just about everybody has their own idea of what is and isn't healthy, and there are plenty of diet books, doctors' recommendations, health gurus, and dieticians out there to guide the way. Who is right? Who is wrong? What is the optimal human diet? It is questions such as these that can stir up confusion and debate. Yet, nutritional science is still in its infancy, having only been in the public light since the late 18th century. There is plenty of room for confusion and debate. As one nutritionist says, "It's all theory" (A. Minear, personal communication, January 17, 2007).

If we are to work only with what the last 100 years of research and science have told us about how the foods we eat affect our health, we are left with but a small period of time upon which to base our ideas -- we only see how food has affected human health over a millisecond of the time that people have been eating. During this brief period of history, we have conducted multitudes of studies that make very convincing arguments for or against certain aspects of nutrition. An interested, research-oriented individual can find in books, articles, and journals many studies supporting a low-fat, high-carbohydrate way of eating, for example. That person can also find many resources that support the complete opposite -- extolling the benefits of a high-fat, low-carbohydrate diet. Add in the varied interpretations by the scientists involved in these studies, as well as the opinions of independent researchers, the media, doctors, nutritionists, friends and family -- and that's when bewilderment arises.

This is where Weston A. Price comes in. His research and conclusions are drawn from a combination of modern and ancient dietary wisdom. The traditional population groups he studied had all been eating a certain way for thousands of years. With the aid of modern science, Price found that the foods these people ate provided needed nutrients in consistent quantities to allow for optimal growth and development -- and these foods worked for these cultures over thousands of years. In studying nutritional science, why work with only a minuscule piece of human health history (as in the last hundred years) when there exists a firm foundation in the dietary wisdom of primitive peoples -- a foundation built over thousands of years? In interpreting our own health, why not look to our ancestors and ask what kept them free from degenerative diseases? Dr. Barry Groves, a health researcher and author, puts it this way:

We should not be looking for answers to the diseases we suffer from today, but why many peoples in the world don't get them at all. That way we might stand a better chance of an answer to the dreadful plague of ill-health we are beset with.

It is extremely important for our modern world to acknowledge the findings of Weston A. Price. In considering Price's discoveries of healthy traditional cultures, we have the basis for a logical advancement in modern medicine: the creation of a benchmark that describes what true health looks and feels like. This is something that does not currently exist in the medical establishment. Though we have many tests and procedures to determine whether or not a patient is "normal" or "at-risk" for disease, we have no set standards for optimal human development. This was Price's idea in the first place: he wanted to find "control groups" of healthy populations who were not suffering from the physical and mental malfunction of his day -- he wanted to define what it meant to be truly alive and healthy:

Instead of the customary procedure of analyzing the expressions of degeneration, a search has been made for groups to be used as controls who are largely free from these affections (p. 1).

And this is what Weston Price found in primitive peoples across the world. He found in these people a new standard for human potential. But how can we define such a standard in a world where disease and deformities are the norm?

Like Price, we simply observe the people who are actually healthy. When we look at the photos that Dr. Price took during his travels, we witness a level of physical and mental well-being simply unknown to most modern human beings. When we see those broad faces, perfect teeth, and -- as Price stated again and again -- high moral character of primitive peoples, we are observing a higher degree of human health. It is readily apparent that primitive peoples have many qualities that modern people do not possess. Through the observation of these ancient cultures, whether through books, photographs, documentaries, or travel, it isn't hard to see that they are different -- and not just culturally. We can gain immense benefit from observing these differences and determining what they possess in health and well-being that we do not.

Minds and Hearts

Let us consider the way primitive people use their bodies and minds: how they respond to excitement or danger, the values they live by, the nature of their temperaments, and the way they breathe, eat, play, and live. Are they hyper-anxious? Do they steal, cheat, and murder? Do they have nagging physical problems, such as back, neck, and shoulder tension? In large part, the answers are: No, no, and no. We moderns can use the answers to such questions -- and the implications therein -- in the betterment of our own health. In addition to subjecting the "control groups" of healthy indigenous people to medical tests, let us also communicate with these people and sense with our hearts the degree of their well-being. Let us observe closely what separates them from us in body, mind, and spirit. And let us ask what we can learn from these differences.

Perhaps a good start would be to eat the way our ancestors did. In returning to the food traditions of antiquity in the United States, we have a chance to restore our health. Much has changed in American food habits over time. Most people would say that our nutrition has improved immensely in modern times; after all, we have progressed in technology, medicine, and hygiene -- isn't it obvious that we would have enhanced our nutrition as well? Yet with all of the knowledge that we have accumulated in the sciences, children are still being born with facial and dental deformities. These deformities are not questioned so much as they are accepted. In fact, they aren't even referred to as deformities as they were in Weston Price's day, and they are not at all believed to be connected with nutrition as Price's research revealed.

In the U.S. these days, it is just part of life to have your wisdom teeth removed, have a narrow face, get braces, or develop a chronic health condition. We assume we are advanced enough to know if something isn't right with human growth and development. Yet again, how can we know that something isn't right if we don't have any clue as to what is "right" in the first place?

Once again, traditional peoples like our American ancestors paint a picture of how human beings are meant to be. Our ancestors provide -- through their facial and skeletal development and lack of degenerative disease -- an example of close-to-optimal health. I say "close-to" because Americans at the turn of the century still did not match up to the vibrant glow of the aforementioned primitive cultures of Price's studies, all of which were completely free from disease and deformity. However, early Americans were far healthier in many ways than we are today. And, as was suggested earlier, all things considered, early Americans' lifespan closely matches the life expectancy of today.

Once we observe the characteristics -- physical, mental, and spiritual -- in traditional peoples across the world, it is readily apparent that modernized populations are sorely lacking. It is only sensible then to ask how traditional peoples attained such refined attributes. It was apparent to Weston Price that diet was a key factor, and this is my assertion as well. Centuries of nutrient-dense foods have allowed for the creation of superb human beings in traditional societies:

One immediately wonders if there is not something life-giving in the vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood ... (p. 27).

The "minds and hearts" of primitive people provide our modern civilizations with a lucid, inspiring picture of what true health is. We are given a wonderful opportunity to observe these examples and ask how we can attain this health through employing traditional food habits and following the wisdom that our ancestors have left for us.


Edelman, Bernard. (1999). Centenarians: The Story of the 20th Century by the Americans Who Lived It. New York: Farrar, Straus & Giroux.

Groves, Barry. (2005). Our Love Affair with Fat -- A Historical Perspective.

Price, Weston A. (2003). Nutrition and Physical Degeneration. La Mesa, CA: Price-Pottenger Nutrition Foundation.

Friday, May 15, 2009

What About Cholesterol?

Here is Part 4 in a series of blog posts adapted from my paper, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price."

It is commonly believed that saturated fat and cholesterol are primary culprits in the current heart disease epidemic in the United States. We have already taken a look at saturated fat, but what about cholesterol? Is there any substantiation behind the claim that a high-cholesterol diet leads to clogged arteries or dangerously high blood cholesterol levels? And what about early Americans and other traditional peoples who ate cholesterol-rich foods and did not suffer from heart disease?

Returning to Weston Price's studies of traditional cultures, one finds that the most prized foods were very rich in fat and cholesterol. Some of these foods include liver, butterfat, fish eggs, and a variety of rendered animal fats, such as lard, tallow, and chicken fat. These cholesterol-rich foods are also rich in fat-soluble vitamins, the catalysts responsible for proper protein and mineral assimilation, and may be the key to rearing healthy children with round faces. Price found no evidence of heart disease, cancer, and other degenerative diseases in the people who enjoyed such nutrient-dense foods. On the contrary, he found primitive people to be the most vibrant and healthy people he'd ever seen.

Like other traditional cultures, early Americans saw no reason to avoid cholesterol-rich foods. They savored hearty, nutrient-dense foods that were high in cholesterol. Butter, cream, egg yolks, lard, tallow, and untrimmed animal meats (including organs) were not disdained -- these foods were thoroughly enjoyed and used extensively in recipes of all kinds. Before the advent of nutritional sciences and USDA food pyramids, turn-of-the-century Americans were enjoying such foods while having no knowledge of cholesterol and its function in the human body. They were unknowingly supplying their bodies with a nutrient that is very supportive to good health.

Cholesterol is not a fat -- it is a waxy alcohol that is not used for energy by the body; it supplies no calories. Rather, it is absorbed directly by the intestinal wall without needing to be broken down like fats, proteins, and carbohydrates, and generally less than 50% of dietary cholesterol is absorbed. Cholesterol plays a key role in brain growth and development (especially in infants), cell membrane integrity, and the body's healing processes -- thus, it is usually present in scar tissue wherever repair is under way, as in arteriosclerosis (scarring of the arteries). In addition to all of these functions, cholesterol is the raw material for our hormones, including the sex hormones and the adrenal hormones (also known as "stress hormones"). The adrenal hormones are especially needed in the modern world, where stress is a constant part of our lives (Enig, 2000, pp. 48-50, 56-58).

The story of how such a vital nutrient went from being revered in the early century to being maligned and feared in modern times is well documented in The Cholesterol Myths, by Uffe Ravnskov. In this book, Ravnskov makes it clear that there is very little evidence behind the theory that dietary cholesterol contributes to heart disease and high blood cholesterol levels (the diet-heart theory). (Interestingly, according to the author, blood cholesterol levels may not even be pertinent in evaluating the risk of heart disease.) Ravnskov cites several population groups -- both modern and primitive -- that consume substantial amounts of cholesterol-laden foods yet have neither heart disease nor high blood cholesterol levels. He also notes that the studies upon which the diet-heart theory is based are either flawed or skewed to support the theory.

One of the pillars of the diet-heart theory is the Framingham Heart Study, which is often cited as proof that high cholesterol levels lead to a greater risk of heart disease. Ravnskov found this study to be flawed in a number of ways. The director of the Framingham Heart Study had this to say about the final results of the project:

In Framingham, Massachusetts, the more saturated fat one ate, the more calories one ate, the lower the peoples' serum cholesterol ... we found that the people who ate the most cholesterol, ate the most saturated fat, ate the most calories weighed the least and were the most physically active (as cited in Enig & Fallon, 2001, p. 5).

Such inconsistencies in the diet-heart theory have spawned an entire network of scientists, researchers, and health professionals who call themselves The International Network of Cholesterol Skeptics, or THINCS. This organization has among its members many reputable individuals who have taken it upon themselves to disseminate unbiased information and engage in discussions concerning the science and health effects of dietary cholesterol. Armed with this information, the interested individual can arrive at his or her own conclusions about whether or not one should worry about cholesterol in the diet, as well as determine whether or not high-cholesterol foods were detrimental to early Americans.


Enig, Mary G. (2000). Know Your Fats: The Complete Primer for Understanding the Nutrition of Fats, Oils and Cholesterol. Silver Spring, MD: Bethesda Press.

Ravnskov, Uffe (2000). The Cholesterol Myths: Exposing the Fallacy that Saturated Fat and Cholesterol Cause Heart Disease. Washington, D.C.: New Trends Publishing.

Next up, the thrilling conclusion: "So What's For Dinner?"

Saturday, May 9, 2009

Big, Fat Changes in American Foods

Part 3 in a series of posts adapted from my paper, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price."

Over the last hundred years, we have conquered tuberculosis and pneumonia, improved safety measures in work environments, developed methods to increase the food supply, and improved infant survival rates. Yet the quality of our lives is now diminished by conditions such as diabetes, heart disease, and cancer. These changes in disease patterns over the last century correlate strongly with changes in diet. Aside from an increase in processed food consumption since the early 1900s, our consumption of fats and oils has shifted quite dramatically in terms of quality (not so much quantity, contrary to popular belief). In other words, the types of fat (animal, fruit, and vegetable) and the specific fatty acids (saturated, monounsaturated, and polyunsaturated) that Americans ate one hundred years ago were very different from those eaten today. The table below provides an overview of the changes that have taken place.

Most of the fats that Americans have eaten for centuries -- and that many traditional cultures have eaten for thousands of years -- have been saturated and monounsaturated. Animal fats top the list in 1890 as the fats of choice for cooking, baking, and spreading. Yet heart disease (commonly believed to be caused by animal fat consumption) was far less prevalent during this era. In 1990, vegetable oils are leading the way, while animal fat consumption is so minimal that it does not make the list. For the first time in history, people are ingesting large amounts of polyunsaturated oils extracted from seeds and grains. These oils are often eaten unknowingly in prepackaged foods, as they are the oils of choice for the modern food industry due to their cost-effectiveness. Most potato chips, for example, use corn, canola, or soybean oil -- none of which were consumed in significant amounts by our ancestors.

As the types of fats and oils consumed in the United States have changed over the last 100 years, degenerative diseases have become more and more common. Saturated fat consumption has been blamed for many modern diseases: modern Americans are said to eat too much saturated fat and have been encouraged to cut back as much as possible to prevent disease. But is saturated fat to blame? If it were, one would expect saturated fat consumption to have increased substantially since the early century. This has not been the case, however:

Over the course of the 20th century, as heart disease has increased and cancer has become a common cause of death in the United States, saturated fat consumption has remained quite stable. Monounsaturated fat consumption has increased substantially. Yet this change pales in comparison to the growing popularity of polyunsaturated fat in the American diet. In the 1950s, with polyunsaturated vegetable oils gaining favor with the edible oil industry (whose primary motivation was to make a profit), Americans began eating more and more of these unnatural, man-made fats -- fats which were traditionally consumed only in whole foods, such as grains, vegetables, seeds, and nuts.

In her book American Food Habits in Historical Perspective (1995), Elaine N. McIntosh states: "Essentially, the consumption of animal fat has declined since 1940, and the consumption of vegetable oils has increased steadily since 1909, overtaking animal fats in 1950" (p. 210). The transition from a diet rich in saturated and monounsaturated fatty acids (mostly from animal fats) to a diet high in polyunsaturated fatty acids (vegetable oils, many of which are hydrogenated into trans fats) has been one of the most significant changes in human nutrition in the past 100 years.

Source for above tables:
Enig, Mary G. (2000). Know Your Fats: The Complete Primer for Understanding the Nutrition of Fats, Oils and Cholesterol. Silver Spring, MD: Bethesda Press.

Next, Part 4: "What About Cholesterol?"

Friday, May 1, 2009

Were Early Americans Really Living Shorter Lives?

This is Part 2 in a series adapted from my paper, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price."

Looking deeper into the life expectancy statistics used to gauge our country's health status, one quickly finds that interpreting them is not a black-and-white procedure. Many factors create discrepancies in the data. One prime example is the role that the infant mortality rate plays in determining life expectancy. When the ages at death of the overall population are averaged, every infant death contributes a "0" to the tally, significantly dragging down the final average. Below is a graphic representation of this phenomenon (blue = infant mortality; yellow = average lifespan).

In the above figure, we see that the infant mortality rate in 1900 is quite high at 14%. Correspondingly, the average life expectancy of newborns in 1900 is very low at 47.6 years. By 1992, with infant deaths (along with infectious disease, undernourishment, and death from injury) largely controlled by advances in medical technology, the infant mortality rate drops drastically to less than 1%. For that year, we find that life expectancy has risen by nearly 30 years compared to 1900.
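The arithmetic behind this effect is easy to sketch. The following snippet uses the post's 1900 infant mortality rate (14%) with an assumed, illustrative survivor lifespan (55 years is a round number chosen for demonstration, not a figure from the sources) to show how infant deaths counted as "0" drag the newborn average down toward the familiar 47-year figure:

```python
# Sketch: how infant deaths drag down "life expectancy at birth".
# The survivor lifespans (55 and 77.5 years) are illustrative assumptions,
# not figures from McIntosh or the National Vital Statistics Reports.

def life_expectancy_at_birth(infant_mortality, survivor_avg_lifespan):
    """Each infant death contributes ~0 years to the population average."""
    return infant_mortality * 0 + (1 - infant_mortality) * survivor_avg_lifespan

# 1900-like conditions: 14% infant mortality
print(life_expectancy_at_birth(0.14, 55.0))   # -> ~47.3 years at birth
# 1992-like conditions: under 1% infant mortality barely moves the average
print(life_expectancy_at_birth(0.008, 77.5))  # -> ~76.9 years at birth
```

The point of the sketch: even if adults in 1900 routinely lived into their fifties and beyond, a 14% infant mortality rate alone is enough to pull the published "life expectancy at birth" down into the forties.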

It is also important to note that the life expectancy data in the above figure represent only the number of years a newborn infant is expected to live. In other words, at age "0" a white person in 1900 is expected to live to 48 years; in sharp contrast, a white person born (age "0") in 1992 is expected to live to nearly 77 years old. However, if the 1900 person escapes infectious disease, injury, and undernourishment and manages to reach 40 years of age and beyond, the numbers shift significantly (blue = 40+ life expectancy in 1900; yellow = 40+ life expectancy in 1992):

Here we see that if a white American in 1900 reaches age 40, he or she can expect to live 28 more years (to age 68). A white American in 1992 at age 40 is expected to live 39 more years (to age 79), a difference of 11 years. Furthermore, if the 1900 person lives to age 80, he or she is expected to reach age 85; the 1992 person who lives to 80 can expect to see age 87, a difference of only 2 years. Thus, as the age of the individual increases, the gap between the 1900 and 1992 life expectancy data diminishes. Returning to the first figure, which is based on newborn (age 0) life expectancies, we find a much larger gap in the data: a difference of nearly 30 years.
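The narrowing gap described above can be tabulated directly from the post's own numbers (the age-0 value for 1992 is stated only as "nearly 77," so 76.5 is used here as an approximation):

```python
# Expected age at death for white Americans, by attained age,
# using the figures quoted in this post (1992 age-0 value approximated).
expected_age_at_death = {
    0:  {"1900": 47.6, "1992": 76.5},
    40: {"1900": 68.0, "1992": 79.0},
    80: {"1900": 85.0, "1992": 87.0},
}

for age, e in expected_age_at_death.items():
    gap = e["1992"] - e["1900"]
    print(f"Attained age {age:2d}: 1900 -> {e['1900']}, "
          f"1992 -> {e['1992']}, gap = {gap:.1f} years")
```

Running this shows the gap shrinking from roughly 29 years at birth, to 11 years at age 40, to 2 years at age 80, which is the crux of the argument: most of the century's gain in "life expectancy" reflects survival through childhood, not dramatically longer adult lives.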

Once again, we must remind ourselves of the many changing factors over the century that play an important role in interpreting this data: better hygiene, control of infectious disease, increased food supply, and improved infant outcome. Such influential factors must be taken into consideration when using lifespan data to analyze the health of the United States population throughout the century.

American Food Habits in Historical Perspective (McIntosh 1995, 219-220)

National Vital Statistics Reports (Department of Health and Human Services, National Center for Health Statistics 2006)

Next Post, Part 3: "Big, Fat Changes in American Foods."

Wednesday, April 29, 2009

American Health Then & Now

This blog post is adapted from a paper I wrote in 2007 called, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price." Enjoy!

One dietary characteristic that was readily apparent in the traditional primitive cultures of Weston A. Price’s studies was the lack of modern, processed foods. There was in these cultures a widespread use of foods high in nutrient content and comparatively low in calories. When processed foods replaced traditional foods, physical deformities and ill health followed. This pattern of less consumption of nutrient-dense foods and greater consumption of processed modern foods has happened in the United States just as it has in the primitive cultures of the world.

A look around the American population with Dr. Price's discoveries in mind quickly reveals the state of health in our country today. It is common for modern Americans to require braces and other dental corrections, as well as frequent visits to the doctor for general illness, such as flu, colds, and other symptoms of lowered immunity. As Dr. Price noted in Nutrition and Physical Degeneration, it takes only one generation of men and women regularly displacing their native foods with processed foods to produce children displaying physical irregularities and exhibiting lowered immunity to disease. These deteriorations are now commonly observed in modernized societies, like the United States, where a large proportion of the foods eaten are processed and devitalized.

Comparing American eating and disease patterns in 1900 to those in 2000, one can find several significant changes that have taken place over the last century. In 1900, Americans ate mostly whole foods, although substantial amounts of processed foods were eaten as well. The whole foods eaten by turn-of-the-century Americans included untrimmed meats from pasture-fed animals, fresh vegetables and fruits, grains, and fats like butter, lard, and coconut oil. Of course, this diet differs significantly from the foods Americans eat today: vegetable oils, refined and “enriched” grains, simple starches, sugar, and factory-reared grain-fed meats (in a few words: fast food). Table 1 provides an overview of the changes in American disease patterns before (1900) and after (2000) these processed foods fully crept into the food supply.

Table 1. Leading Causes of Death in 1900 & 2000*

1900                        % Total     2000                          % Total
Tuberculosis..................11.3      Diseases of Heart...............31.4
Pneumonia.....................10.2      Cancer..........................23.3
Diarrheal Diseases.............8.1      Stroke...........................6.9
Heart Disease..................8.0      Lung Disease.....................4.7
Liver Disease..................5.2      Accidents........................4.1
Injuries.......................5.1      Pneumonia & Influenza............3.7
Stroke.........................4.5      Diabetes Mellitus................2.7
Bronchitis.....................2.6      Kidney Diseases..................1.0
Diphtheria.....................2.3      Liver Disease & Cirrhosis........1.0
*Adapted from Food Politics (Nestle 2002, 32)

In Table 1, it can be seen that the shift in American health over the last century has been one of major transition. In 1900, many of the leading causes of death (6 of 10) were infectious diseases or injury. This was an era when hygiene was not fully understood and such untimely deaths were commonplace. Heart disease and cancer -- the diseases of modern man -- occur in moderate percentages during this time. Flash forward to 2000, following many years of processed food consumption, and a very different picture is painted. Almost all of the top causes of death (8 of 10) at the turn of the millennium are chronic degenerative conditions. Heart disease and cancer top the list, having increased roughly four- and six-fold, respectively, compared to 1900.

The drastic changes in American disease patterns require a broad perspective to fully comprehend. As stated before, in addition to dietary changes, many environmental changes (crowded living conditions and an inadequate food supply) no doubt played a prime role in the shift in health that took place over the last hundred or so years. This is one reason why the average lifespan in the United States in 1900 is commonly believed to be far less than it is today. The average lifespan data from this era are also skewed by untimely deaths from infectious disease, injury, and lack of food, as well as high infant mortality.

Stay tuned for the next post titled, "Were Early Americans Really Living Shorter Lives?"