This is Part 2 in a series adapted from my paper, "Modern Health, Primitive Wisdom: American Health History and the Findings of Weston A. Price."
Looking deeper into the life expectancy statistics used to gauge our country's health status, one quickly finds that interpreting them is not a black-and-white matter. Many factors create discrepancies in the data. One prime example is the role that the infant mortality rate plays in life expectancy calculations. When ages at death are averaged across the entire population, every infant death contributes a "0" to the average, dragging the final figure down significantly. Below is a graphic representation of this phenomenon. (blue = infant mortality; yellow = average lifespan)
In the above figure, we see that the infant mortality rate in 1900 is quite high at 14%. Correspondingly, the average life expectancy of a newborn in 1900 is very low at 47.6 years. By 1992, with infant deaths (along with infectious disease, undernourishment, and death from injury) largely controlled by advances in medical technology, the infant mortality rate drops drastically to less than 1%. For that year, we find that life expectancy has risen by nearly 30 years compared to 1900.
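To make the arithmetic concrete, here is a minimal sketch in Python. The adult ages in the first part are invented purely to show the averaging effect, and the two-point model in the second part is a deliberate simplification (it assumes everyone either dies in infancy or lives to the average adult lifespan, ignoring all other mortality structure):

```python
# Each infant death adds a "0" to the ages-at-death being averaged,
# so even a modest share of infant deaths pulls the mean down sharply.
# The adult ages below are hypothetical, chosen only to show the effect.
adult_ages_at_death = [72, 65, 80, 58, 70, 75, 63]
infant_deaths = [0, 0]  # two hypothetical infant deaths

all_deaths = adult_ages_at_death + infant_deaths
print(sum(adult_ages_at_death) / len(adult_ages_at_death))  # 69.0 years
print(sum(all_deaths) / len(all_deaths))                    # ~53.7 years

# Running the logic in reverse with the figures quoted above:
# if a fraction p die in infancy (age 0) and survivors average x years,
# then e0 = (1 - p) * x, so x = e0 / (1 - p).
def survivor_expectancy(e0, p_infant):
    """Average lifespan of those who survive infancy (two-point model)."""
    return e0 / (1.0 - p_infant)

print(survivor_expectancy(47.6, 0.14))   # 1900: ~55.3 years
print(survivor_expectancy(76.5, 0.009))  # 1992: ~77.2 years (inputs approximate)
```

Even under this crude model, the 1900 figure of 47.6 years understates by nearly eight years how long Americans who survived infancy could expect to live, while the 1992 figure barely moves.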
It is also important to note that the life expectancy data in the above figure represent only the number of years a newborn infant is expected to live. In other words, at age "0" a white person in 1900 is expected to live about 48 years; in sharp contrast, a white person born (age "0") in 1992 is expected to live to nearly 77 years old. However, if the 1900 person escapes infectious disease, injury, and undernourishment and manages to reach age 40 and beyond, the numbers shift significantly (blue = life expectancy at age 40+ in 1900; yellow = life expectancy at age 40+ in 1992):
Here we see that if a white American in 1900 reaches age 40, he/she can expect to live 28 more years (to age 68), while a white American who reaches 40 in 1992 can expect 39 more years (to age 79), a difference of 11 years. Furthermore, if the 1900 person lives to age 80, he/she is expected to reach age 85; the 1992 counterpart who reaches 80 can expect to see age 87, a difference of only 2 years. Thus the above figure shows that as the age of the individual increases, the gap between the 1900 and 1992 life expectancy data diminishes. Returning to the first figure, which is based on newborn (age 0) life expectancies, we find a much larger gap in the data: a difference of nearly 30 years.
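A quick tabulation of the figures above makes the narrowing gap explicit (the 1992 at-birth value is taken as roughly 76.5 years, consistent with the "nearly 77" noted earlier):

```python
# Remaining-years figures quoted above, keyed by attained age:
# (years remaining in 1900, years remaining in 1992).
remaining_years = {
    0:  (47.6, 76.5),  # at birth; 1992 value approximate ("nearly 77")
    40: (28.0, 39.0),  # years beyond age 40
    80: (5.0, 7.0),    # years beyond age 80
}
for age, (y1900, y1992) in remaining_years.items():
    gap = y1992 - y1900
    print(f"age {age:>2}: 1900 -> {age + y1900:.1f}, "
          f"1992 -> {age + y1992:.1f}, gap {gap:.1f} yrs")
# age  0: 1900 -> 47.6, 1992 -> 76.5, gap 28.9 yrs
# age 40: 1900 -> 68.0, 1992 -> 79.0, gap 11.0 yrs
# age 80: 1900 -> 85.0, 1992 -> 87.0, gap  2.0 yrs
```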
Once again, we must remind ourselves of the many factors that changed over the century and that play an important role in interpreting this data: better hygiene, control of infectious disease, an increased food supply, and improved infant outcomes. Such influential factors must be taken into account when using lifespan data to analyze the health of the United States population across the century.
Sources:
American Food Habits in Historical Perspective (McIntosh 1995, 219-220)
National Vital Statistics Reports (Department of Health and Human Services, National Center for Health Statistics 2006)
Next Post, Part 3: "Big, Fat Changes in American Foods."