Amid a May 2022 baby formula shortage, dangerous discussions about homemade formula began circulating — along with myriad social media posts asserting that Baby Boomers’ parents didn’t have access to formula, and they “turned out just fine.”
‘Our Grandparents Didn’t Have Formula in the 1950s and 1960s’ Posts
As is often the case, others objected to the common refrain that “our grandparents didn’t have formula” and their kids were “just fine”:
A May 12 2022 Facebook post encapsulated the second thread of discussion, and described additional childrearing practices that had evolved since the 1960s:
Yes I’ve seen the ingredients to make baby formula. Yes I’m happy you survived on Karo syrup and canned milk. Yes I would make it if I had no other option. I’d put a nipple on a Red Bull can until she was hyped up and moonwalking across the living room floor if it was all I had. But understand this. We are smarter now than we were in 1950. Just because something was common practice then doesn’t mean it is a good idea. Lead was in paint and gasoline. The houses were filled with asbestos. Aunt Betty chain smoked Salems in the station wagon with the windows rolled up while the kids were unbuckled in the back seat. Don’t confuse nostalgia for good practices. For all that survived and brag about it, there were those that did not. But they can’t Facebook because they are dead. It’s called survivorship bias. Some women can’t breastfeed and some babies require special formulas that are hard to find. Just help the families that are struggling and check your opinions at the door.
As we explained in an earlier fact check about a 1960 homemade formula recipe, claims about use of ad hoc formulas weren’t incorrect — they just lacked some extremely basic and important context: infant mortality.
The Formula Shortage and Survivorship Bias
As those arguing against using homemade formula often stated, posts about babies surviving to old age represent a cognitive fallacy known as “survivorship bias,” or the logical error of concentrating on the survivors of an event while overlooking those who did not survive, typically because the selection process was not immediately visible.
One of the most prominent examples of thinking around survivorship bias comes from World War II:
At the time, the American military asked mathematician Abraham Wald to study how best to protect airplanes from being shot down. The military knew armour would help, but couldn’t protect the whole plane or would be too heavy to fly well. Initially, their plan had been to examine the planes returning from combat, see where they were hit the worst – the wings, around the tail gunner and down the centre of the body – and then reinforce those areas.
But Wald realised they had fallen prey to survivorship bias, because their analysis was missing a valuable part of the picture: the planes that were hit but that hadn’t made it back. As a result, the military were planning to armour precisely the wrong parts of the planes. The bullet holes they were looking at actually indicated the areas a plane could be hit and keep flying – exactly the areas that didn’t need reinforcing.
In short, “survivors” of a practice or circumstance don’t merely fail to provide a clear, instructive picture; they actively distort any assessment limited to them. Survivorship bias doesn’t just undercount the victims of poor past practices, it also conceals problems that may well need addressing.
Or, in other words, it would be best to look at the babies who did not survive an absence of commercial infant formula, and rates of mortality in those years.
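The distortion at the heart of survivorship bias can be illustrated with a minimal toy simulation of the bomber example (the numbers here are hypothetical, chosen for illustration, and are not drawn from Wald’s actual data): planes hit in a critical spot rarely make it home, so a survey of returning planes dramatically understates how often that spot is hit.

```python
import random

random.seed(42)

# Toy model of the bomber example (illustrative probabilities only):
# every plane takes one hit, either to the engine or the fuselage.
# Engine hits usually down the plane; fuselage hits usually do not.
N = 10_000
downed_prob = {"engine": 0.8, "fuselage": 0.1}

returned = []  # hit locations observable on planes that made it back
for _ in range(N):
    hit = random.choice(["engine", "fuselage"])
    if random.random() > downed_prob[hit]:  # the plane survived
        returned.append(hit)

engine_share = returned.count("engine") / len(returned)
print(f"Engine hits among returning planes: {engine_share:.0%}")
```

Engines are hit half the time in this model, yet among survivors engine damage shows up on only about a fifth of planes; armoring based solely on the returners would reinforce exactly the wrong areas.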
U.S. Infant Mortality in 1950, U.S. Infant Mortality in 2020
Broadly, infant mortality rates are influenced by innumerable factors — absence or presence of prenatal care being one, and global location being another of many.
Given that the 2022 baby formula shortage largely affected the United States, trends in American infant mortality were the most relevant. Statistics website Statista.com featured a chart tracking infant mortality from 1935 through 2020.
Infant mortality was measured per thousand babies on that chart, and in 1935, 61 babies per thousand did not survive:
That number dropped to 35.3 in 1950, 27 in 1960, 22 in 1970, and 17 in 1980, then visibly began leveling off between 1980 and 1985. As of 2020, the infant mortality rate in the United States was six deaths per 1,000 babies born.
To put it another way, infants were roughly ten times (more than 900 percent) likelier to die in 1935 than in 2020, nearly 500 percent likelier in 1950, and 350 percent likelier in 1960 than in 2020. For further context, that viral “homemade baby formula” recipe was from 1960.
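The relative-risk arithmetic behind those comparisons is simple; a quick sketch using the per-1,000 rates quoted above:

```python
# U.S. infant deaths per 1,000 live births, per the chart cited above
rates = {1935: 61.0, 1950: 35.3, 1960: 27.0, 2020: 6.0}

def pct_likelier(year: int, baseline: int = 2020) -> float:
    """Percent by which infant death was likelier in `year` than in `baseline`."""
    return (rates[year] / rates[baseline] - 1) * 100

for year in (1935, 1950, 1960):
    print(f"{year}: {pct_likelier(year):.0f}% likelier than 2020")
```

The 1950 rate of 35.3 per 1,000 works out to roughly 488 percent likelier than 2020, i.e. “nearly 500 percent,” while 1960’s 27 per 1,000 is exactly 350 percent likelier.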
When Was Commercial Formula Invented? When Did It Become Common?
The period between the invention of a novel product and its widespread adoption varies. The first commercial baby formula was developed in the 1860s:
With mother’s milk as the ideal, many scientists tried to formulate nonhuman milk to resemble human milk (Radbill, 1981). In 1865, chemist Justus von Liebig developed, patented, and marketed an infant food, first in a liquid form and then in a powdered form for better preservation. Liebig’s formula — consisting of cow’s milk, wheat and malt flour, and potassium bicarbonate — was considered the perfect infant food (Radbill, 1981).
Many other commercial products and formulas were rapidly introduced after the marketing of Liebig’s infant food and the invention of evaporated milk (Radbill, 1981). By 1883, there were 27 patented brands of infant food (Fomon, 2001). These commercial products came in powdered form and consisted of carbohydrates such as sugars, starches, and dextrins that were to be added to milk. Name brands for the products included “Nestlé’s Food®, Horlick’s Malted Milk®, Hill’s Malted Biscuit Powder®, Mellin’s Food®, Eskay’s Food®, Imperial Granum®, and Robinson’s Patent Barley®” (Radbill, 1981, p. 619). The foods were fattening but lacked valuable nutrients like protein, vitamins, and minerals. Over time, the nutrients were individually added (Radbill, 1981).
A February 2003 article in the peer-reviewed journal Contemporary Pediatrics, “A concise history of infant formula (twists and turns included),” began with a reference to homemade formula. Notably, that preface described a slightly more rigorous procedure than the one in the Facebook post circulating in May 2022:
If you are a “mature” pediatrician — one older than 40 years or so [as of early 2003] — there is a good chance that, if you were not breastfed as an infant, you were fed a formula created by mixing 13 oz of evaporated milk with 19 oz of water and two tablespoons of either corn syrup or table sugar. Every day, parents prepared a day’s worth of this formula, transferred it to bottles that they had sterilized in a pan of boiling water, and stored it in a refrigerator until used. In addition to formula, infants received supplemental vitamins and iron.
After noting that baby bottles were developed during the Industrial Revolution, a section described an early but brisk market for infant formula in the mid-to-late 1800s:
… In the early 19th century, it was observed that infants fed unaltered cow’s milk had a high mortality rate and were prone to “indigestion” and dehydration compared with those who were breastfed. In 1838, a German scientist, Johann Franz Simon, published the first chemical analysis of human and cow’s milk, which served as the basis for formula nutrition science for decades to follow. He discovered that cow’s milk had a higher protein content and a lower carbohydrate content than human milk. In addition, he (and later investigators) believed that the larger curds of cow’s milk (compared to the small curds of human milk), were responsible for the “indigestibility of cow’s milk.”
Empirically, physicians began to recommend that water, sugar, and cream be added to cow’s milk to render it more digestible and closer to human milk. By 1860, a German chemist, Justus von Leibig, developed the first commercial baby food, a powdered formula made from wheat flour, cow’s milk, malt flour, and potassium bicarbonate. The formula, which was added to heated cow’s milk, soon became popular in Europe. Leibig’s Soluble Infant Food was the first commercial baby food in the US, selling in groceries for $1 a bottle in 1869.
In the 1870s, Nestle’s Infant Food, made with malt, cow’s milk, sugar, and wheat flour, became available in the US, selling for $.50 a bottle. In contrast to Leibig’s Food, Nestle’s formula was diluted with water only, requiring no cow’s milk to prepare, and thus was the first complete artificial formula available in this country.
Several cow’s milk modifier formulas were introduced over the next 20 years, and by 1897 the Sears catalogue was selling no fewer than eight brands of commercial infant foods, including Horlick’s Malted food ($.75 per bottle), Mellin’s Infant Food ($.75 per bottle), and Ridge’s Food for Infants ($.65 per bottle). Despite their widespread availability, these proprietary formulas realized only modest sales in the late 19th century because they were expensive in comparison to cow’s milk. Most mothers continued to breastfeed their infants.
As noted in that excerpt, formula’s initially slow rate of adoption was coupled with the condition that most mothers continued breastfeeding. The author went on to explain physicians’ rigorous efforts to develop a safe breastmilk substitute, indicating that the purportedly carefree attitude toward formula didn’t exist even at the turn of the twentieth century:
Thomas Morgan Rotch of Harvard Medical School developed what came to be known as the “percentage method” of infant formula feeding, which was popular among medical professionals from 1890 to 1915. He taught that because cow’s milk contains more casein than human milk, it must be diluted to lower the percentage of casein … Cow’s milk formulas prescribed by the percentage method were compounded by a milk laboratory or, more often, by a home method that was time and labor intensive. Physicians were taught to monitor growth carefully and to examine the infant’s stool and modify the formula based on its appearance.
The piece explained that the labor-intensive “percentage method” began to fall out of favor around 1920, and physicians “eventually began to recommend either commercial formulas or simple homemade formulas made with evaporated milk”; that timeframe coincided with the appearance of additives on the market (from 1912 onward). It noted that corn syrup came into favor during the hardscrabble years of the Great Depression, adding:
By the 1940s and through the 1960s, most infants who were not breastfed received evaporated milk formula, as well as vitamins and iron supplements. It is estimated that, in 1960, 80% of bottle-fed infants in the US were being fed with an evaporated milk formula.
Moreover, important context included the fact that the increasing availability of commercial formula influenced the rate of breastfeeding:
During the 1960s, commercial formulas grew in popularity, and by the mid-1970s they had all but replaced evaporated milk formulas as the “standard” for infant nutrition. During this time, the percentage of women who breastfed their newborn reached an all-time low (25%), in part because of the ease of use and low cost of commercial formula and a belief that formulas were “medically approved” to provide optimal nutrition for young infants.
A major factor in the acceptance of commercial formulas was their use in hospitals to feed newborn infants during the 1960s and 1970s. To encourage acceptance, formula companies began to provide inexpensive or free formula to hospitals in ready-to-feed bottles, enabling the phasing out of hospital formula preparation rooms. Mothers who witnessed how well their newborns accepted these easily prepared formulas were often convinced to continue this practice at home. Moreover, although pediatricians did not dissuade mothers from nursing, it was not strongly encouraged, as it is today.
Breastfeeding declined in popularity because safe commercial formulas were introduced and accessible; before that was the case, breastfeeding was less of a choice than it was in 2022. And in the 1960s and 1970s, hospitals themselves began promoting formula to new parents.
The proverbial grandparents so often referenced in 2022 formula shortage anecdotes were likelier to breastfeed when possible, and otherwise used the widely available substitutes of the day.
A sneakily inaccurate element of baby formula shortage discussions on social media was the claim that babies in the 1950s and 1960s were fed “homemade formula” and thrived. The claim was fallacious on several levels: infants were nearly 500 percent likelier to die in 1950, before the widespread availability of commercial formula, than in 2020, and breastfeeding was considered the primary source of infant nutrition. In addition to survivorship bias, the claims often falsely suggested that parents were less diligent with their sparser options; a well-researched article explained that the “homemade formula” was mixed and stored according to strict instructions. Finally, commercial formulas were available in the mid-to-late 1800s (even in the Sears catalog), and physicians at the turn of the twentieth century were rigorous about breastmilk alternatives. The only partly accurate aspect of the claims was the widespread adoption of infant formula in the 1950s and 1960s; prior to that, parents were likely to turn to formula out of necessity, not choice.