A Juicy History

From scurvy-ridden sailors to civilian hens, the discovery of vitamins has been a series of medical mysteries that doctors are still unraveling.

Some medical breakthroughs arrive in a lightning-strike moment of clarity; others are pieced together bit by bit by people who are not fully aware of the puzzle they are solving. So it was with vitamins. The notion that human beings need to eat tiny amounts of certain nutrients to survive, and that failing to eat foods with these nutrients can sicken and kill us, was overlooked and dismissed for centuries until the delightfully named Polish biochemist Casimir Funk advanced the concept of vital amines, or vitamines, in 1912. When it was later discovered that some vitamines lacked the nitrogen that defines an amine, scientists dropped the “e” from the word (though Funk stubbornly clung to it). Those who proved that vitamins were real and necessary were as clever as the fictional icon Sherlock Holmes, who solved a mystery by noticing when a dog did not bark. “Figuring it out was one of the great feats of medical detective work ever,” says Frances Frankenburg, MD, professor of psychiatry at Boston University School of Medicine and author of the book Vitamin Discoveries and Disasters: History, Science, and Controversies.

The story of scurvy is particularly maddening, as its remedy—foods containing what we now call vitamin C—was discovered and rediscovered but rarely gained the notice of ship suppliers. Some two million seamen are believed to have died from scurvy in the 17th and 18th centuries, yet during that same period, reports of the scurvy-stricken rebounding after eating fresh fruits, vegetables, and herbs were piling up. In 1747, James Lind, a surgeon in the British Navy, began studying a variety of scurvy cures. He had learned the tale of a sailor who became desperately ill with scurvy; though the man was so sick he could not walk, his captain coldly abandoned him in Greenland. Hunger drove the sailor to eat the island grass, and lo and behold, it cured him. Meals of so-called “scurvy grass” made the man well enough that he was eventually able to flag down a ship and return home.

Lind thought of that story when scurvy began to weaken the legs and rot the gums of the men aboard the Salisbury in May 1747 and devised an ingenious test. He selected 12 sufferers, paired them off, secluded them in the same part of the ship, and gave each pair a different anti-scurvy treatment to take with their standard rations. After six days, the sailors who ate two oranges and one lemon per day had run through the ship’s entire supply of the fruits, but it did not matter—the lucky men were almost completely healthy again. While a second pair assigned to drink cider showed minor improvement, no one else in the group enjoyed anything like those happy results.

Lind described his test in his 1753 tome Treatise of the Scurvy, but it took the British Navy more than 40 years to act. In its defense, only four pages in the 450-page book detail the Salisbury experiment; the rest natters on about other theories on scurvy, some rooted in ancient medical ideas. “It’s so hard for us, looking backward,” says Dr. Frankenburg. “You want to yell, ‘You have the answer!’” It fell to Scottish doctor Gilbert Blane to find the gold lurking in Lind’s Treatise and counsel the British Navy to add citrus juice to sailors’ diets, which it did, starting in 1795.

Trial and Error

The Salisbury experiment was probably one of the earliest versions of a clinical trial, making it a milestone in the history of vitamins and the history of modern science. Other shared milestones would follow. In 1907, Norwegian researchers accidentally discovered that scurvy was caused by a nutritional deficiency (the missing substance was later named vitamin C) when they removed fresh produce from the diets of guinea pigs, one of the few animals whose bodies cannot produce vitamin C. Elmer McCollum, the American who discovered vitamins A and D in the early 20th century, popularized the use of white rats by medical researchers. Although his superiors at the University of Wisconsin-Madison denied him the funds to replace cows with rodents, they let him use his own money to purchase 12 rats, and from there he cultivated a colony. His groundbreaking work prompted Johns Hopkins University to recruit him as the first head of its department of chemical hygiene in 1917.

The history of vitamins has shaped the foundation of modern science in less obvious ways as well. It shows that smart people who are right about one thing can be catastrophically wrong about another; that accepting established knowledge uncritically can hinder progress; and that it is crucial to set aside preconceptions and assess evidence on its own merits. The struggles of the search for vitamins spotlight these hard-won truths.

One prominent scientist who was dead wrong about vitamins was Louis Pasteur. The immortal who gave us pasteurization and the rabies vaccine was convinced that the only organic nutrients were proteins, fats, and carbohydrates. That, plus Pasteur’s belief that only bacteria and other microorganisms caused disease, scared his peers away from pursuing vitamin theories. Lind buried his scurvy cure in his Treatise partly because the idea that illness can arise from not eating certain foods did not fit the ancient Greek physician Galen’s theory of the humors. Nor, more than a century later, did it fit German physician Robert Koch’s 1890 germ theory, which explained a lot but could not account for pellagra, rickets, beriberi, or pernicious anemia, no matter what its advocates believed. This was doubly sad because nutritional deficiencies produce symptoms that are easy to mistake for those of infectious diseases.

Dutch scientist Christiaan Eijkman, who studied beriberi in Indonesia, received a peerless education in the importance of following the evidence where it leads, even if it flouts the prevailing wisdom. At first, he and his team assumed that beriberi was caused by a microbe, but by 1889 he started to think otherwise. While he was conducting research on chickens at a military hospital, his birds started behaving oddly, as if they were sick with beriberi, and then mysteriously improved. He investigated and found that the chickens’ diet had been altered. They had been fed day-old, milled white rice from the officers’ meals for about five months and gradually sickened during that time, until a new cook arrived and refused, as Eijkman recalled, “to give military rice to civilian hens.” Instead, he offered cheaper, unmilled brown rice, and the birds recovered. Wondering if a white rice diet caused beriberi in humans, too, Eijkman’s colleagues ran tests on inmates in Javanese prisons. The results were clear: diets heavy in white rice led to beriberi, while diets heavy in brown rice reversed it.

Far from welcoming his findings, Eijkman’s colleagues mocked them, and him.

One Dutch physician, convinced that white rice carried a neurotoxin, claimed that the work of doctors stationed in Indonesia was worthless because they had eaten the spoiled rice and it had rotted their brains. Hobbled by malaria and demoralized by the criticism, Eijkman returned to Holland and stopped studying beriberi. But he lived long enough to savor his vindication, sharing the 1929 Nobel Prize for medicine for research that led to the discovery of thiamin, also known as vitamin B1, which is present in brown, but not milled white, rice.

D for Modern Deficiency

Americans typically do not succumb to scurvy or beriberi anymore. Postwar enrichment of bread, milk, and other staple foods, along with the spread of transportation networks and affordable refrigerators, effectively banished the diseases of malnutrition. But that does not mean that we are out of the woods. Michael Holick, MD, a professor of medicine, physiology, and biophysics at Boston University School of Medicine, says that almost a third of American adults and children are deficient in vitamin D, and perhaps as many as 60 percent have insufficient levels of D. It is possible that a lack of D could not only affect bone health but also raise the risk of breast cancer, colon cancer, multiple sclerosis, type 1 diabetes, and other ailments. To combat this, Dr. Holick recommends that children take 600–1,000 International Units (IUs) of vitamin D every day and that teenagers and adults take 1,500–2,000 IUs daily.

A vitamin D deficiency is definitely a bad thing: small children can suffer rickets, which renders them bowlegged, and a recent study suggests that seniors whose D stores are severely low are more likely to get Alzheimer’s disease. A lack of D may also play a role in depression, though a recent meta-analysis of seven studies concluded that there was not enough proof of a link. And although Seasonal Affective Disorder (SAD) is mainly caused by shifts in melatonin, boosting vitamin D intake could possibly alleviate it indirectly. Because D is naturally present in few foods, Dr. Holick advocates going out in the sun to prompt the body to generate its own stores of the vitamin.

The connection between sunlight and strong bones has a deep past. The Greek historian Herodotus unwittingly noted the sun’s power to stimulate production of vitamin D when he visited a century-old battlefield in 425 BCE: the Persian skulls, which belonged to men who had worn turbans, broke when hit with a pebble, while the skulls of the Egyptians, who had not shielded their heads from the sun’s rays, held firm. Dr. Holick’s pro-sun position did not sit well with his colleagues in BU’s department of dermatology, which dismissed him from its staff in 2004. He helped develop dminder (dminder.info), an app that calculates how much time users should spend in the sun each day based on their latitude, the time of year (anyone who lives north of Atlanta’s latitude will not generate D come November and will not regain that ability until April arrives), their skin color, and how much clothing covers their bodies.

Dr. Holick remains unruffled, though his fellow doctors are not entirely sold on his arguments. As recently as June, the U.S. Preventive Services Task Force decided against advocating routine screening of vitamin D levels for the general public, concluding that there was not enough evidence to favor it over case-by-case screenings. And with the American Academy of Dermatology estimating that one in five Americans will develop skin cancer during their lives and that one in 50 will suffer its deadliest form, melanoma, it is clear why dermatologists back taking vitamin D supplements over frolicking sans sunscreen. “While the risk of dying from basal cell or squamous cell carcinoma is low, the risk of melanoma is real. Why would you chance it?” says David Leffell, MD, the David P. Smith professor of dermatology at Yale School of Medicine in New Haven, Conn. “If one’s level [of vitamin D] is low, take oral supplements.”

Scientists will likely bring to light all of vitamin D’s nuances, even as they discover new mysteries to unravel.

Multi-Sided

While the debate over vitamin D rages on, the issue of multivitamins also remains a battlefield. Johns Hopkins University’s Edgar Miller, MD, co-authored a paper published in the Annals of Internal Medicine titled “Enough Is Enough: Stop Wasting Money on Vitamin and Mineral Supplements”—the language is as blunt as can be in the realm of scientific journals. “It’s a provocative title, intentionally,” he says of the article, which discusses the disappointing results of three trials involving multivitamins. To date, the science does not back the notion that the average healthy American needs to take a daily pill.

Lee McDowell, PhD, author of Vitamin History, gently arches an eyebrow at this. “The concept is good, that you do not need to take extra vitamins if you have a well-balanced diet. But often, we don’t have a well-balanced diet.” McDowell, an admitted vitamin fan, does take a daily multivitamin. Dr. Miller counters that “even the typical American lousy diet has enough vitamins, minerals, and antioxidants sufficient for one’s needs.”

Joy Dubost, a registered dietitian and spokesperson for the Academy of Nutrition and Dietetics, views multivitamins as a nutritional insurance policy. “As we age, we cannot assume a less-than-perfect diet provides adequate amounts of nutrients,” she says. But she advocates getting one’s vitamins from wholesome foods rather than from a cherry-flavored effigy of Fred Flintstone when possible.
