Sharp rise in rickets blamed on children playing computer games indoors and parents using too much sunscreen
By Owen Bowcott January 22, 2010
Computer-obsessed children who spend too long indoors and over-anxious parents who slap on excessive sunscreen are contributing to a sharp rise in cases of the bone disease rickets, doctors are warning.
Vitamin D deficiency, which causes the condition, could be rectified by adding supplements to milk and other food, a research team at Newcastle University suggests.
There are several hundred cases of the preventable condition among children in the UK every year, according to a clinical review paper in the British Medical Journal by Professor Simon Pearce and Dr Tim Cheetham.
“More than 50% of the adult population [in the UK] have insufficient levels of vitamin D and 16% have severe deficiency during winter and spring,” they say. “The highest rates are in Scotland, Northern Ireland and northern England. People with pigmented skin are at high risk as are the elderly, obese individuals and those with malabsorption.”
Most vitamin D is synthesised in the body by absorption of sunlight. Some comes from foods such as fish oil. People with darker skins need more sunlight to top up their vitamin D levels.
One of the main reasons for the reappearance of rickets – once considered a disease of the industrial poor in 19th-century cities – is the changing ethnic makeup of the population, Pearce explained.
The most commonly affected are people of Asian or African descent who live in northern cities. He has examined cases among young Somali speakers who live in east Newcastle. But changing lifestyles are also contributing to lowering vitamin D levels in the general population.
“Some people are taking the safe sun message too far,” Pearce said. “It’s good to have 20 to 30 minutes of exposure to the sun two to three times a week, after which you can put on a hat or sunscreen.
“Vitamin D levels in parts of the population are precarious. The average worker nowadays is in a call centre, not out in the field. People tend to stay at home rather than going outside to kick a ball around. They stay at home on computer games.”
Pearce has written to the Department of Health proposing that vitamin D be added to milk. It is already added as a supplement to artificial baby milk. He has also asked the Royal College of Paediatrics to record cases of rickets but said figures were not being collected.
“A more robust approach to statutory food supplementation with vitamin D (for example in milk) is needed in the UK,” the paper concludes.
Meanwhile, figures obtained by the Tories show the number of patients leaving hospital with malnutrition has hit record levels in the last year. Those affected are primarily elderly people. The NHS figures show that last year 175,000 people were malnourished on entry to hospital but nearly 185,500 were in a similar condition on discharge, meaning more than 10,000 patients were more malnourished after medical treatment.
Low levels of vitamin D in lymphoma patients are associated with cancer progression and even death, according to findings from a Mayo Clinic and University of Iowa study reported by ScienceDaily.
“These are some of the strongest findings yet between vitamin D and cancer outcome,” said lead investigator, Matthew Drake, MD, PhD, an endocrinologist at Mayo Clinic in Rochester.
Researchers studied 374 newly diagnosed cancer patients suffering from diffuse large B-cell lymphoma (a fast-growing, aggressive form of non-Hodgkin’s lymphoma) and found half had vitamin D deficiency. Patients deficient in vitamin D had a greater risk of cancer progression and were more likely to die than patients with optimal vitamin D levels.
“The exact roles that vitamin D might play in the initiation or progression of cancer are unknown, but we do know that the vitamin plays a role in regulation of cell growth and death, among other processes important in limiting cancer,” Drake said.
These findings support the growing connection between vitamin D and cancer risks and outcomes as well as reinforce other field research about the vitamin’s overall health benefits, Drake added.
Vitamin D is produced in the skin in response to sunlight and then converted into its active form by the liver and kidneys. It is also found in food (naturally or fortified, as in milk) and is available in supplement form.
If this vitamin isn’t in your medicine cabinet, it probably should be.
By Linda B. White, M.D.
What do the following conditions have in common: osteoporosis, multiple sclerosis, high blood pressure, diabetes and cancer? Give up? Experts suspect that insufficient levels of vitamin D raise your risk of getting these diseases. Unfortunately, most of us probably are vitamin D deficient.
About all I was taught in medical school is that vitamin D keeps bones strong. Recently however, this area of study has exploded as scientists uncover the vitamin’s far-reaching effects. Because it increases calcium levels, vitamin D indirectly fortifies bones and teeth. It also regulates cells all over the body, which explains vitamin D’s disparate roles, such as influencing insulin production and immune function, as well as helping prevent inflammation and cancer.
The scary thing is that vitamin D deficiency appears to be quite common. A recent British study found that 87 percent of volunteers had low blood levels of the vitamin in winter and spring, and 61 percent had low levels in summer and fall. Why the seasonal variation? Our chief source of vitamin D is sunshine.
Why We’re D-ficient
In response to ultraviolet B (UVB) rays in sunlight, our skin transforms a derivative of cholesterol normally found in the skin into vitamin D3 (cholecalciferol). The liver, kidneys and other tissues further activate this molecule. Given that the skin is a veritable vitamin D factory, why is deficiency so rampant? History — ancient and recent — holds the answers.
Humans evolved near the equator and spent days outdoors, allowing the skin to generate ample amounts of this vitamin. About 50,000 years ago, some of our ancestors migrated toward the poles, where winter sunlight isn’t intense enough for vitamin D production. However, their diet of vitamin D-rich fish compensated for the deficit.
But rickets became prevalent in the 18th century during the Industrial Revolution, when people shifted to indoor labor and the skies darkened with pollution. This manifestation of severe vitamin D deficiency causes skeletal deformities, such as bowed legs or knock knees and bony knobs along the ribs, known as rachitic rosary. During the 1930s, the decision to add vitamin D to milk nearly eradicated rickets in the United States. But nowadays, kids and adults drink less milk and more juice and sodas, and sadly, rickets is making a comeback in American children, according to a study released last year.
Starting about 30 years ago, another cultural shift deepened our vitamin D deficit: public health campaigns to avoid the midday sun, cover up and apply sunscreen. They were justified attempts to save our skins from sun-induced aging and cancer, but now we’re not making enough vitamin D. These days, vitamin D deficiency has become commonplace, even in the tropics. For instance, a sampling of adults in sunny Honolulu showed that half were low in D.
Of course, we can take supplements, but current government recommendations are cautious — 200 IU a day for young adults, 400 for people 51 to 70, and 600 for those over 70. Vitamin D expert Bruce W. Hollis, M.D., of the Medical University of South Carolina, says such doses might be enough to prevent rickets, but aren’t sufficient to fulfill other important functions.
Most of us don’t even meet these inadequate guidelines. A German study found that 80 percent of sampled adults didn’t get recommended amounts, and nearly 60 percent had low blood levels of vitamin D, a statistic that rose to 75 percent in women over 65 years old. Furthermore, those women with low blood levels of vitamin D were more likely to have high blood pressure, cardiovascular disease and type 2 diabetes.
Results of D-ficiency
So what are the dangers of too little vitamin D in your system? A whole host of chronic conditions.
Weak bones and muscles. Rickets was the first disease tied to vitamin D depletion. This severe deficiency during childhood can prevent kids from reaching their potential for full height and peak bone mass. (Bone mass peaks in early adulthood; after that it slowly declines.)
In adults, vitamin D deficiency can lead to osteoporosis (thin, brittle bones) and osteomalacia (rubbery, demineralized bones). The latter causes bone pain, and both elevate the risk of broken bones.
Additionally, vitamin D deficiency causes muscle weakness and discomfort. One study found that patients with aches and weakness were often severely vitamin D deficient. Hollis says he’s hearing from doctors that vitamin D supplementation often resolves these aches and pains, adding, “A lot of ‘fibromyalgia’ is probably D deficiency.”
Weakened muscles increase the risk of falls and fractures — a dangerous combination for the elderly. The research shows that, although the recommended dose of 600 IU a day doesn’t prevent falls and fractures in older adults, doses over 800 IU do. In fact, consuming 700 to 800 IU of vitamin D a day (plus or minus calcium) could prevent a quarter of hip fractures in older people, according to a study published in the Journal of the American Medical Association.
Cancer. Vitamin D deficiency has been linked to several types of cancer, including breast, prostate, colon and melanoma. In fact, for more than 60 years, research has found that people living at higher latitudes, with less exposure to sunlight, show an increased risk of cancer mortality. Adequate vitamin D levels seem to protect against some cancers. In a recent study, researchers followed healthy postmenopausal women whom they assigned to take either 1,400 to 1,500 milligrams a day of supplemental calcium plus 1,100 IU a day of vitamin D3, or a placebo, for four years. After the first year, vitamin D supplementation led to a 57 percent reduction in cancer risk.
Cardiovascular disease. In addition to cancer and bone disease, vitamin D may also be healthy for your heart. Vitamin D levels are inversely associated with the risk of high blood pressure and congestive heart failure. Exposing people with high blood pressure to ultraviolet light has been shown to improve the condition.
Asthma. Preliminary studies show that vitamin D also may help alleviate respiratory problems, such as asthma. According to one study published in the American Journal of Clinical Nutrition, children of mothers with lower intakes of vitamin D during pregnancy are more likely to develop asthma.
Autoimmune disorders. Vitamin D reduces inflammation and plays a role in the maturation of the immune system. Deficiency is common in autoimmune diseases where the immune system attacks normal cells, such as type 1 diabetes, rheumatoid arthritis and multiple sclerosis (MS). Emerging research shows that vitamin D may have a preventive effect. One study examined two large groups of women for 10 years and found a reduced risk of MS was associated with vitamin D supplementation. A study of Finnish children taking 2,000 IU a day (10 times the current recommendation) showed they had a decreased risk of developing type 1 diabetes. In an analysis of the Iowa Women’s Health Study, women consuming higher levels of vitamin D showed a reduced risk for rheumatoid arthritis.
Mental health. Psychiatrist John Cannell, M.D., founder of the nonprofit Vitamin D Council, says that vitamin D may contribute to several emotional disorders. In a study of elderly people, mood and cognitive skills deteriorated with lower levels of D. Cannell points out that seasonal affective disorder (SAD) is a type of depression whose onset follows the waning daylight of autumn and winter. An Australian study found that vitamin D supplements lifted the mood of people with SAD.
How to Get Enough D
Expose yourself. Your skin can tackle much of your vitamin D needs. If you’re young, fair, scantily clad and near the equator, 10 to 15 minutes of peak sunshine produces about 20,000 IU.
However, Hollis says a dark-skinned person requires 10 times that exposure to make an equivalent amount of D. And a 70-year-old person makes only a quarter of the vitamin D that a 20-year-old can produce. During the fall and winter in higher latitudes (above 37 degrees latitude — San Francisco is just above 37 degrees), the levels of UVB fall below the threshold needed for even a fair-skinned person to produce enough vitamin D. Additionally, complete cloud coverage cuts UV energy in half, and shade reduces it by 60 percent.
Sunscreens also block UVB waves, the wavelength that stimulates the skin’s vitamin D production. According to Michael F. Holick, M.D., Ph.D., of the Boston University School of Medicine, a sunblock with SPF 8 reduces the skin’s vitamin D production by 95 percent. “If you wear sunscreen ‘properly,’ you’ll become vitamin D deficient,” he says.
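Treating the attenuation figures quoted above (complete cloud cover cuts UV energy in half, shade reduces it by 60 percent, SPF 8 sunscreen cuts the skin’s vitamin D production by 95 percent) as multiplicative factors gives a rough feel for how quickly usable UVB disappears. This is an illustrative back-of-envelope sketch using only the article’s numbers, not a photobiology model:

```python
# Rough effective-UVB estimate from the attenuation factors quoted in the article.
# All factors and the multiplicative assumption are simplifications for illustration.
def effective_uvb(baseline=1.0, cloud=False, shade=False, spf8=False):
    """Return the fraction of baseline UVB reaching the skin."""
    factor = 1.0
    if cloud:
        factor *= 0.5   # complete cloud cover cuts UV energy in half
    if shade:
        factor *= 0.4   # shade reduces it by 60 percent
    if spf8:
        factor *= 0.05  # SPF 8 cuts vitamin D production by 95 percent
    return baseline * factor

print(effective_uvb(cloud=True))             # half of full sun
print(effective_uvb(cloud=True, spf8=True))  # a few percent of full sun
```

Even one factor alone halves exposure; stack a cloudy day with sunscreen and only a few percent of the vitamin-D-producing wavelength gets through.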
But what about skin cancer? Despite increased sunscreen usage, skin cancer rates have risen. One reason is that, until recently, sunscreens didn’t impede deeply penetrating UVA light, and presumably, our false sense of security led to more time in the sun and an increase in skin cancer.
What should you do? “Be sensible,” Holick advises. “Know your own skin sensitivity.” For instance, if you turn pink after 30 minutes in the summer sun, then spending five to 10 minutes (in a bathing suit) in the sun should generate plenty of vitamin D. After that, apply sunscreen, cover up and seek shade.
Eat D-licious foods. Only a few foods contain much vitamin D. Sources of vitamin D include cod liver oil (1,360 IU per tablespoon); oily fish such as salmon, sardines and mackerel (about 350 IU per 3.5 ounces); eggs (about 20 IU per yolk); and fortified milk, soy milk and orange juice (98 IU per 8-ounce serving). (We’re testing pasture-raised chicken eggs for vitamin D as part of our 2007 egg testing project. See October/November 2007 for the initial results. — Mother)
Shiitake mushrooms can be an exceptional source of vitamin D, as noted in research published in Paul Stamets’ book, Mycelium Running. Shiitake mushrooms grown and dried indoors have only 110 IU of vitamin D per 100 grams. But when the shiitakes were dried in the sun, the vitamin D content rose to 21,400 IU per 100 grams. Even more surprising, when the mushrooms were dried with their gills facing up toward the sun, their content rose to 46,000 IU!
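The per-serving values above make it easy to tally a day’s dietary vitamin D against the 800 to 1,000 IU many experts recommend. A minimal sketch, using only the IU figures quoted in the article (the example day’s servings are hypothetical, and this is arithmetic, not dietary advice):

```python
# Per-serving vitamin D values (IU) as quoted in the article.
FOOD_IU = {
    "salmon (3.5 oz)": 350,
    "egg yolk": 20,
    "fortified milk (8 oz)": 98,
}

def daily_intake(servings):
    """Sum vitamin D (IU) for a dict of {food: number_of_servings}."""
    return sum(FOOD_IU[food] * n for food, n in servings.items())

# Hypothetical day: one serving of salmon, two eggs, two glasses of fortified milk.
day = {"salmon (3.5 oz)": 1, "egg yolk": 2, "fortified milk (8 oz)": 2}
total = daily_intake(day)
print(f"Estimated intake: {total} IU")  # 350 + 40 + 196 = 586 IU
target = 800  # low end of the 800-1,000 IU range many experts suggest
print(f"Shortfall vs {target} IU: {max(0, target - total)} IU")
```

Even a fish-and-fortified-milk day falls short of the higher targets, which is why the next section turns to supplements.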
Take supplemental D. Most North Americans can’t maintain healthy blood levels of D from sunlight and good diet. Therefore, many experts recommend 800 to 1,000 IU a day — several times the government guidelines of 200 to 600 IU.
The exact amount depends upon several things. If you’re dark-skinned or spend little time outdoors, you’ll obviously need more than a Caucasian lifeguard. And if you’re already deficient in vitamin D, you’ll need hefty doses just to get your blood levels up to normal.
If you’re pregnant or nursing, you’ll also need more. Hollis and colleagues are currently researching the effects of different vitamin D doses in pregnant women of various races. Until the results of that trial are finalized, he can’t recommend more than 2,000 IU per day.
When asked how much vitamin D they normally take, Hollis says he takes 4,000 IU a day, while Holick says each member of his family takes 1,000 IU of D3 a day. Holick also spends reasonable amounts of time outdoors.
Be aware that many supplements provide vitamin D as ergocalciferol (vitamin D2), rather than cholecalciferol (vitamin D3). D3 is the form naturally occurring in our bodies and is more effective.
No one really knows how much vitamin D might be too much; however, toxicity is exceedingly rare. The Food and Nutrition Board sets the upper level for daily dietary intake at 2,000 IU, though Hollis thinks that’s not enough to maintain health at northern latitudes. Accumulated research demonstrates 10,000 IU of vitamin D3 to be a more realistic upper limit.
Who’s at Risk?
The only way to measure vitamin D blood levels is to check a form of vitamin D called 25-hydroxyvitamin D. Doctors don’t routinely perform this test, and Holick thinks universal screening would be too expensive. If you’re at risk for, or already have symptoms of, deficiency, then you might want the blood test.
Just who’s at risk? Research shows the following populations face greater risk of vitamin D deficiency:
Dark-skinned people. Melanin darkens skin and absorbs UV light, which protects against sun damage and limits vitamin D production. Holick’s research shows that 80 percent of African-Americans studied in Boston over age 65 were vitamin D deficient — at the end of summer!
Northerners. People who live at higher latitudes where winters are long and dark run a higher risk of vitamin D deficiency. Holick notes that even fair-skinned people living above 37 degrees latitude make little vitamin D during the winter.
Older adults. The skin production of vitamin D and its activation in the kidneys declines with age. Further, the elderly typically spend more time indoors. Vitamin D deficiency in this age group contributes to osteoporosis and falls.
Breast-fed infants. Research in Iowa by Hollis and colleagues found that vitamin D deficiency, including severe deficiency, was common among breast-fed infants who did not receive vitamin D supplements. Breast milk is low in vitamin D because nursing mothers are so often deficient themselves. Unfortunately, early deficiency can have lifelong consequences.
People with intestinal disorders. Disorders that interfere with fat absorption include celiac disease, Crohn’s disease, pancreatic insufficiency, liver disease and cystic fibrosis. Fat-soluble vitamins such as D are absorbed from the intestine with dietary fat, so people with a limited ability to absorb fat may need vitamin D supplements.
Sun avoiders. People who cover up for religious, cultural or health reasons also run the risk of deficiency. Clothing blocks UVB waves, interfering with or preventing the skin’s formation of vitamin D.
The obese. In a British study, obese people were twice as likely as those of normal weight to be low in vitamin D. Hollis explains it’s because fat sponges up vitamin D and stores it, but doesn’t release it.
By Gina Kolata
The nation is in the grip of what looks like a terrifying melanoma epidemic: melanoma is being diagnosed at more than double the rate it was in 1986, increasing faster than any other major cancer.
But why the numbers are increasing is a contentious subject, so touchy that one dermatologist called it “the third rail of dermatology.”
Many dermatologists argue that melanoma, the most deadly of the skin cancers, is in fact becoming more common. And they recommend regular skin cancer screening as the best way to save lives. But some specialists say that what the numbers represent is not an epidemic of skin cancer but an epidemic of skin cancer screening, and a new study lends support to this view.
In the study, published in the current issue of The British Medical Journal, Dr. H. Gilbert Welch of the Department of Veterans Affairs in White River Junction, Vt., and Dartmouth Medical School and his colleagues analyzed melanoma’s changing incidence and death rate over time.
The researchers used Medicare data to track the swift rise in melanoma cases since 1986 and data compiled by the National Cancer Institute to track the death rate and the number of people with early and late-stage disease.
They found that since 1986, skin biopsies have risen by 250 percent, a figure nearly the same as the rise in the incidence of early stage melanoma. But there was no change in the melanoma death rate. And the incidence of advanced disease also did not change, the researchers found.
Dr. Welch and two colleagues, Dr. Steven Woloshin and Dr. Lisa M. Schwartz, argue that if there were really an epidemic of melanoma – for example, if something in the environment were causing people to get the skin cancer – scientists should see increases in cancers at all stages. This is what happened with lung cancer caused by smoking, and with other cancers caused by toxic substances.
The fact that the increase was seen only in very early stage disease was a tip-off that the epidemic might be less than it seemed, Dr. Welch said.
And that, he says, leads to a difficult question. The point of screening for melanoma is to reduce the death toll from the cancer. But if screening has not altered the number of patients with advanced disease or lowered the death rate, what is its benefit?
“That’s the million dollar question,” Dr. Welch said. “It certainly raises questions about whether we’re doing any good.”
The researchers hastened to add that people who notice suspicious moles or spots should not hesitate to see a doctor. But skin cancer screening, they said, is directed at healthy people who have no reason to suspect that anything is wrong.
The federal Preventive Services Task Force, which makes screening recommendations, has said that there was insufficient evidence to recommend either for or against skin screening.
But the American Cancer Society recommends regular skin screening, as does the American Academy of Dermatology, which sponsors Melanoma Mondays and free skin screening clinics that see more than 200,000 people a year.
Speaking for the dermatology academy, one of its past presidents, Dr. Darrell Rigel, a dermatologist in New York, said it only made sense to look for melanomas and remove them before they spread. “As dermatologists, we see people die every day from melanoma,” he said. “And there’s another thing we know with melanoma that’s very clear. The earlier you find it and treat it, the better the survival.”
More and more people are having skin biopsies, Dr. Rigel said, but he questioned Dr. Welch’s conclusion that the biopsies were leading to excessive diagnoses of melanoma. “I would say the inverse is more likely,” Dr. Rigel said. “There are more melanomas and therefore more biopsies.”
At the American Cancer Society, Dr. Len Lichtenfeld, an oncologist, said his group reviewed the same data as Dr. Welch and came to a different conclusion. Screening, he said, appears to be saving lives.
As evidence Dr. Lichtenfeld pointed to a trend in the data indicating that the death rate from the disease rose slightly year by year until about a decade ago. That is consistent with an increase in serious cases of melanoma.
Now, he said, “there has been a suggestion in the data that the death rates in the Medicare age group are going down,” an effect that would be expected if screening was working.
He added, “We agree that some of the melanomas are biologically indolent, but we also feel that when we look at the trend in the data and the suggestion of decreased mortality that there has been a benefit from increased surveillance for the disease.”
Dr. Welch disagrees. He said the cancer society was “taking tiny, tiny differences” in death rates from year to year and “putting a huge microscope on it.”
In fact, he said, the death rate has been basically flat since 1986, although it bounces around slightly from year to year as a result of statistical fluctuations.
“We don’t disagree about the data,” Dr. Welch said. “We disagree about the interpretation. We are not arguing that there is zero change in disease burden. We are arguing that most of the newly diagnosed cases are the result of increased screening.”
In a 1997 article, two dermatologists, Dr. Robert Swerlick and Dr. Suephy Chen of Emory University and the Atlanta Veterans Affairs Medical Center, wrote that while some people might be saved by screening, there also are risks from a melanoma diagnosis.
“After a patient has received the diagnosis of melanoma, obtaining insurance can be extremely difficult,” they wrote. “The diagnosis of melanoma also results in heightened scrutiny of all first-degree relatives and family members of the patient, and if increased surveillance leads to increased diagnosis, this process may also put them at risk for the diagnosis of melanoma.”
Others who study cancer screening said that Dr. Welch’s arguments were convincing and that he had raised issues about the national melanoma epidemic that could not easily be dismissed.
Dr. Barnett Kramer, associate director of the Office of Disease Prevention at the National Institutes of Health, said that, of course, the ideal way to know if a screening program works is to do a randomized clinical trial, assigning some people to screening and not others, then seeing if the screening saved lives. Absent such a study, he said, he finds Dr. Welch’s paper convincing.
“It doesn’t look like our melanoma awareness campaigns have made an impact on mortality or on late-stage disease,” Dr. Kramer said.
Dr. Russell Harris, a professor of medicine at the University of North Carolina and a member of the Preventive Services Task Force, said the new paper “should certainly make us worry about screening.”
That also is the view of Dr. A. Bernard Ackerman, emeritus director of the Ackerman Academy of Dermatopathology in New York. Dermatologists have gone too far, he said, with screening clinics, removing innocuous moles and diagnosing melanoma too freely.
It makes sense for a doctor to look at your skin during a regular physical exam, Dr. Ackerman said, but screening programs have led to an excessive zeal for skin biopsies and for diagnosing melanoma.
“There has been a mania for taking off these moles that are of no consequence,” Dr. Ackerman said. “We’re talking about billions and billions of dollars being spent, based on hype.”
While there may be questions about screening programs, Dr. Swerlick said that few in his field wanted to discuss their merits. He and Dr. Chen tried to open the debate themselves a few years ago but were met with hostility or disdain, he said.
“My colleagues in private practice know what we have written and they can’t imagine that it could be correct,” Dr. Swerlick said.
“This is a very touchy subject,” he added.
And he appreciates why. “Many well-intentioned people have focused their clinical careers on this,” he said, “and I can understand how unnerving it might be to be faced with the prospect that their efforts have been directed toward something ineffectual.”
For his part, Dr. Welch says that early detection “is a double-edged sword and people need to remember that.”
A few people might be saved because a cancer is found early, he said, but many, many more will be thrown into the medical mill when there is nothing wrong with them.
“People should realize that is the price we pay for screening,” Dr. Welch said, and although screening is widely promoted, “we ought to know whether it helps.”