By Marc Sorenson, EdD, Sunlight Institute
Sometimes we come across research, conducted many years ago, that carries a great message of health for those who seek the sun. In this case the research was done in 1988 and involved a study on mice that were given a chemical protocol designed to induce skin cancer. Half of the mice were also given ultraviolet B (UVB) irradiation during that protocol. After 20 weeks of a cancer initiation-promotion protocol with two carcinogenic chemicals, the mice that were irradiated with UVB had 75% fewer cancerous tumors per mouse.
Another 24-week study, reported in 1992, showed that 12 weeks of UV radiation, applied either before or during the chemical initiation of cancer, resulted in a 61% reduction in tumors in the mice irradiated before the chemical treatments and a 50% reduction in the mice irradiated during the treatments.
The message is this: exposure to UV light from sunlamps or sunshine may be protective against skin cancer development. So what’s new? Many of us have been promulgating that message for many years, and this research simply shows that we were not the first to understand the cancer-preventive influences of sunlight.
Gensler HL. Prevention of chemically induced two-stage skin carcinogenesis in mice by systemic effects of ultraviolet irradiation. Carcinogenesis. 1988 May;9(5):767-9.
 Gensler HL, Simpson PJ, Powell MB. Inhibition of 12-O-tetradecanoylphorbol-13-acetate-induced tumor promotion in murine skin by systemic effects of ultraviolet irradiation. Photochem Photobiol. 1992 Jul;56(1):25-30.
By Marc Sorenson, EdD, Sunlight Institute
Whereas melanoma, the deadly skin cancer, is inversely associated with sunlight exposure (more sunlight exposure, less melanoma), the same is not true for non-melanoma skin cancer (NMSC), which is directly associated with sunlight exposure. NMSC is rarely fatal unless the immune system is compromised by other diseases or by anti-rejection drugs. It has also been shown that NMSC is associated with a lower risk of melanoma and many other cancers.
I am not suggesting that we contract NMSC in order to prevent melanoma. Correct nutritional habits can reduce the risk of both NMSC and melanoma, and it should be remembered that if someone does contract an NMSC, it can be easily removed. Melanoma, however, can be deadly. The best bet is to eat wisely and obtain plenty of regular sun exposure so that the risk of melanoma is dramatically decreased.
NMSC is often used as a marker for sunlight exposure and is compared with various diseases beyond cancer to determine whether sunlight exposure is associated with those diseases. Dr. Bill Grant just sent me a paper showing that among people over 70 with NMSC, the risk of Alzheimer’s disease (AD) is profoundly decreased; in fact, those with NMSC had a 79% reduced risk of Alzheimer’s. Stated another way, those without NMSC had about five times the risk of the disease. Of course, this demonstrates the value of sunlight in reducing AD.
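The arithmetic linking "79% reduced risk" to "about five times the risk" is worth making explicit: a 79% reduction leaves a relative risk of 0.21, and inverting that ratio gives roughly 4.8. A minimal illustrative sketch (only the 79% figure comes from the study above; the conversion itself is generic):

```python
# Convert a stated risk reduction into the equivalent risk multiplier
# for the group without the protective factor. A 79% reduction means
# the protected group's relative risk is 1 - 0.79 = 0.21; the other
# group therefore carries 1 / 0.21, or roughly 4.8 times, the risk.

def inverse_relative_risk(reduction: float) -> float:
    """Given a fractional risk reduction, return the risk multiplier
    for the comparison group lacking the protective factor."""
    relative_risk = 1.0 - reduction
    return 1.0 / relative_risk

print(round(inverse_relative_risk(0.79), 1))  # 4.8, i.e. "about five times"
```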
Let’s protect our minds as we age by getting plenty of non-burning sunlight! Search the Sunlight Institute site to learn more about how Alzheimer’s is influenced by sunlight and vitamin D.
 White RS, Lipton RB, Hall CB, Steinerman JR. Nonmelanoma skin cancer is associated with reduced Alzheimer disease risk. Neurology. 2013 May 21;80(21):1966-72.
By Marc Sorenson, EdD, Sunlight Institute
In the online newspaper the Irish Examiner, there is a provocative headline: “Why a sunscreen can put your health in the shade.” Helen O’Callaghan, the author, starts out well by talking about how sunscreens block vitamin D production from sun exposure. She then progresses through a series of diseases that are related to vitamin D deficiency: bone weakness, a compromised immune system, cancer, cardiovascular disease, diabetes, inflammatory diseases, adverse pregnancy outcomes and allergies.
By Bob Berman
Vitamin D, produced when skin is exposed to sunlight, is essential for our bodies. Unfortunately, modern lifestyles have minimized the time we spend under the sun. The Sun’s Heartbeat explains why a tan isn’t as bad as previously thought.
The first scenes in one Sun-tragedy unfolded long before there were written records of any kind. Spurred by events we can only guess at, a human exodus began 50,000 to 70,000 years ago, when our ancestors migrated away from the tropics and the equatorial region’s strong sunlight. Immediately, people developed vitamin D deficiencies.
Our bodies make vitamin D when our skin is struck by the Sun’s ultraviolet rays. Because UV intensity declines dramatically with lower Sun angles, people in temperate regions, and especially those in even higher latitudes, receive as little as 10 percent of the UV experienced by those near the equator. As our ancestors migrating north developed vitamin D deficiencies, the results were swift and brutal. They were removed from the breeding pool by a cruel Darwinian process: the fetus inside a woman with rickets (a disease resulting from low vitamin D) is unable to emerge from her body, and both die in childbirth.
Within just a few thousand years, natural selection had turned some people’s skin white, and they were now able to manufacture ample vitamin D even from the reduced Sun intensity of the higher latitudes. (Melanin, the pigment that makes skin dark, is a natural sunblock, needed because naked bodies near the equator can suffer from too much ultraviolet exposure.) In North America and northern Europe, the climate was warm enough that skin could be almost fully exposed for more than half the year, and bodies stored vitamin D in muscle and fat. A new balance had been struck.
But starting a century ago, everything changed. First, the United States and Europe went from a mostly outdoors agrarian society to a mostly indoors manufacturing one. Then people started driving around in vehicles surrounded by windows. Glass prevents any vitamin D production because it blocks the Sun’s UV. When air-conditioning became widely available starting in the late 1950s and then got cheaper in the 1970s, people stopped keeping their windows open. Fixed-pane units became increasingly popular. The only sunlight that reached us in our homes and workplaces came through UV-stopping glass.
The last straw was sunblock. It did not even exist until thirty years ago. The initial UV-reducing creams, which cut exposure only in half, were marketed in the 1950s to promote tanning, not to totally screen out ultraviolet rays. Then, in the 1980s, a new product came on the market: sunblock. With SPF (sun protection factor) numbers such as 30 and 45, sunblock essentially stops the body’s vitamin D production cold. At the same time, people were advised to cover themselves with these lotions throughout the summer months. Even the medical establishment urged hiding from the Sun as a way to counter skin cancer.
The metamorphosis was complete: we had become like the Morlocks in H. G. Wells’s book The Time Machine, shielded almost totally from sunlight’s UV.
Enter modern vitamin D researchers such as John Cannell, MD, executive director of the Vitamin D Council, a nonprofit educational corporation that believes that “many humans are needlessly suffering and dying from Vitamin D Deficiency.” Cannell is no ordinary medical doctor. He’s no ordinary researcher either. He is a proselytizer, the first in the theater to shout “Fire!” when the smoke appears, while there’s still time to get out. And these days, he’s very, very passionate. He believes that human beings have unwittingly transformed themselves into something uniquely and self-destructively unnatural.
“We are the first society of cave people,” he lamented to me in 2010. “In the development process of creating the skin, nature never dreamed that we’d deliberately avoid the Sun so thoroughly.”
What Cannell and a growing legion of researchers are decrying are the past three decades of newspaper and TV scare stories that have made the public afraid of the Sun. The consequence, they believe, is that our blood’s natural vitamin D levels are just a tiny fraction of what nature intended. And this is producing an avalanche of horrible consequences that include vastly increased rates of cancer.
That vitamin D is super-important is no longer in doubt. It has become the new needed supplement, recommended increasingly by family doctors and the popular media alike. The March 2010 Reader’s Digest calls vitamins in general “a scam” and urges people to take no daily supplements whatsoever – with the single exception of 1,000 international units (IU) of vitamin D3, the form most recommended as a supplement.
This sudden interest has been sparked by a spate of studies strongly indicating that vitamin D is the most powerful anticancer agent ever known. Robert Heaney, MD, of Creighton University, a vitamin D researcher, points to thirty-two randomized trials, the majority of which were strongly positive. For example, in a big study of women whose average age was sixty-two, subjects who were given a large daily vitamin D supplement enjoyed a whopping 60 percent reduction in all kinds of cancers after just four years of treatment compared to a control group.
The skeptical might well wonder how, when cancer typically takes decades to develop, such a huge drop can be detected after just a few years. Heaney believes it’s because vitamin D prevents tiny predetectable tumors from growing or spreading. “That’s the kind of cancer I’d want to have – one that never grows,” he told me in June 2010.
The Canadian Cancer Society raised its vitamin D intake recommendations to 1,000 IU daily in 2009. But Cannell, Heaney, and others think that even this is still way too low.
“I went to a conference and asked all the researchers what they themselves take daily and give to their families,” Heaney said. “The average was 5,500 IU daily. There is certainly no danger in doing this, since toxicity cannot arise in under 30,000 IU a day.”
Why is this vitamin D craze happening now? It sounds suspiciously familiar – like the antioxidant craze of the 1990s, when everyone was gobbling vitamin E to guard against “free radicals.” Or the Linus Pauling–led vitamin C frenzy of the 1970s. Recent studies have shown that all those vitamins have no effect on mortality whatsoever. Indeed, a multivitamin a day now seems to be no better for your health than gobbling a daily Hostess Twinkie. Perhaps our bodies were not designed to get flooded with vitamins. Or maybe the couple of dozen known minerals and vitamins are only the tip of the health iceberg, and what’s important are hundreds, or perhaps thousands, of trace substances of which we are not yet even aware.
Yet it is here, in a discussion of the natural environment in which our bodies were fashioned, that vitamin D makes so much sense. After all, our bodies create it naturally out of the Sun’s ultraviolet rays.
Spending just ten minutes in strong sunlight – the kind you get from 11:00 AM to 3:00 PM between April and August – will allow your body to make as much vitamin D as you would get from drinking two hundred glasses of milk. This is astonishing. Asks John Cannell rhetorically, “Why does nature do this so quickly? Nature normally doesn’t do this kind of thing.”
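The “two hundred glasses of milk” comparison is easy to check with rough numbers: fortified milk supplies about 100 IU of vitamin D per glass, while a short session of strong summer sun is commonly estimated to produce on the order of 10,000 to 20,000 IU. A back-of-the-envelope sketch (both per-glass and per-session figures are outside approximations, not values stated in this passage):

```python
# Rough comparison of cutaneous vitamin D production versus fortified milk.
# Assumptions (approximate, for illustration only):
#   - fortified milk: ~100 IU of vitamin D per 8 oz glass
#   - one short full-body midday summer exposure: ~10,000-20,000 IU

IU_PER_GLASS_MILK = 100
IU_PER_SUN_SESSION = 20_000  # upper end of common estimates

glasses_equivalent = IU_PER_SUN_SESSION / IU_PER_GLASS_MILK
print(glasses_equivalent)  # 200.0 glasses, matching the article's figure
```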
The implied answer, of course, is that we were designed to have a high and steady level of this vitamin in our bodies. Yet as more and more people are tested, researchers are finding serious vitamin D deficiencies in virtually all of the population of the United States, Canada, and northern Europe. The reason? According to Cannell and the other doctors on the Vitamin D Council, we have been hiding from the Sun for decades.
The results may be even worse than we realize. Many researchers now fear that the explosive increase in autism is a result of pregnant mothers having close to no vitamin D in their bodies and then young babies and infants being similarly shielded from the Sun. The Centers for Disease Control and Prevention (CDC) says that virtually no infants are getting enough vitamin D. Even using the CDC’s lower, pre-2011 recommendations for what the body should have, the figures showed that 90 percent of infants were deficient.
According to Cannell, the highest autism rates occur in areas that have the most clouds and rain, and hence the lowest blood levels of vitamin D. A Swedish study has strongly linked sunlight deprivation with autism. Moreover, blacks, whose vitamin D levels are half those found in whites living at the same latitudes, have twice the autism rates. Conversely, autism is virtually unknown in places such as sunny Somalia, where most people still spend most of their time outdoors. Yet another piece of anecdotal evidence is that autism is one of the very few afflictions that occur at higher rates among the wealthier and more educated – exactly the people most likely to be diligent about sunscreen and more inclined to keep their children indoors.
As we saw in assessing links between earthly events and sunspot fluctuations, it’s perilous to assign connections too quickly, and autism in particular is a can of worms. Nonetheless, these early threads should set off alarms: it might be wise for pregnant women and mothers of small children to immediately start exposing themselves and their kids to more sunlight.
When Cannell was in medical school in 1973, he was taught that human breast milk contains little or no vitamin D. “This didn’t make sense,” he said during a phone conversation with me in 2011. “Why would nature ever deprive a nursing infant of this vital substance?” Then it came to him: “When pregnant women start taking 5,000 international units of vitamin D daily, their milk soon contains enough vitamin D for a breast-feeding baby. So there’s the key to how much a woman should naturally be getting every day.”
In contrast to all this, and to the great annoyance of physicians and researchers on the Vitamin D Council, the FDA continued to advise only 400 IU of D3 daily as of early 2011. The agency officially regards most vitamin D studies as “incomplete” or “contradictory” and clearly has taken a cautious, go-slow approach.
In November 2010, the National Academy of Sciences’ Institute of Medicine issued its first new recommendations about the vitamin since 1997, and many people were disappointed. The institute did boost its recommended daily amounts to 400 IU for infants, 600 IU for most adults, and 800 IU for those over age seventy. It also said there was no harm in taking up to 10,000 IU daily, although it conservatively adopted 4,000 IU as the official recommended upper limit.
According to Cannell, the new recommendations are still “irrelevant dosages.” Michael Holick, MD, of Boston University, another vitamin researcher, agreed, saying that he personally takes 3,000 IU daily.
Cannell told me that the National Academy of Sciences report was a “scandal” and that four physicians had disgustedly resigned from the committee that put out the paper. “Commonsense aspects are totally lacking,” he said. “For example, they urge infants to get 400 IU daily, but adults just 600 IU. Yet this vitamin is distributed in muscle and fat. The more you weigh, the more you should be getting. It doesn’t make sense.”
“Listen,” he added, “everyone knows that there is an explosion of childhood cases of autism, asthma, and autoimmune disease. It all began when we took our children out of the Sun. Starting twenty-five years ago, a perfect storm of three events has changed how much sunlight children get. First came the scare of childhood sexual predators in the early eighties, then the fear of skin cancer, and finally the Nintendo and video game craze. Nowadays, kids do not play outdoors. Playgrounds are empty. You’re a bad mother if you let your child run around. And it’s almost a social services offense if your kid gets a sunburn. Never before have children’s brains had to develop in the absence of vitamin D.”
Since this is not a medical book, I can only pass on the recommendations of those in the forefront of vitamin D research. Their best advice is to go in the Sun regularly without burning. Wear as little clothing as you can. You know how much Sun you can handle without turning red. Unless you have a very light complexion and blond or red hair, you should be able to expose yourself safely to ten to twenty minutes of strong sunlight at a time. Lie out in the Sun in shorts for five to ten minutes on each side. The key to UV intensity is Sun height. If your shadow is shorter than you are, your body will produce a good amount of vitamin D.
After experiencing twenty minutes of unprotected midday Sun from May to July, or a full hour or more during March, early April, and late August through October, you can certainly use sunblock. The experts say to buy the kind whose active ingredient is either zinc oxide or titanium dioxide. Most other kinds will be absorbed by the skin, then enter the bloodstream and circulate. “You might as well drink the stuff,” Cannell says disdainfully.
During the low-Sun winter months, you need to spend much more time sunbathing and probably take a vitamin D supplement. The experts are currently urging 2,000 to 3,000 IU daily.
Why not skip the Sun altogether and just pop the pills year-round? Some doctors, including those responsible for the 2010 National Academy of Sciences report, suggest doing exactly that. They figure that you can have it all – nice, high vitamin D serum levels plus no UV exposure, with its skin cancer risk. But others believe that’s a bad idea. “Some of my colleagues think D3 supplements are enough,” Cannell says. “But that supposes we know everything. I suspect that we do not know everything. Natural sunlight has to be the preferred route whenever possible.”
Everyone should use solar power wisely and not go totally bonkers. There’s no need to fry. But whatever extra skin cancer risk we might assume certainly seems to me to be a reasonable price to pay, considering the benefits. It now appears that adequate sunlight-mediated vitamin D might prevent as many as 150,000 cancer deaths a year in the United States alone and also reduce infections, bone problems, and perhaps, though more science is needed, even autism and asthma rates. Of course, on the other side of the balance beam, melanoma causes 8,500 US deaths a year. Every activity from bicycle riding to barroom brawling involves some balancing of risks, and the decision of what trade-offs to make is, of course, yours alone.
Tomorrow is a new day. As the Sun rises, its orange beams will cast magical rays in the morning mist. Is the Sun our enemy or our friend? Will it take our life or save it?
A young woman who avoided the sun has died after contracting a rare form of skin cancer.
Cerys Harding, 21, knew nothing about her devastating illness until it had spread across her body into her brain, chest and spine and it was too late for doctors to act.
Her family have struggled to come to terms with the tragedy as Cerys did not like sunbeds or even sunbathing.
Mum Beverley, 50, said: “She was a girl who never ever sat in the sun, she hated the sun.
“She had dark hair and dark eyes and it didn’t make sense, none of it did.
“She didn’t use sunbeds or anything like that.
“She was the only person on the beach that had a towel on her as well as under her.”
University student Cerys had only recently celebrated her birthday in November when doctors discovered the cancer.
Beverley said her daughter was at the family home in Severn Grove, Canton, Cardiff, when the initial signs of what was to come appeared.
“We came home from town because she rang us saying she’d had this funny turn and she didn’t know what had happened.
“She remembered getting off the floor and then waking up again 45 minutes later in bed and she’d been sick,” Beverley said.
“I just rang NHS Direct because she didn’t seem really ill and we made an appointment for later that day.
“At the doctor’s we were told she needed a brain scan but we still weren’t really worried. We then went up to Llandough Hospital to have the scan and it all snowballed from there.”
The family described Cerys as an “angel” and “a parent’s dream” and said her first response was concern for them.
Dad David, 52, said: “She was worried about leaving us. That’s the first thing she said. We were all blown to bits but that was just how she was, always thinking of other people.
“All she said was that it wasn’t meant to be.”
Beverley added: “The only person she ever fell out with in her life was a boyfriend at uni. And after the diagnosis she said the one thing she had to do was to go and make her peace with him, which she did.
“That was the kind of girl she was, she didn’t like upsetting anyone.
“When she was younger if I ever shouted at Lloyd it would be Cerys who would be the one crying, she cared so much about other people.”
Both David and Beverley took time off work to care for Cerys and took her on one last holiday to Paris following the news – while extended family also flew over from Australia for one final Christmas together.
And while the occasion was extremely difficult for the family, Cerys made sure they all enjoyed it.
“It was as normal as it could be,” her brother Lloyd, 24, said.
“She told us ‘I don’t want everyone crying all over me, I just want to carry on as normal’, so that’s what we did.”
The family said Cerys, a former pupil of Severn Road Primary School and Cantonian High School, had aspirations to be a primary school teacher.
They added she had always been academic and said even though she did not finish the full three years of her history degree, Swansea University awarded it to her on the merit of the work she had already done.
“At that age you’re supposed to be indestructible,” Beverley added.
“Parents worry about their children when they’re babies but when they get to a certain age, never in a million years did we think anything like this would ever happen to Cerys as she still had so much to give.
“She was as honest as the day was long, she never did anything wrong. She said remember the good times, but the thing is we never had any bad times.”
By Jonathan Benson
Exposing skin to natural sunlight every day is the best way for the body to receive adequate levels of health-promoting and disease-preventing vitamin D. But in what can only be described as carrying on a legacy of pseudo-scientific stupidity, the National Council on Skin Cancer Prevention (NCSCP) — whose members include the American Cancer Society (ACS) and the American Academy of Dermatology (AAD) — has recently come out urging people to avoid sunlight exposure at all costs, and instead take vitamin D supplements.
While vitamin D supplements can be helpful in treating cases of vitamin D deficiency, or during the winter months when sunlight exposure is limited, it is sheer lunacy to suggest, as the NCSCP has, that the sun is dangerous and must be avoided. Henry W. Lim, MD, co-chair of the NCSCP and chairman of dermatology at Henry Ford Hospital, actually stated that he believes it is “not appropriate” to expose skin to ultraviolet (UV) radiation from sunlight.
UV radiation, however, is precisely what skin needs to be exposed to in order to produce vitamin D. And numerous recent studies have confirmed that not only is natural sunlight beneficial for skin, but avoiding it is responsible for all kinds of diseases, including the reemergence of rickets.
In other words, the NCSCP recommendations are a public health hazard, as not getting enough UV radiation from natural sunlight will destroy health. Even the supplementation recommendations from NCSCP are dismally low, constituting a mere 600 international units (IU) of vitamin D a day. In fact, a recent study published in the journal Anticancer Research explains that the average adult needs around 8,000 IU of vitamin D a day to protect against serious diseases like diabetes and cancer.
To put it bluntly, avoiding the sun out of fear is pure stupidity. With proper antioxidant intake and a gradual conditioning of your skin to accept sunlight without burning, virtually everyone can benefit from regular sunlight exposure without having to accept the fear mongering and lies that have for too long kept people lathering themselves in toxic sunscreens and avoiding the sun at all costs.
To learn more about the many benefits of vitamin D, visit:
By Richard Gray
Researchers studying how sun exposure affects the risk of developing melanomas discovered that those who spent four to five hours in the sun each day over the weekend were less likely to develop tumours.
The findings appear to contradict the commonly-held belief that longer time spent in the sun increases the risk of skin cancer.
Instead, the study shows that while excessive exposure to the sun – and particularly sunburn – can lead to melanomas, regular doses of sun for up to five hours a day at weekends can be protective.
The study comes just days after Andy Flower, the England cricket team’s head coach, underwent surgery to remove a malignant melanoma below his right eye.
Professor Julia Newton Bishop, an epidemiologist who led the research at Leeds University, said it seems regular exposure helps the skin adapt and protect itself against the harmful effects of sunshine. Increased levels of vitamin D made in the skin while exposed to sunlight may also be protective.
Professor Newton Bishop said: “The relationship between the amount of sun we are exposed to and the risk of melanoma is complicated – we have known for a long time that melanomas are something to do with sun exposure and fair skin.
“Our paper suggests that moderate regular sun exposure may actually reduce the risk. We are talking about quite high levels of sun exposure for the protective effect with an average of four to five hours a day at weekends.
“It appears that in moderation, sun exposure can be protective, but it is when you have extreme sun exposure that it becomes a problem. So in the UK sunburn is a potent risk factor because we have a habit of not getting much sun at home and then suddenly exposing our skin when we go abroad.”
Malignant melanoma is the most serious type of skin cancer and around 10,000 people in the UK are diagnosed with the disease each year. The incidence of the disease is rising faster than any other cancer in the UK and has quadrupled since the 1970s. Around 2,000 die each year in this country from skin cancer.
Public health experts blame the rise in skin cancer in the UK on growing use of sun beds and an increase in the number of holidays people take abroad.
Harmful ultraviolet light from the sun is thought to trigger skin cancer by causing damage to the DNA in the skin.
But the new study by Professor Newton Bishop and her colleagues, which is published in the European Journal of Cancer, suggests that regular sun exposure can help the body prevent this damage.
The researchers examined the sun exposure behaviour and skin type of 960 melanoma patients and 687 controls who had not been diagnosed with skin cancer.
After adjusting the results to account for deprivation, they found that participants with fair skin, freckles and blonde or red hair were most at risk of developing melanomas, as were those who had suffered severe sunburn.
But they also found that regular weekend sun exposure of more than five hours had the most significant protective effect against developing melanoma.
Unfortunately for those with sensitive skin, this protective effect was not seen in people who had red hair and freckles, perhaps due to their tendency to burn far more quickly.
The researchers also measured levels of vitamin D in 1,167 of the participants, who were aged between 18 and 76 years old, and found that those who received regular doses of sun exposure at weekends also had raised levels of vitamin D.
Professor Newton Bishop said: “There is some evidence from other studies that suggests that vitamin D may help to reduce melanoma size and improve prognosis, but it could be that there is some adaptation going on in the skin which reduces the damage from ultraviolet light.
“Melanoma, in the UK, is a cancer of people who work inside who have short bursts of sunshine when they are on holiday. If they are working in offices all week, then when they go sunbathing on holiday, they don’t have the protection that might naturally develop.
“Regardless, people need to take steps to avoid getting sunburnt – particularly at this time of year when the days are shorter and there is much less sunshine around. People who go away for winter sun holidays are particularly at risk.”
Researchers at Mayo Clinic have found a significant difference in cancer progression and death in chronic lymphocytic leukemia (CLL) patients who had sufficient vitamin D levels in their blood compared to those who didn’t.
In the Mayo Clinic study, published online in the journal Blood, the researchers found that patients with insufficient levels of vitamin D when their leukemia was diagnosed progressed much faster and were about twice as likely to die as were patients with adequate levels of vitamin D.
They also found solid trends: increasing vitamin D levels across patients matched longer survival times and decreasing levels matched shortening intervals between diagnosis and cancer progression. The association also remained after controlling for other prognostic factors associated with leukemia progression.
The finding is significant in a number of ways. For the first time, it potentially offers patients with this typically slower growing form of leukemia a way to slow progression, says the study’s lead author, Tait Shanafelt, M.D., a hematologist at Mayo Clinic in Rochester, Minn. “This finding may be particularly relevant for this kind of leukemia because although we often identify it at an early stage, the standard approach is to wait until symptoms develop before treating patients with chemotherapy,” Dr. Shanafelt says. “This watch and wait approach is difficult for patients because they feel there is nothing they can do to help themselves.” “It appears vitamin D levels may be a modifiable risk factor for leukemia progression. It is simple for patients to have their vitamin D levels checked by their physicians with a blood test,” he says. “And if they are deficient, vitamin D supplements are widely available and have minimal side effects.”
This research adds to the growing body of evidence that vitamin D deficiency is a risk factor for development and/or progression of a number of cancers, the researchers say. Studies have suggested that low blood vitamin D levels may be associated with increased incidence of colorectal, breast and other solid cancers. Other studies have suggested that low vitamin D levels at diagnosis may be associated with poorer outcomes in colorectal, breast, melanoma and lung cancers, as well as lymphoma.
Replacing vitamin D in some patients has proven to be beneficial, the researchers say. For example, they cite a placebo-controlled clinical trial that found women who increased their vitamin D intake reduced their risk of cancer development.
In this study, the research team enrolled 390 CLL patients into a prospective, observational study. They tested the blood of these newly diagnosed patients for plasma concentration of 25-hydroxyl-vitamin D and found that 30 percent of these CLL patients had insufficient vitamin D levels, defined as a level below 25 nanograms per milliliter. After a median follow-up of three years, CLL patients with insufficient vitamin D were 66 percent more likely to progress and require chemotherapy; they also had a twofold increased risk of death.
To confirm these findings, they then studied a different group of 153 untreated CLL patients who had been followed for an average of 10 years. The researchers found that about 40 percent of these 153 CLL patients were vitamin D deficient at the time of their diagnosis. Patients with vitamin D deficiency were again significantly more likely to have had their leukemia progress and to have died, Dr. Shanafelt says.
“This tells us that vitamin D insufficiency may be the first potentially modifiable risk factor associated with prognosis in newly diagnosed CLL,” he says.
The study was funded by the National Institutes of Health, Gabrielle’s Angel Foundation for Cancer Research, the Henry J. Predolin Foundation, Vysis, Inc., and the Mayo Hematologic Malignancies Fund. The authors declare no conflicts of interest.