Daniel sent us this one — he wants to talk about vitamin D, but not in the usual "take your supplements" way. His point is that vitamin D receptors are distributed throughout basically the entire body, and that sunlight has these far-reaching effects on health that go way beyond bone metabolism. The real question he's asking is, why do we still call it a vitamin when it functions more like a hormone, and what are we missing when we reduce sunlight exposure to just popping a pill?
This is one of those topics where the standard public health messaging has created a weird blind spot. People hear "vitamin D" and they think it's like vitamin C — something you get from food, maybe take a supplement, done. But structurally it's a secosteroid hormone. The "vitamin" label is a historical accident from the nineteen twenties when they were classifying rickets-preventing factors and didn't know what they were dealing with.
A secosteroid hormone. I'm going to let you unpack that, but I want to note — you just did the thing where you dropped a term that sounds like a basketball statistic.
A secosteroid just means the molecule has one of its carbon rings broken open — structurally it's in the steroid family, like testosterone or cortisol, but with that one ring cracked. And the body manufactures it through a multi-step process that starts in the skin with a cholesterol derivative and ultraviolet B radiation. That's not how vitamins work. Vitamins are cofactors we can't synthesize and have to eat. Vitamin D is something we produce endogenously through sun exposure, and then it gets converted in the liver and kidneys into its active hormonal form.
The entire framing is wrong from the jump. When did we figure this out?
The receptor side really exploded in the last twenty to thirty years. There's a landmark review by Holick and Wacker — Michael Holick is basically the godfather of vitamin D research at Boston University — and they catalogued that the vitamin D receptor, the VDR, is expressed in essentially every tissue. Brain, heart, pancreas, prostate, breast tissue, immune cells, skeletal muscle, vascular smooth muscle, you name it. Something like two hundred different genes are under the direct control of the active form of vitamin D.
If the receptor is everywhere, the effects are everywhere. It's not a single-purpose molecule.
This is where the sunlight-versus-supplement distinction becomes really important, because sunlight does things that a vitamin D capsule simply doesn't. There's a separate pathway — worked out by a group at the University of Edinburgh led by Richard Weller — where UV-A radiation hitting the skin triggers the release of nitric oxide from nitrite stores in the skin. Nitric oxide is a vasodilator, and that directly lowers blood pressure. This has nothing to do with vitamin D.
Wait, so sunlight lowers blood pressure through a completely vitamin-D-independent mechanism?
Weller's group exposed people to about twenty minutes of summer sunlight and measured a significant drop in blood pressure lasting about an hour, traceable to nitrite mobilized from the skin into the circulation. The vitamin D synthesis pathway uses UV-B. The nitric oxide pathway uses UV-A. Two different wavelengths, two different mechanisms, both from the same source.
When someone says "just take a supplement and avoid the sun," they're missing that entire second pathway.
There's beta-endorphin production in the skin triggered by UV exposure — that's why sunlight can be mildly addictive and why people report mood elevation that's not just psychological. There's melatonin regulation through the suprachiasmatic nucleus, which is responsive to bright light exposure, particularly morning light. There's even evidence that UV modulates the skin microbiome in ways we're only beginning to understand. Sunlight is a complex biological signal, not just a vitamin D delivery system.
By the way, before we go deeper — DeepSeek V four Pro is generating our script today, so if anything sounds unusually coherent, that's why.
I'll try to keep up my usual level of digression regardless.
I believe in you. So let's go back to the receptor distribution. You said it's in immune cells. That seems like a big deal.
It's huge. The VDR is expressed in basically all immune cell types — T cells, B cells, macrophages, dendritic cells. And when the active form of vitamin D binds to those receptors, it directly induces the expression of antimicrobial peptides like cathelicidin and defensins. These are essentially natural antibiotics that our own cells produce. Cathelicidin is particularly interesting because it's effective against a broad range of pathogens including bacteria, viruses, and fungi.
Your innate immune system is partly solar-powered.
That's not a bad way to put it. There's a reason respiratory infections peak in winter when UV-B exposure drops and vitamin D levels bottom out. The seasonal pattern of influenza has been correlated with vitamin D status in multiple studies. Now, correlation isn't causation, and there are other seasonal factors — humidity, crowding indoors — but the mechanistic evidence through cathelicidin induction is pretty compelling.
What do the actual blood levels look like across the population?
The NIH Office of Dietary Supplements tracks this. In the US, something like forty percent of adults have serum levels below twenty nanograms per milliliter. But that number jumps dramatically in winter and at northern latitudes. Above about thirty-seven degrees north — roughly the line from San Francisco to Richmond, Virginia — there's essentially no vitamin D synthesis from sunlight between November and February. The UV-B photons just don't make it through the atmosphere at the right angle.
I'm above thirty-seven degrees north and I'm a sloth who rarely goes outside. I should probably be concerned.
You absolutely should. But here's where the supplement question gets tricky. The Institute of Medicine set the recommended dietary allowance at six hundred to eight hundred international units per day for most adults, aiming for a blood level of twenty nanograms per milliliter. The Endocrine Society came out with different guidelines suggesting fifteen hundred to two thousand IU per day and a target of thirty. There's genuine disagreement among experts.
What's driving that gap?
Partly it's about what endpoints you're optimizing for. The Institute of Medicine was focused primarily on bone health — calcium absorption, fracture prevention, rickets. Twenty nanograms per milliliter is sufficient for that. The Endocrine Society was looking at the broader pleiotropic effects — immune function, cardiovascular outcomes, cancer risk — where the evidence suggests higher levels might be beneficial, but the randomized controlled trial data is thinner.
That's the thing I always wonder about. Observational studies show all these associations between low vitamin D and bad outcomes — cancer, autoimmune disease, cardiovascular events, depression, cognitive decline. But then the randomized trials with supplements often show null results or very modest effects. How do we square that?
This is probably the central tension in vitamin D research right now. There are a few possible explanations. One is confounding — people with low vitamin D also tend to be less healthy in other ways. They're older, they spend less time outdoors, they may have poorer nutrition, higher body fat percentage. Vitamin D is fat-soluble, so it gets sequestered in adipose tissue, meaning obese individuals often have lower circulating levels. So low vitamin D might be a marker of poor health rather than a cause.
Right, the classic "sick people have low vitamin D, therefore low vitamin D makes you sick" reversal.
There's another possibility I find more interesting, and it connects back to what Daniel was getting at about sunlight versus the vitamin itself. It's possible that vitamin D produced in the skin through sun exposure behaves differently than vitamin D taken orally. When you produce it in the skin, it's bound to vitamin D binding protein in a particular way, and it may have different tissue distribution kinetics.
The delivery mechanism matters.
There's also the fact that sun exposure gives you a sustained, gradual release of vitamin D from the skin into the circulation over hours or even days. With an oral supplement, you get a spike. The pharmacokinetics are completely different.
Then there's the whole question of whether vitamin D is even the active agent in those observational studies, or whether it's a proxy for sunlight exposure, which is doing a dozen other things through those pathways you mentioned — nitric oxide, endorphins, circadian entrainment.
That's the million-dollar question. The VITAL trial was supposed to settle some of this. It was a massive randomized controlled trial — nearly twenty-six thousand participants, run out of Brigham and Women's Hospital, published in twenty nineteen. They gave two thousand IU of vitamin D per day versus placebo and looked at cancer and cardiovascular outcomes. The primary results were essentially null.
Null across the board?
For the primary endpoints, yes. No significant reduction in invasive cancer or major cardiovascular events. But there were some secondary signals. They saw a possible reduction in cancer mortality — not cancer incidence, but death from cancer — that emerged after a couple of years. And there was a hint of benefit for autoimmune disease, published separately. But overall, it was not the slam dunk that vitamin D advocates were hoping for.
Which brings us back to the sunlight-versus-supplement distinction. If the VITAL trial gave people vitamin D pills and mostly got null results, but observational studies of sun exposure show consistent benefits, maybe vitamin D is the wrong part of the story to focus on.
Or at least it's incomplete. Let me give you a concrete example from dermatology. There's a well-known paradox where outdoor workers — farmers, construction workers, lifeguards — have lower rates of melanoma than indoor workers, despite massively higher cumulative sun exposure. The standard narrative is "sun causes skin cancer, therefore more sun equals more melanoma." But the epidemiology doesn't actually support that in a straightforward way.
That seems to contradict everything we hear.
It's more nuanced than the public messaging suggests. Intermittent, intense sun exposure — the kind where you work in an office all week and then get burned at the beach on Saturday — that pattern is strongly associated with melanoma. Chronic, regular occupational exposure actually seems to be protective or at least neutral for melanoma risk, though it does increase risk of non-melanoma skin cancers like basal cell and squamous cell carcinoma. And those are far less deadly.
The messaging that collapsed all of that into "sun is bad, avoid it at all costs" may have been counterproductive.
There's a legitimate debate about this. A paper out of Sweden a few years back, from the Karolinska Institute, looked at all-cause mortality in about thirty thousand women followed for twenty years. They found that women with active sun exposure habits had lower all-cause mortality than sun avoiders. And the effect was dose-dependent — more sun exposure, lower mortality. The reduction in cardiovascular disease mortality was particularly striking.
They controlled for confounding?
They controlled for income, education, physical activity, smoking, alcohol, BMI — the usual suspects. The effect persisted. Sun avoiders had a mortality rate about double that of the highest sun exposure group over the twenty-year follow-up. That's comparable to the effect of smoking.
That's not a rounding error.
No, it's not. And a Danish study using national health registries found that a diagnosis of non-melanoma skin cancer — strongly associated with cumulative sun exposure — was actually associated with lower all-cause mortality compared to the general population. The interpretation being that the sun exposure that gave them the skin cancer also gave them cardiovascular and other benefits that outweighed the cancer risk.
This is going to make dermatologists very uncomfortable.
It already does. And to be fair, the dermatology community's position isn't unreasonable. Skin cancer is real, it's increasing in incidence, and UV radiation is a known carcinogen. The question is about balance and nuance. Blanket sun avoidance may be throwing the baby out with the bathwater, especially at northern latitudes where vitamin D deficiency is endemic.
Let's talk about autoimmune disease. You mentioned the VITAL trial had a signal there.
Yeah, the autoimmune findings from VITAL were published separately and they were actually pretty interesting. Over about five years of follow-up, the vitamin D group had a twenty-two percent reduction in incident autoimmune disease compared to placebo. That's not trivial. And when you look at the mechanistic evidence, it makes sense. The active form of vitamin D is a potent modulator of the adaptive immune system. It tends to shift T cell responses away from the pro-inflammatory Th1 and Th17 phenotypes and toward regulatory T cell phenotypes.
It's not just boosting immunity, it's regulating it. Keeping it from going haywire.
Autoimmunity is essentially the immune system attacking self-tissue. Vitamin D helps maintain self-tolerance. There are epidemiological studies showing that multiple sclerosis risk increases with latitude — the further you are from the equator, the higher your MS risk. This gradient is one of the strongest and most consistent findings in MS epidemiology. And low vitamin D levels in childhood and adolescence are associated with increased MS risk later in life.
MS specifically, or other autoimmune conditions too?
Type one diabetes, rheumatoid arthritis, inflammatory bowel disease, systemic lupus erythematosus — all show evidence of association with vitamin D status or latitude or both. The Finnish type one diabetes story is particularly interesting. Finland has one of the highest rates of type one diabetes in the world, and also very low UV-B exposure for much of the year. In the nineteen sixties, they had a public health program of high-dose vitamin D supplementation for infants — two thousand IU per day. When they reduced that recommendation to four hundred IU per day in the nineteen nineties, type one diabetes incidence started climbing. It's not proof, but it's suggestive.
I want to circle back to the latitude thing, because it's such a strong natural experiment. If vitamin D were just a minor player, you wouldn't expect such consistent geographic gradients for so many different conditions.
The Atlas of Cancer Mortality in the United States, published by the National Cancer Institute, shows clear north-south gradients for multiple cancer types — colon cancer, breast cancer, prostate cancer — all with higher mortality in the north. This was actually one of the observations that got the Garland brothers interested in vitamin D and cancer back in the nineteen eighties. They proposed that sunlight and vitamin D might be protective against colon cancer, and they were initially met with a lot of skepticism.
Now the mechanistic evidence is substantial. Colon cancer cells express the vitamin D receptor. Activation of the VDR in colonocytes promotes differentiation and apoptosis and inhibits proliferation and angiogenesis — all the things you want to suppress in a developing tumor. The observational epidemiology has been largely consistent, though again, the randomized trial evidence from VITAL didn't show a reduction in cancer incidence.
It showed a reduction in cancer mortality after a few years, which is arguably the more important endpoint.
If vitamin D doesn't prevent cancer from starting but helps keep it from becoming aggressive or metastatic, that's still a meaningful benefit. And there's biological plausibility for that — vitamin D's effects on angiogenesis and immune surveillance would be expected to matter more for tumor progression than for initiation.
Okay, let's shift to something more immediate. Mood and cognition. There's a reason seasonal affective disorder is treated with light boxes.
Seasonal affective disorder is interesting because the light box treatment uses visible light, not UV. It's primarily mediated through the eyes and the circadian system, not through vitamin D synthesis in the skin. But vitamin D does seem to play a role in brain function independently. The VDR is expressed throughout the brain, including in the hippocampus, prefrontal cortex, and substantia nigra — regions involved in memory, executive function, and motor control.
There's clinical evidence?
Observational studies consistently find associations between low vitamin D and depression, cognitive decline, and dementia. The randomized trial evidence for depression is mixed but leans toward a modest benefit. A meta-analysis a few years ago found that vitamin D supplementation was more effective than placebo for depressive symptoms, with an effect size that was small but statistically significant. For cognitive decline and dementia, the trials are less conclusive, partly because you'd need very long follow-up to see an effect.
One of the challenges with all of this is that vitamin D deficiency in the modern world isn't just about latitude. It's about lifestyle. We're indoors all the time.
This is a point that doesn't get enough attention. The human body evolved under conditions of fairly constant sun exposure in equatorial Africa. Our skin pigmentation, our vitamin D physiology — all of it is calibrated for high UV environments. Then we migrated to higher latitudes, and our skin lightened to compensate, to allow more UV-B penetration for vitamin D synthesis. But the really dramatic change has been in the last hundred years. Industrialization, urbanization, office work. We've gone from spending most of our time outdoors to spending something like ninety percent of our time indoors.
Then when we do go outside, we're told to wear sunscreen, which blocks UV-B and therefore blocks vitamin D synthesis.
Sunscreen with an SPF of thirty reduces vitamin D synthesis by about ninety-five percent if applied properly. And most people don't apply it properly, so the real-world effect is probably less dramatic. But the principle holds. We've simultaneously reduced our sun exposure through lifestyle and then blocked what little exposure we get. It's a double hit.
What's the practical takeaway? Because I don't want people to hear this conversation and think we're saying "throw away your sunscreen and bake yourself."
The sweet spot seems to be short, regular, non-burning sun exposure. The amount of time needed varies enormously by skin type, latitude, season, and time of day. But as a rough rule of thumb, exposing arms and legs to the sun for about ten to fifteen minutes during midday in summer, for a fair-skinned person at mid-latitudes, is enough to generate something like ten thousand to twenty thousand IU of vitamin D. After that, the vitamin D precursor in the skin starts to degrade as fast as it's produced — there's a self-limiting mechanism that prevents vitamin D toxicity from sun exposure.
That's an important point. You can't overdose on vitamin D from the sun. The body has a built-in governor.
But you can absolutely overdose on oral vitamin D supplements. Vitamin D toxicity, with hypercalcemia and all its complications, is a real thing. It's rare, but it happens when people take very high doses for extended periods. The sun doesn't carry that risk.
Whereas with supplements, people hear "more is better" and start taking fifty thousand IU a week because some influencer told them to.
That's dangerous. The tolerable upper intake level set by the Institute of Medicine is four thousand IU per day for adults. Some experts think that's conservative, and doses up to ten thousand IU per day are probably safe for most people. But beyond that, you're in uncharted territory. I've seen case reports of people taking fifty thousand IU daily for months and ending up with kidney failure from hypercalcemia. It's not something to mess around with.
Sunlight has a built-in safety mechanism that supplements don't. Another point for the sun.
There's the circadian benefit. Morning sunlight exposure is one of the most powerful zeitgebers for the human circadian clock. The intrinsically photosensitive retinal ganglion cells in the eye respond most strongly to blue-enriched light in the morning, and that signal goes directly to the suprachiasmatic nucleus, the master clock. That sets the timing of cortisol release, melatonin onset in the evening, body temperature rhythms — all of it.
If you're indoors under artificial light all morning, you're not getting that signal at the intensity it needs.
Indoor lighting is typically around three hundred to five hundred lux. Outdoor light on a cloudy day is around ten thousand lux. Direct sunlight is a hundred thousand lux. The difference is orders of magnitude. Our circadian systems evolved for outdoor light intensities, and indoor lighting is basically a perpetual twilight. It's not enough to properly entrain the clock.
We've got circadian disruption, vitamin D deficiency, missing the nitric oxide pathway, missing the endorphin pathway. All from the same lifestyle shift.
It's probably not a coincidence that so many conditions rising in prevalence — autoimmune diseases, mood disorders, sleep disorders, certain cancers — have plausible connections to sunlight deficiency through multiple mechanisms. We can't pin it all on vitamin D alone, and that's exactly Daniel's point. Calling it "vitamin D deficiency" frames the problem too narrowly.
It's like calling scurvy "vitamin C deficiency" and then being surprised that eating an orange is better than taking a vitamin C pill. The orange has flavonoids and fiber and other compounds that the pill doesn't. Sunlight is the whole orange.
The research community is increasingly recognizing this. There's a concept called "sunlight exposure recommendations" that some researchers are advocating for, distinct from vitamin D supplementation guidelines. The idea is that public health should have separate recommendations for safe sun exposure, not just tell everyone to take supplements and stay out of the sun.
What would those recommendations look like?
It's still being worked out, and it would have to be highly individualized by skin type and geography. But the Australian model is interesting. Australia has the highest skin cancer rates in the world, so they've been appropriately aggressive about sun protection. But they've also recognized that some sun exposure is needed for vitamin D, and their guidelines now include a statement that a few minutes of sun exposure on most days during summer is sufficient, and that people at risk of deficiency may need more.
Australia is an interesting case because they're dealing with a primarily fair-skinned population living at latitudes much sunnier than what that population evolved for. The skin cancer risk is real. But they still found room for nuance.
Skin type makes a huge difference. Someone with very fair skin, Fitzpatrick type one or two, might need five to ten minutes of midday summer sun to make adequate vitamin D. Someone with darker skin, Fitzpatrick type five or six, might need thirty minutes to an hour or more, because melanin is essentially a natural sunscreen. This is a real health equity issue. Darker-skinned individuals living at high latitudes are at dramatically higher risk of vitamin D deficiency.
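The scaling by skin type described here can be put into a toy lookup. Only the type I–II and type V–VI figures come from the discussion; the intermediate values are illustrative interpolations, not guidance, and actual needs depend on latitude, season, and time of day.

```python
# Toy lookup of midday-summer exposure times by Fitzpatrick skin type.
# Types 1-2 and 5-6 reflect the figures mentioned in the discussion;
# types 3-4 are interpolated assumptions for illustration only.

EXPOSURE_MINUTES = {
    1: (5, 10),    # very fair: burns easily, synthesizes vitamin D quickly
    2: (5, 10),
    3: (10, 20),   # interpolated (assumption)
    4: (20, 30),   # interpolated (assumption)
    5: (30, 60),   # darker skin: melanin attenuates UV-B
    6: (30, 60),   # "or more" in practice
}

def exposure_range(fitzpatrick_type: int) -> tuple[int, int]:
    """Return an illustrative (min, max) minutes-of-sun range."""
    return EXPOSURE_MINUTES[fitzpatrick_type]

print(exposure_range(2))  # (5, 10)
print(exposure_range(5))  # (30, 60)
```

The sixfold or greater spread between the fairest and darkest skin types is the health-equity point: the same midday walk delivers very different vitamin D doses to different people.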
That's probably underdiagnosed because the symptoms are non-specific — fatigue, muscle aches, low mood. Easy to dismiss.
Very easy to dismiss. And it's particularly concerning for pregnant women and infants. Maternal vitamin D deficiency during pregnancy is associated with increased risk of preeclampsia, gestational diabetes, preterm birth, and low birth weight. The vitamin D receptor is expressed in the placenta and plays a role in placental function and fetal development.
Let's go back to something you mentioned earlier about the skin producing vitamin D gradually over hours after sun exposure. I want to understand the mechanism there.
When UV-B hits the skin, it converts seven-dehydrocholesterol — a cholesterol precursor found in the cell membranes of keratinocytes in the epidermis — into previtamin D three. That previtamin D three then undergoes a thermal isomerization at body temperature to form vitamin D three, cholecalciferol. This process takes time. The cholecalciferol then gets picked up by vitamin D binding protein and enters the circulation.
Why is the gradual release important?
Because the liver can only hydroxylate so much cholecalciferol at a time. The first step in activation is twenty-five hydroxylation in the liver, producing twenty-five-hydroxyvitamin D, which is what we measure in blood tests. If you take a huge oral dose, you flood the system with cholecalciferol that the liver has to process. With sun exposure, the release is more gradual, and it more closely mimics the evolutionary pattern that our physiology is adapted to.
Then the kidney does the second hydroxylation step?
Right, the kidney converts twenty-five-hydroxyvitamin D to the active hormonal form, one twenty-five-dihydroxyvitamin D, through the enzyme one-alpha-hydroxylase. But here's something fascinating — that enzyme isn't only in the kidney. It's also expressed locally in many tissues, including immune cells, the prostate, breast tissue, and the colon. So those tissues can take the circulating twenty-five-hydroxyvitamin D and convert it to the active form right where it's needed.
It's not just that the receptor is everywhere. The activation machinery is everywhere too.
This is a locally regulated hormonal system. Different tissues can fine-tune their own vitamin D activity independently of what's happening systemically. This is completely different from how we think about most vitamins.
Which brings us back to the naming problem. Calling it a vitamin undersells it so dramatically that it distorts public understanding and probably distorts research priorities.
The name is a historical artifact. When Elmer McCollum discovered it in the early nineteen twenties, he called it vitamin D because vitamins A, B, and C had already been named — it was simply the next letter in the sequence. They knew it prevented rickets, but they didn't know its chemical structure or mechanism of action. By the time we figured out it was a secosteroid hormone, the name was entrenched.
Renaming it now would just confuse everyone. So we're stuck with the misleading label.
We're stuck with it. But conversations like this one can at least help people understand that what's in their supplement bottle is not the whole story. It's a piece of a much larger physiological system that evolved in concert with regular sun exposure.
One thing we haven't talked about is the skin microbiome. You mentioned it briefly earlier.
This is a really new area of research. The skin has a complex microbial ecosystem, and UV exposure seems to influence its composition. There's some evidence that UV can promote the growth of beneficial bacteria while suppressing pathogenic ones. And the skin microbiome in turn influences local immune function. It's another pathway through which sunlight might affect health that has nothing to do with vitamin D.
We've got at least four or five independent pathways from sunlight to health. Vitamin D synthesis, nitric oxide release, beta-endorphin production, circadian entrainment, microbiome modulation. And probably others we haven't discovered yet.
There's also emerging evidence about UV effects on the release of other neuropeptides from the skin. The skin is essentially a neuroendocrine organ. It's not just a barrier. It's actively sensing the environment and producing signaling molecules that affect the whole body.
The skin as a neuroendocrine organ. That's a framing most people have never heard.
It's not how dermatology has traditionally conceptualized it. But the evidence is there. The skin produces and responds to a wide range of hormones, neuropeptides, and cytokines. It's in constant two-way communication with the nervous system and the immune system. Sunlight is one of the primary environmental inputs that this system evolved to process.
When we live indoors and avoid the sun, we're depriving a major sensory and regulatory organ of its primary input.
And we're doing it at a population scale. The World Health Organization estimates that indoor air pollution and sedentary indoor lifestyles are major contributors to the global burden of non-communicable disease, but the sunlight deficiency piece of that story is rarely discussed in those terms.
Let's talk about what optimal blood levels actually look like, because there's genuine confusion about this.
The most commonly used cutoff for deficiency is below twenty nanograms per milliliter of twenty-five-hydroxyvitamin D. Insufficiency is typically defined as twenty to thirty. Sufficiency is above thirty. But the optimal range for the pleiotropic effects — immune function, cancer risk reduction, cardiovascular health — might be higher, somewhere in the forty to sixty nanograms per milliliter range. The problem is that the randomized trial evidence for those higher targets is limited.
How much supplementation does it take to get to those levels?
It varies enormously between individuals. Body weight, body fat percentage, genetics of vitamin D binding protein, baseline levels — all of it matters. As a rough rule, each thousand IU per day of additional supplementation raises blood levels by about five to ten nanograms per milliliter, but that's a population average. Individual responses can vary by a factor of three or more.
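The rule of thumb here — each additional thousand IU per day raising serum levels by roughly five to ten nanograms per milliliter — can be turned into a back-of-the-envelope range calculation. The five-to-ten figure comes from the discussion; the function and example numbers are purely illustrative, and real responses vary severalfold between individuals.

```python
# Back-of-the-envelope sketch of the dose-response rule of thumb:
# each extra 1,000 IU/day raises serum 25(OH)D by ~5-10 ng/mL on average.
# Returns a (low, high) range to reflect the population-level uncertainty.

def estimated_daily_iu(current_ng_ml: float, target_ng_ml: float) -> tuple[float, float]:
    """Estimate the extra IU/day needed to close the gap to a target level."""
    gap = max(0.0, target_ng_ml - current_ng_ml)
    low = gap / 10 * 1000   # strong responder: 10 ng/mL per 1,000 IU
    high = gap / 5 * 1000   # weak responder: 5 ng/mL per 1,000 IU
    return low, high

# e.g. raising 18 ng/mL to 40 ng/mL: roughly 2,200 to 4,400 IU/day
low, high = estimated_daily_iu(18, 40)
print(f"{low:.0f}-{high:.0f} IU/day")  # 2200-4400 IU/day
```

Even this toy arithmetic shows why a single blanket dose can leave one person deficient and push another toward the upper intake limit — which is the argument for testing rather than guessing.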
Which makes blanket dosing recommendations problematic.
The only way to know your level is to test. And even then, you're measuring the circulating storage form, not the active hormonal form, and not the tissue-level activity. Serum twenty-five-hydroxyvitamin D is a useful but imperfect biomarker.
What about food sources? Because most people think they can get enough from diet.
You really can't, unless you're eating fatty fish multiple times a day. A serving of salmon has about six hundred to a thousand IU. Fortified milk has about a hundred to a hundred twenty IU per cup. Egg yolks have about forty IU each. UV-exposed mushrooms can have a few hundred IU. But to get to even the modest RDA of six hundred IU solely from unfortified foods is genuinely difficult.
Diet alone is not a realistic strategy for most people.
It's not. The options are sun exposure, supplements, or both. And the supplement approach misses the non-vitamin D benefits of sunlight, while the sun exposure approach carries skin cancer risk. It's a trade-off that each person has to navigate based on their own risk factors.
Which is a more nuanced message than "take this pill" or "avoid the sun."
Nuance is hard to communicate in public health. Simple messages work better for behavior change. "Wear sunscreen" is simple. "Get some sun but not too much, and the right amount depends on who you are and where you live and what time of year it is" is harder to put on a poster.
Oversimplification has consequences. We've spent decades telling people to avoid the sun, and we've got an epidemic of vitamin D deficiency and maybe some other problems we don't fully understand yet.
The pendulum is swinging, though. More researchers are calling for a balanced approach. The challenge is getting that nuance into clinical practice, where doctors have about twelve minutes per patient and a hundred things to cover.
What's the one thing you'd want a listener to take away from this?
That vitamin D is not really a vitamin, sunlight is not just a cancer risk, and the relationship between the two is more complex and more interesting than the standard public health messaging suggests. If you're going to take a supplement, fine — but don't think of it as a complete substitute for what the sun does. And if you're going to avoid the sun entirely, understand that you're making a trade-off, not just eliminating a risk.
That's a good summary. I'd add that the receptor distribution tells the story. When a receptor is in every tissue, the effects are going to be everywhere, and reducing the whole thing to bone health or a blood test number is missing the big picture.
The body didn't evolve this elaborate system of solar-powered hormone production for no reason.
Now: Hilbert's daily fun fact.
Hilbert, what do you have for us today?
Hilbert: The rubber used in real tennis balls — the original indoor racket sport, not lawn tennis — was historically vulcanized using sulfur at precisely one hundred forty degrees Celsius, a process that cross-links the polyisoprene chains. In the eighteen sixties, a short-lived attempt was made on the Japanese island of Hokkaido to produce tennis balls using sulfur extracted from local volcanic fumaroles, which contained trace amounts of selenium that gave the rubber a faint reddish tint and a slightly different bounce characteristic compared to European-manufactured balls.
I did not know Hokkaido had a tennis ball moment.
Volcanic selenium rubber. Someone should bring that back as an artisanal thing.
This has been My Weird Prompts. Thanks to our producer Hilbert Flumingtop, and thanks to Daniel for the prompt that got us here. If you want more episodes, head over to myweirdprompts dot com or find us on Spotify. I'm Corn.
I'm Herman Poppleberry. Go get some sunlight — sensibly.