The Overlooked Office Space Heater Test

Recently, a research team from the U.S. National Institutes of Health looked into whether men and women have different reactions to hot and cold ambient room temperatures. According to one news report, the study concluded that there were “very slight gender differences in temperature perception of a room at ambient conditions and very few gender differences in physiological response to a perceived chill.”

“Very slight gender differences”? Is this what passes for science these days?

You can read about how the study reached that erroneous conclusion, but all I can say is that they ran the wrong kinds of tests. If they had attempted more practical, real-world analysis, they would have reached the correct conclusion: women tend to be far more sensitive to cold than men, and it really isn’t even a close question. Here are three obvious tests that the research team should have included in their study:

The Office Space Heater Test. Surprisingly, the research team did not ask which gender is more likely to have a space heater in their office. Based on personal experience, I’d say it’s got to be women, by a factor of about 999 to 1. In some of the offices in our firm, space heaters are cranked up to maximum output and it is so hot you could grow African violets in there, and the female occupants are nevertheless complaining of the cold.

The Sweater Test. Another obvious oversight was the failure of the research team to go into the closets of study participants, count the number of sweaters they owned, and evaluate the bulkiness of those sweaters. I think that exercise in the scientific method also would have yielded a clear result: women tend to own more sweaters, and bulkier sweaters, and wear them more often.

The Fleece Blanket Test. In the most egregious omission, the study did not conduct a simple but conclusive experiment: put a study participant on a couch in front of a TV in a reasonably cool room, put a folded fleece blanket on the couch, and see whether men or women are more likely to use the fleece blanket. A reasonable follow-up would be to visit the study participants’ homes, count how many fleece blankets they own, and note whether the blankets are strategically positioned in every room.

This kind of news story does raise troubling questions about the validity of the scientific research results we are getting these days.

80 Can Be Weighty

On the health front, it can be hard to know what to do sometimes. Confusing and often contradictory studies that can influence lifestyle choices seem to abound.

For example, I’ve always understood that, as you get older, a big part of maintaining good health is working to keep your weight down, because excessive weight is associated with many problematic health conditions that can affect mortality–not to mention causing trouble for aging joints. Now I’ve seen a study that suggests that for people over 80–a group the study, incidentally, refers to as the “oldest-old,” which seems a bit harsh–maintaining more weight and a higher body mass index number is associated with decreased mortality risk.

So, what’s a person who’d like to make it to that “oldest-old” category to do?

Apparently, successful long-term aging is an exercise in threading the needle. In your 60s and 70s, stay focused on the scale and the beltline, and keep that weight off. But, at the same time, don’t get too weak and scrawny, either, because if you make it to 80 you might need to bulk up a bit more. And if you do make it to 80, let the party begin!

It also means you should keep both the “fat clothes” and the “skinny clothes” in your closet, because you’re probably going to need them all at some time or another.

Cicadas On The Cusp

This year will be a big cicada year in the Midwest.

We live with cicadas every summer, thanks to the “annual cicada” nymphs that emerge from their underground homes, climb the nearest tree, and molt into their adult form. They then make an unholy racket as part of their mating process. Every so often, one or more of the cicada species in the broods that remain underground for much longer periods–13 or 17 years–also emerge, and the cicada love call noise level increases accordingly. This year, multiple species in both of the longer-span cicada broods will emerge, for the first time since 1803, so we’ll probably need earplugs.

It will be prime time not only for enduring the calls of the cicadas, but also for studying these interesting–albeit loud–creatures. One of the things scientists are interested in examining is a fungus called Massospora that infects only cicadas. The fungus replaces their abdomens and genitals with fungal tissue and fills their systems with chemicals, causing them to engage in unusual sexual behavior to spread the fungus even more. Among other things, scientists are interested in seeing whether the fungus can be used for medicinal purposes in humans. (Speaking only for myself, I’d be leery of ingesting any medicine created with a cicada fungus that has the effects described above, but then I’m not very adventurous.)

I’ve lived through a number of these periodic cicada brood emergences, and it’s really no big deal. It’s loud at night when the cicadas are getting busy, but they soon die, making for crunchy walks in the cicada zones and helping to enrich the soil. I’d never travel to see even more cicadas, but if you want to get maximum cicada exposure, scientists say that Illinois will be ground zero.

When Patience Blooms

Gardening is an exercise in patience, physical labor, resilience, and attention to detail–but mostly patience. You place your plants, tend to them, water them, remove weeds, try to protect them from disease and deer and ravaging insects and other pests, accept failure and try again, and hope that your labors are fruitful.

With some plants, you need more patience than others.

Consider the “sapphire tower” plant (Puya alpestris) being tended in the Birmingham Botanical Gardens in Birmingham, Alabama; it is blooming for the first time in a decade. The plant is native to the Andean region of Chile, where it grows at higher elevations and is pollinated by hummingbirds, which can hover by the plant between the spiky stalks and dip their beaks into the fluted flowers. Removed from its native habitat (and hummingbirds), the plant grows slowly, and must be pollinated by hand, using paintbrushes. Its blooms also last for only a short while.

But when it does bloom, as it is doing now and for the next few weeks, the “sapphire tower” is a magnificent sight. I’m sure the horticulturalists in the Birmingham Botanical Gardens are feeling a tremendous satisfaction right now, and fellow gardeners can vicariously share in that feeling. It’s great to see patience and painstaking effort rewarded.

Watching The Eclipse

Watching the solar eclipse that passed through Columbus yesterday was, in a word, awesome. It’s an overused word, to be sure, but it definitely captures the amazement we experienced as we stood back by the parking garage behind the firm, tilted our heads to the sunny, fortunately mostly cloudless sky, wore our eclipse glasses, and watched the moon slide slowly past the sun.

I wasn’t particularly excited about the eclipse, but once I put on the glasses and saw the moon blocking a sliver of the sun, I was hooked. It was as if a magnificent celestial production was being staged just for our benefit. As the moon continued its journey, and the visible sun shrank to a crescent, you could feel the light fading incrementally. The air cooled noticeably as more and more of the sun’s warming rays were blocked. Just before the point closest to totality, as the sky darkened and we reached twilight dimness, all of the automatic outdoor lights in the alley went on. It was a bit startling, but nothing could distract us from the great show in the sky.

Everyone who came out to look at the eclipse seemed to share the same jaw-dropping reaction to the spectacle. There’s a “wow” factor to an eclipse that isn’t like anything else I’ve experienced. I can definitely understand why other people will travel to watch one. In fact, now that I’ve got the glasses, I might just do that myself.

Neck Betrayal

You might think you look younger than you actually are. You might initially focus on your face and think it really doesn’t look all that bad, given the mileage–but then your eyes travel down to the neck and all pretensions vanish. The neck is the great betrayer of age, with the truth revealed in the saggy skin that makes you wonder why you haven’t started gobbling.

Why does neck skin sag? A lot of the answer has to do with basic wear and tear on your skin and the hard work put in over the years by the platysma muscles, which are the thin layer of muscles that run from your jaw down the neck to your collarbones. The California Skin Institute website has a pretty good description of what happens:

“As with all skin, factors like genetics, extreme weight loss, collagen and elastin breakdown, and sun exposure can affect how your neck ages. However, there are additional factors that can act specifically on the neck to make it look older than the rest of you.

Thin, weak and delicate skin and muscle cover the neck. Year after year, twisting, stretching, and the pull of gravity and any pockets of subcutaneous fat have a cumulative aging effect. Most people notice neck skin beginning to significantly sag and wrinkle around the age of 40. That’s also when underlying platysmal muscles start to detach and loosen, their edges showing through thinning skin as vertical bands from the chin to collarbone.”

In short, by the time you’re into your 40s and beyond, the cake is baked. When you think about it, the area directly under the chin has done a lot of work by then, fiercely resisting the direct downward pull of gravity and engineering years of vigorous nodding and head turning. By then, your platysma muscles and skin are exhausted and unable to snap back as they did when you were younger. There’s really not much you can do about it, either, other than try to avoid excessive sun and cycles of significant weight gain and loss that stretch the layer of skin out even more and put you ever more firmly into turkey neck territory.

The California Skin Institute passage quoted above says your neck may “look older than the rest of you.” I’d guess your neck would beg to differ, and might argue instead that it is the most accurate reflection of the years you’ve logged.

A Person Who Made A Difference

I like reading about people whose lives really made a difference. Recently I ran across an article about one such person: Dr. Norman Borlaug, shown above, who would have turned 110 last week. Dr. Borlaug is one of only six people in history to win the Nobel Peace Prize, the Congressional Gold Medal and the Presidential Medal of Freedom, and he is credibly said to have “saved more lives than any other person who has ever lived.”

Dr. Borlaug was an American who was a leader of the “Green Revolution.” He combined extensive agricultural know-how and political savvy to help increase food production in countries that had been struggling with starvation and famine. He focused on developing approaches to food production that could be readily employed in those countries, drawing upon his extensive knowledge of different varieties of seeds, irrigation, plant pathology, genetics and breeding, soil science, fertilizers, pesticides, and mechanization. He also developed a high-yielding, short-strawed, disease-resistant form of wheat that was key to the effort, and that helped produce enormous increases in production. He won the Nobel Peace Prize in 1970, and his Nobel Prize biography noted that his wheat strain and agronomic practices had produced revolutionary advances in Mexico, Pakistan, and India and had been adopted by other countries in Central America, Africa, and the Middle East.

Interestingly, Dr. Borlaug was not an ivory tower theorist, but a tough, practical farmer who worked in the fields and got dirt under his fingernails. He also had a gift for convincing governmental officials to try his methods. It says something about Dr. Borlaug’s continuing impact that the African Journal of Food, Agriculture, Nutrition and Development publishes pages of Norman Borlaug quotes, one of which states: “the first essential component of social justice is adequate food for all mankind.”

I’d heard about the Green Revolution but was not aware of the specifics of Dr. Borlaug’s career and accomplishments–which shows, again, how one person can make a profound difference in people’s lives. You wonder how many people like Norman Borlaug are out there in the world right now, working under the radar yet having a huge impact in their communities. I’m pretty sure there are a lot of them.

Standardized Tests

Recently I completed my annual physical exam with my doctor. A few weeks earlier I had given a blood sample and urine sample for testing, gotten weighed, had my gait and grip strength measured and my blood pressure checked, completed eyesight and hearing tests, and done some of the other preliminaries. In my second visit, my doctor went through all of the test results gathered from the blood and urine samples and performed some of the other checks and probes that are part of this annual ritual.

For a few years now, my annual physical has included memory and cognition tests designed to identify early signs of dementia or other mental issues that are common among aging people. The tests are pretty basic. One involves him giving me five words to remember, then we talk for a few minutes about something else, then he sees if I can repeat the words back for him. Another involves listening to a story and then answering questions about it. Other parts of the test involve answering math questions. It’s become part of the routine.

I thought about this part of my physical when I read about President Biden’s annual physical, which did not include any cognitive or memory tests. His press secretary says he doesn’t need one, because “he passes a cognitive test every day” on the job. Some people disagree and question that omission.

Let’s set aside President Biden, Donald Trump, and the politics of the moment. Doesn’t it seem strange that there is no agreement on which tests will be administered and reported on as part of the annual physicals of incumbent Presidents and candidates for the office–and no requirement that those check-ups include standard cognition and memory tests as a matter of course? Serving as President is obviously a difficult, challenging job, and when individuals seek the office voters deserve reasonable information about their physical and mental capabilities. To be sure, such testing intrudes upon their privacy–but anyone who runs for President accepts that, in doing so, they will be giving up some personal privacy.

Part of the problem with the current hyperpolitical atmosphere is that people find it difficult to step back from the individual circumstances of the current candidates and look at the big picture. We’ve admittedly come a long way since the days when President Woodrow Wilson’s stroke and disability were kept secret, but I think there is more to be done in serving the interests of transparency. I’d like to see Congress or a bipartisan commission agree on what physical and mental tests are material to the office of the Presidency and should be included in the annual physicals, and then require every candidate to agree that they will undergo those tests, and to authorize their doctors to release the results of those tests in a standardized report for public inspection.

Regrettably, there will probably always be weird conspiracy theories and wild speculation about presidential candidates, but coming up with a standardized approach to assessment of their physical and mental fitness for the job should help to refute the crazier assertions while also giving voters meaningful information as they make their choices.

Canine Kisses

It’s pretty standard for an affectionate dog to try to lick the face of a human pal. Some people welcome a slobbery lick–but, should they? Is it healthy?

I recently ran across this article that summarizes some of the science related to dog licks. It seems clear that licking is instinctive, and important, behavior for dogs, and is part of their naturally empathetic makeup. In short, dogs like to lick. For the human recipient of a lick, however, there is some risk.

The risk arises from the fact that most dogs are not exactly careful about what they put in their mouths. As a result, dog saliva may contain bacteria and other microbes that could cause serious health problems for humans, although the chances of that happening apparently aren’t all that significant. The risk seems to be greatest for people who are immunocompromised or who have open wounds, although older people, young children, and pregnant women also are urged to be cautious about accepting a canine kiss.

The most likely result of a dog lick therefore isn’t illness or infection, but rather getting a big whiff of foul dog breath in the process. Most dog owners gladly accept those risks in exchange for the affection and companionship that their dogs bring to their lives. Still, I try to avoid the lick and opt for a friendly pat on the head instead.

Tale Of Our Tail

Our distant relatives, monkeys, have tails. Humans, and our closer relations the great apes, don’t. We’ve got a tailbone, to be sure, but we lack the unit extending from it.

So, what gives? How did Homo sapiens, chimpanzees, and gorillas become de-tailed?

A new study concludes that the answer to that question, as is true of so many questions about human development, lies in our genes. Researchers determined that a snippet of DNA found in apes and humans, but lacking in monkeys, affects tail development. The DNA segment is called AluY, and it is inserted into a gene called TBXT. When scientists tested the segment on laboratory mice, they found it affected tail development and led to some mice being born without tails.

The ancestors of humans and the great apes are believed to have lost their tails about 25 million years ago, when they evolved away from the ancestors of modern monkeys. The evolutionary change resulted in fewer tail vertebrae and the development of the coccyx–our tailbone. Scientists believe the loss of the tail may have been an evolutionary advantage that better allowed our ancestors to shift from living in trees to living on the ground–a key development in human history that led eventually to walking upright, freeing hands for tools, and other activities that are associated with human brain development. Scientists also believe, however, that the insertion of the DNA snippet may have also resulted in an increase in neural tube birth defects, like spina bifida.

What the study doesn’t tell us is why the loss of a tail might have been an advantage to our ancestors. We know from the theory of natural selection that evolution is all about surviving to reproduce and pass on your genetic code. That suggests that tail loss either made our ancestors better able to defend themselves against predators, or better able to hunt and forage and obtain food, or it made them more attractive to potential mates. Would a prehensile tail that could be snagged or grabbed be a disadvantage in a struggle on the ground, or did ancient females find tails off-putting and a key determinant in mate selection? We’ll probably never know.

Happy Prolong Day

Well, it’s February 29, 2024–the “leap day” in this “leap year.” And that gives rise to a question: why do we use the word “leap” to describe the calendar manipulation that happens every four years to account for the fact that the Earth takes slightly more than 365 days to complete its lap around the Sun?

Here’s the problem, from my view: it’s the end of February, and no one really feels like “leaping” anywhere. “Leaping” contemplates springing ahead with force and enthusiasm and perhaps a bit of youthful hopefulness and exuberance–which is why the saying “look before you leap” came about. But at the end of February, most of us aren’t really brimming with qualities like enthusiasm and hopefulness and exuberance, are we? Instead, we’d prefer that the month would be over and it would be March, already–but instead we’re saddled with another day in February, and we’re not exactly “leaping” about it.

In short, “leap” is not only inapt, it’s kind of a slap in the face.

Why the use of “leap”? Here’s how the National Air and Space Museum explains it: “a common year is 52 weeks and 1 day long. That means that if your birthday were to occur on a Monday one year, the next year it should occur on a Tuesday. However, the addition of an extra day during a leap year means that your birthday now ‘leaps’ over a day. Instead of your birthday occurring on a Tuesday as it would following a common year, during a leap year, your birthday ‘leaps’ over Tuesday and will now occur on a Wednesday.”
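If you want to see that “leap” in action, here is a minimal sketch in Python, using only the standard datetime module and an anniversary date I picked arbitrarily for illustration. It shows the weekday advancing by one across a common year and by two across a leap day:

from datetime import date

# A hypothetical anniversary (March 1), checked across a common year and a leap year.
anniversaries = [date(y, 3, 1) for y in (2022, 2023, 2024)]

for earlier, later in zip(anniversaries, anniversaries[1:]):
    days = (later - earlier).days   # 365 across a common year, 366 when Feb 29 falls in between
    shift = days % 7                # weekday advances by 1 in a common year, "leaps" by 2 across a leap day
    print(f"{earlier} ({earlier:%A}) -> {later} ({later:%A}): {days} days, weekday shift of {shift}")

Run it and the second line shows the two-day jump the museum is describing: because February 29, 2024 sits between the 2023 and 2024 anniversaries, the weekday skips ahead an extra day.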

So, it’s the days of the week that are “leaping,” not us. Well, I say the heck with that! I say we should come up with a more people-centric term that describes how this day affects human beings, not inanimate squares on a calendar.

I suggest that we use “prolong” rather than “leap,” as in “prolong the agony”–because that’s how we feel about adding another unwelcome day to an unwelcome month in an unwelcome time of year. So, happy Prolong Day! Let’s get through it, and get on to March.

A Month Thanks To Numa

February is upon us. Weather-wise, it’s probably the most despised month of the calendar in the northern hemisphere. By the time February rolls around, people are sick to death of winter and eager to welcome the promise of spring that arrives with March. But there is crummy old February and its traditionally awful weather, blocking the way.

February is also weird, because it’s got fewer days than every other month. When I was a kid, my grandmother taught me a rhyme to help me remember the length of the various months that started “30 days hath September, April, June, and November . . . .” but the rhyme kind of petered out when it hit February, which “stands alone.” All of which makes you wonder: why does February have only 28 days in most years? With 365 days and 12 months in a year, you could easily have at least 30 days in each month to even things up, and then have five months with 31 to make up the difference. So what gives?

According to sources like the Encyclopaedia Britannica, we’ve got an ancient Roman king named Numa Pompilius, pictured above, to thank for this strange discrepancy. Before Numa took over, the Roman calendar consisted of ten months–six with 30 days and four with 31. That calendar obviously would quickly get out of sync with the seasons, so Numa decided to add two months to the calendar–January and February–to bring the Roman calendar in line with the lunar year, which is 355 days long. Ancient Romans apparently were superstitious about even numbers, so Numa gave the months odd lengths of 29 or 31 days–except February, which was stuck with the unlucky, even total of 28 days.

Numa apparently picked February as the oddball month because it had two festivals linked with purification, or februum, which gave the month its name. One of the festivals was Feralia, in which ancient Romans brought food and gifts to cemeteries to keep the dead content to remain in their graves and not rise to haunt the living. In short, February already had its challenges, so why not stick it with 28 days?

The Julian and Gregorian calendars later switched to a solar year approach, because the lunar year Numa followed is shorter than the time the Earth actually needs to complete its annual lap around the sun, which meant Numa’s calendar also drifted out of sync with the seasons. Those calendars added a few days to the months to make up the difference and adopted the leap year concept to keep the calendar aligned with the seasons–but they kept the twelve months, kept the name February, and continued to stick February with the fewest days.

It’s strange to think that, thousands of years later, the modern world still follows a calendar with an oddball month named for ancient Roman activities and a number of days assigned due to ancient Roman superstitions. But really, it all works out. Every northerner is grateful that dismal February is the shortest month–even during a leap year like this one. The sooner we get to March, the better.

The Lake In Winter

Each of the Great Lakes has its own unique characteristics. Lake Erie, running on an east-west axis along the north coast of Ohio, is the warmest and shallowest of the five Great Lakes, and is a delightful place to spend a sunny and sultry summer day. 

Winter works a profound change on the character of Ohio’s Great Lake, however. Every person who has lived in northeastern Ohio is familiar with the “lake effect” snow that occurs when storms passing west to east roll over the lake, pick up moisture, and deposit huge amounts of snow as they move along. But the annual “lake effect” blizzards are not the only winter phenomenon that can occur on Lake Erie. Because the lake is so shallow, strong prevailing winds can cause “seiches” (pronounced “saysh-ez”), which are oscillations in the lake’s water levels that occur when the wind pushes the water from west to east.

When a really significant seiche occurs, as happened earlier this month when 65-mile-an-hour winds hit the lake, the bed in the western part of the lake can be exposed–leaving huge rock formations that typically are water-covered visible to the naked eye and creating opportunities for cool photographs like the one above. The biggest seiche in history was a 22-foot shift that occurred in 1844 and killed 78 people. This year’s seiche, fortunately, was not as destructive. But when a serious seiche occurs, be careful about venturing too far out onto the lake bed–because the water always comes back after the wind stops.

The Jinx Factor On Judgment Day

There I was, standing nervously in front of the podium of St. Peter next to the Pearly Gates. He looked down at me with a knowing expression, twirled his key on his index finger, and then spoke in a solemn voice.

“Before we can consider whether you might gain entrance, we have a few things to discuss. There is a lot to talk about, but we’ll start with sports,” St. Peter said.

“Sports?” I asked. Surprised but thinking quickly, I added: “I’m sorry for all of the cursing and anger issues when I played golf.”

St. Peter chuckled with a sound like rolling thunder. “Hah! Don’t worry about that–it’s why we enticed the Scots to invent the infernal game in the first place. Golf was designed to get under people’s skin and provoke them to outbursts of temper and profanity. We figured people generally, and the Scots specifically, needed to get that out of their systems, and golf is a pretty harmless way to do it.”

“Well, that’s good to know,” I said with relief. “But if it’s not golf, what sports issue do I need to address with you?”

“Specifically, it’s about your commitment to the sports teams of which you claimed to be a fan.” After a glance at a great, leather-bound volume, St. Peter added: “Your record indicates you were not sufficiently attentive to avoiding jinxes that affected your teams.”

“Wait . . . what?” I stammered. “Are you saying that jinxes are real, and that my clothing choice, the seat I was sitting in, my decisions on whether to record games, and whether I was wearing a lucky hat and consumed the right number of beers actually influenced the outcome of games? I thought that was all just silly superstition that humanity outgrew in the age of science.”

St. Peter shook his head sadly. “Actually, the reverse is true. You know from your exposure to quantum physics and the thought experiment with Schrödinger’s cat that an event can exist in a state of superposition, where any outcome is possible, until the event is observed. You’ve heard of the observer effect and the concept of the butterfly effect, where the flapping of a butterfly’s wings can contribute to the generation of a hurricane. In short, the science of your time is just beginning to glimpse the great truth: we are all in this together, and the actions and thoughts of one person can alter the zeitgeist and the karmic forces that affect everyone and can have a definite effect on the results of athletic contests.”

“Okay, I think I can grasp that,” I said, “but sports? Isn’t being a sports fan kind of . . . trivial in the grand scheme of things?”

St. Peter tapped his key on the lectern, shifted in his seat, and looked down at me with another rueful shake of his head. “That view is also wrong,” he said. “In fact, sports are extremely important to the human story. As one of our residents here used to say, they allow people to vicariously experience ‘the thrill of victory and the agony of defeat.’ And they also reinforce some important points that humans need to be reminded of–that the world isn’t necessarily fair, but the important thing is to remain dedicated, keep the faith, and do what you can to try to ensure a better outcome the next time. Sports fans can do that by continuing to support their chosen teams, even through the rough times–and also make sure that they take personal actions that will help to positively influence the outcome.”

St. Peter looked down at his great book again, and added: “You’ve had some failures and some successes on that score that we need to discuss. Those two Ohio State national championship games you attended–your behavior in those instances was flawless. You did everything you needed to do, from wearing the right clothes and carrying a lucky buckeye to imparting respectful and positive energy in favor of the Men of the Scarlet and Gray, and the outcomes reflected that.”

I grinned at those positive memories, with a welling sense of pride at my individual contribution to two great days for Buckeye Nation.

“And then there’s the Cleveland Browns, and The Drive and The Fumble,” St. Peter continued, turning to more painful topics. “You already know what you did to cause The Drive, when you let one of your friends leave his seat in Cleveland Municipal Stadium when the contest was in the balance. You can’t imagine how upsetting that was to the energy forces that day. And The Fumble happened because you neglected to wear the right sweatshirt, and in watching the game you showed an unseemly overconfidence that the Browns would win that also roiled the kismet in an unfavorable way.”

I grimaced at these devastating memories, ashamed that my conduct harmed my team.

“So your record shows some good and some bad,” St. Peter noted, as he turned a page. “Now, let’s talk about what you did on January 13, 2024, the day of the playoff game between the Cleveland Browns and the Houston Texans.”

And then I woke up.

The Multivitamin Question

Should you take a multivitamin tablet every day? Many people do. For some, it is a force of habit that has its roots in childhood vitamin consumption; others have seen the ads and figured that one pill a day providing a few extra vitamins can’t hurt. After all, vitamins are associated with ruddy good health, right?

In June 2022 the U.S. Preventive Services Task Force–an independent volunteer panel of experts that attempts to make “evidence-based recommendations about clinical preventive services”–released a report on its review of studies related to the effects of multivitamins and mineral supplements. The evaluation considered 84 studies on the impact of vitamin and mineral supplements on cardiovascular disease and cancer, which are the two leading causes of death in the United States, with the objective of determining “the benefits and harms of vitamin and mineral supplementation in healthy adults”–that is, those without identified vitamin or mineral deficiencies–“to prevent cardiovascular disease and cancer.”

The USPSTF’s report on its review of the 84 studies includes the following summary of its “conclusions and relevance”: “Vitamin and mineral supplementation was associated with little or no benefit in preventing cancer, cardiovascular disease, and death, with the exception of a small benefit for cancer incidence with multivitamin use. Beta carotene was associated with an increased risk of lung cancer and other harmful outcomes in persons at high risk of lung cancer.”

This conclusion has led some to argue that, for most people who don’t have specific, existing conditions where additional, targeted vitamin intake has an appreciable health benefit, the daily taking of a multivitamin pill has no impact on health–so why not save the money and get the daily allotment of vitamins through diet? And if you do have a specific vitamin deficiency issue, get your doctor’s recommendation on supplements to address that particular issue, rather than ingesting other vitamins you might not need.

A lot of companies spend a lot of money trying to convince us that taking a multivitamin pill is an easy road to good health and a long life. As the USPSTF report suggests, however, that’s not necessarily the case. A one-size-fits-all pill is no magic substitute for attention to diet, exercise, fresh air, and consideration of your specific health circumstances.