Live Fast, Die Young

A Chicago writer, Willard Motley, coined the phrase “live fast, die young, and leave a good-looking corpse” in his debut novel. Unfortunately, American mortality statistics are echoing that sentiment–except for the good-looking part. The U.S. is doing worse than other wealthy countries, and last year, American life expectancy dropped for the second year in a row even as the statistics in other countries rebounded from the COVID pandemic.

NPR has an interesting article on this phenomenon that is worth reading in full. Among other things it discusses the “why” question–namely, how can it be that a rich, scientifically advanced country that spends buckets of money on health care fares so poorly in comparative mortality data? The NPR article cites a study done 10 years ago by the National Academy of Sciences called Shorter Lives, Poorer Health. The study tried to identify systemic factors that contribute to the bad statistics.

A few things stand out: first, Americans are more likely to die before age 50, thanks to factors like the opioid epidemic, suicides, other drug use, criminal gun violence, teen pregnancies, and highway deaths. Second, Americans are far more likely to be obese, to smoke, to have bad diets, and to have sedentary lifestyles that contribute to poorer health. These societal elements, which together mean that Americans are far more likely to die young, account for a big chunk of the difference in average life expectancy with countries like, say, Japan.

On the bright side, the U.S. has a better record than other countries in keeping people who make it to 75 alive–but that is cold comfort to those who don’t make it to 50. And when you look at the causes identified by the NAS study, you can’t help but think that a big part of the problem is socioeconomic. Americans who are fortunate enough to live in comfortable suburban neighborhoods, for example, don’t face the same mortality risks as those who were born on the South Side of Chicago.

The mortality statistics are embarrassing, but in the 10 years since its release the NAS study hasn’t made much of a dent in public consciousness. Regrettably, in America “live fast, die young” isn’t just a good line from a ’40s novel, it’s a summary of reality.

An App Too Far

Governments the world over have struggled to address the COVID-19 pandemic. In the United States, we’ve seen large-scale shutdowns of businesses, mask mandates on planes and in buildings, and social distancing and stay-at-home orders. But it is the Land Down Under — Australia — that has really pushed the envelope.

This week The Atlantic carried an eye-opening article about some of the governmental edicts that have been imposed in Australia–edicts so draconian that the article carries the provocative headline “Australia Traded Away Too Much Liberty.” Consider this partial list of emergency decrees and requirements:

  • Australia has dramatically curtailed its citizens’ ability to leave the country. The article quotes a government website (which you can see here) that states: “Australia’s borders are currently closed and international travel from Australia remains strictly controlled to help prevent the spread of COVID-19. International travel from Australia is only available if you are exempt or you have been granted an individual exemption.”
  • Travel between the six states that make up Australia also is restricted. You can access the governmental website that discloses the current restrictions, which include closing state borders, limiting ability to travel within a state, and mandatory quarantines, here.
  • States have imposed curfews, have banned anti-lockdown protests, and have used the military to disperse and arrest anti-lockdown protesters in Sydney and Melbourne. In Sydney, more than five million people have been in lockdown status for more than two months.

But the most draconian requirement of all is being tested and rolled out by the state of South Australia. It’s an app that the state would require its citizens to download, and The Atlantic article describes it as follows:

“People in South Australia will be forced to download an app that combines facial recognition and geolocation. The state will text them at random times, and thereafter they will have 15 minutes to take a picture of their face in the location where they are supposed to be. Should they fail, the local police department will be sent to follow up in person. ‘We don’t tell them how often or when, on a random basis they have to reply within 15 minutes,’ Premier Steven Marshall explained. ‘I think every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app.’”

It’s a pretty amazing development when a democratic government claims the ability to unilaterally require citizens to download an app and respond to random government texts, within a specified time period, with a personal photo showing they are in “the location where they are supposed to be”–or receive a visit from the local police. It’s even more amazing that the head of that government actually thinks citizens should be proud that their state government is the leader in imposing that kind of extraordinary government intrusion. I’d like to think that no duly elected government in America would think that kind of action was anything other than an egregious overreach–but then, I would have thought the Aussies would never do anything like that, either.

There’s obviously a delicate balance between preserving individual rights and liberties and dealing with public health issues. As The Atlantic article notes, Australia’s dramatic decrees can be cited as allowing it to achieve COVID-related death statistics that are far below those in the U.S. But Australia also shows how the balancing of health and rights can tip decidedly to one side, in a way that strikes at the core of freedoms that are a defining characteristic of democratic societies. Citizens of other countries should be looking at what has happened in Australia and asking themselves: “Was it worth it?” and “Could that happen here?”

Pandemiflab

When the COVID-19 lockdowns started, I remember getting texts from friends with memes consisting of before and after photos showing people gaining weight during the lockdown period. We chuckled at them then. Now a newly released study cites evidence that people in fact did put on weight during the shutdown–and it’s really no laughing matter.

The study involved adult participants from 37 states and the District of Columbia who were monitored between February 1 and June 1 last year. The study indicates that, once shutdown orders were implemented in their locations, the adults began gaining weight at a rate of 0.6 pound every 10 days, or nearly two pounds of body weight a month. Researchers attribute the weight gain to the effect of shelter-in-place and office shutdown orders that curtailed everyday activities like walking from an office desk to a conference room or walking to the subway and standing to wait for a train. Those little snippets of exercise during the day add up, and people working from home and sitting on their behinds all day don’t get them. Add in the fact that people reported eating and drinking more during the shutdown, and you’ve got the recipe for weight gain.

Gaining a pound or two a month may not sound like much, but multiply that monthly gain by the number of months the various shutdowns were imposed in different states, or authorities were encouraging people to stay at home to curb spikes and hot spots, and you’ve got more than the “freshman 10” weight gain that people talked about back in college. That’s a lot of weight for people to add in a country where obesity had already become one of the largest public health challenges. And, as any adult knows, once you’ve put on that extra weight, trying to take it off isn’t easy–particularly if you’ve fallen into bad habits.
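The arithmetic behind those figures is simple enough to check. A minimal sketch, assuming the study’s reported rate of 0.6 pound every 10 days holds constant (real-world weight gain rarely stays that linear) and using a hypothetical six-month shutdown stretch for illustration:

```python
# Back-of-the-envelope projection from the study's reported rate.
# Assumes the 0.6 lb per 10 days rate stays constant -- a simplification,
# since real-world weight gain rarely stays linear.

RATE_LB_PER_DAY = 0.6 / 10           # 0.6 pound every 10 days

monthly_gain = RATE_LB_PER_DAY * 30  # per 30-day month
print(f"Monthly gain: {monthly_gain:.1f} lb")      # ~1.8 lb

# Six months is a hypothetical shutdown duration, not a figure from the study.
six_month_gain = monthly_gain * 6
print(f"Six-month gain: {six_month_gain:.1f} lb")  # ~10.8 lb, past the "freshman 10"
```

Even a conservative projection like this lands past the proverbial “freshman 10” well within a single shutdown cycle.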

Once the pandemic period finally ends, we’ll start to get some perspective and meaningful data on whether the prolonged shutdown orders, including the current recommendations that even fully vaccinated people should stay at home if they can, were sound public policy decisions. That involves balancing the impact of those orders on the incidence of COVID-19 cases and hospitalizations against a number of other factors, like depression, suicide, economic disruption and job loss, child development . . . and basic public health issues, like daily exercise, alcohol consumption, and weight gain. We should reserve judgment until all of the meaningful data comes in, but the study noted above shows that there are negative public health consequences to shutdown orders that need to be carefully balanced against the positive effects. It’s pretty clear that the analysis is not going to show a simple, one-sided story.

Going Medieval

The New York Times had an interesting piece on Friday about how the coronavirus is spurring a “new” approach to dealing with disease — “new” in the sense that it is different from how the modern world has handled disease over the past few decades, but really not new at all in that it harkens back to the methods used in medieval times.  The “new” approach is called the quarantine.

As the Times article points out, the quarantine is a disease control method that’s as old as time.  During the medieval period, when the spread of disease wasn’t understood from a scientific standpoint, authorities still had techniques they used during a health crisis:  they fought the spread of the Black Plague by closing borders, quarantining sick people on ships and in pest houses, and heading out of the cities into the countryside to get away from the sick zones.  That method of dealing with the spread of disease lasted for centuries.

After advances in science and medicine, the invention of the microscope, and the development of ways of discovering, and treating, diseases and viruses, the approach to public health changed.  The Times article reports that the last time the U.S. government, for example, imposed a national restriction on entry into the country was in 1892, when President Benjamin Harrison ordered that ships from Hamburg be kept offshore for 20 days because Hamburg had lied about a cholera epidemic.  Since then, the U.S. has adopted the “modern” approach, which involves accepting the spread of the disease and trying to deal with it through antibiotics, vaccines, and other forms of treatment.

With the coronavirus, the Trump Administration has combined the “modern” approach with the “medieval” approach.  The Administration imposed a very early ban on entry into the country by non-citizens from China and discouraged travel to China, and over the weekend President Trump announced additional restrictions on travel to areas where new outbreaks have occurred:  Iran, and specific areas of South Korea and Italy.  And, as the Times article points out, these restrictions seem to have worked.  Although there are coronavirus cases reported in the U.S., the incidence rate is far below what some other countries have experienced, and the travel restrictions gave the country time to prepare for the virus.

When it comes to dealing with communicable disease, harsh measures are sometimes necessary, and time is frequently of the essence.  If travel bans and quarantines help public health officials, I’m all in favor of going a bit “medieval” in response to the coronavirus.

The Elephant In The Room

As coronavirus continues to spread, with the total number of reported cases now exceeding 77,000 people worldwide, stock markets plummeting because of the impact of the virus on the global economy, and the World Health Organization saying that the world should be prepared for a pandemic, scientists are trying to figure out exactly how the virus spreads.

According to the Chinese Center for Disease Control and Prevention, one of the apparent pathways for the disease is through the fecal matter of infected people.  The Chinese CDC “recommends strengthening sanitation and hygiene measures to prevent fecal-oral transmission” in areas where the coronavirus is present, with the hygiene measures to include “drinking boiled water, avoiding eating raw food, implementing separate meal systems, frequent hand-washing, disinfecting toilets, and preventing water and food contamination from patients’ stool.”  The concern is that infected persons’ “stool samples may contaminate hands, food, water” and cause infection when the microbes enter the mouth or eyes, or are inhaled.

What does the apparent transmission route through fecal matter tell us about who is at risk in the event of a serious outbreak in the United States — something that hasn’t happened yet?  It seems that one logical course should be to target specific populations where sanitation and disposal of human waste aren’t well controlled.  If I were a public health official in America, I’d therefore be considering what can be done to anticipate and prevent a nightmare scenario in which coronavirus reaches one of the colossal homeless encampments found in some U.S. cities, like Los Angeles.  Public health officials have already identified poor health conditions and contact with fecal matter in “homeless zones” as the source for transmission of diseases like typhus, typhoid fever, and tuberculosis in Los Angeles.  What would happen if a rapidly spreading disease like coronavirus were to reach one of the densely populated, squalid encampments?

America hasn’t shown much of an appetite for tackling the issue of homelessness, which has become the unspoken elephant in the room in many American cities.  When it comes to public health and disease prevention, however, we’re all in this together, and potential avenues for rapid disease transmission can’t simply be ignored.

I’m hoping that the potentially disastrous implications of coronavirus reaching homeless populations will cause local, state, and federal officials to finally work out a solution that helps the homeless find places that are safe, secure, and healthy, with adequate sanitation facilities and running water.  If we’re going to get a grip on the spread of coronavirus, or the next disease coming down the pike, it’s time to be proactive and to act to protect the vulnerable and the rest of us as well.

Old Shots

Measles has been in the news a lot lately, from a recent New York City public health order requiring mandatory vaccinations in an attempt to stop a measles outbreak in Brooklyn that is (inevitably) being challenged in court, to reports of cases of measles in various places in the U.S., to scary outbreaks in other parts of the world like Europe and the Philippines.

Although measles is typically viewed as a childhood disease, getting it as an adult can be serious business.   And, because measles is a highly contagious condition that can be readily communicated from one person to another through airborne droplets sneezed and coughed out by random strangers in public places — like airport terminals — it’s a concern for people who do a lot of traveling.   Health care officials uniformly identify vaccination as the best defense against contracting a case of measles.  But what should you do if, like me, you got that painful measles shot in the arm or the butt when you were a kid long ago, and your childhood vaccination and immunization records are God knows where?  Do we all need to get another shot?

Here’s some good news:  according to the Centers for Disease Control and Prevention, if you received the Measles, Mumps, and Rubella (MMR) shot that every kid of my generation got as a matter of course, you’re in good shape.  The CDC says that the measles component of the vaccine provides lifelong protection at 93 percent effectiveness even if, like me, you got your shot more than 50 years ago.  And if you were born before 1957, you don’t need to worry about the measles, either, because the vast majority of people living in the pre-1957 world were exposed to measles as kids and have natural immunity to the disease as a result.

It’s weird to think that, in the 21st century, Americans should be worrying about diseases like measles that can be readily controlled by vaccination, but that’s what happens when parents start getting lax about vaccinating their kids — or believing quacks who raise unproven claims about side effects of vaccination.  If you’re not sure about whether you’ve been vaccinated, you really should talk to your doctor.  When it comes to communicable diseases, we’re all in this together.

Is Porn A Public Health Crisis?

Utah’s state legislature has passed a resolution declaring pornography a public health crisis, and yesterday Utah’s governor signed it.

The resolution doesn’t ban pornography in Utah — with the volume of porn available on the internet and through various media outlets, it’s hard to see how that could be accomplished, anyway — but it does seek to highlight what it calls an epidemic.  The resolution says that porn “perpetuates a sexually toxic environment” and “is contributing to the hypersexualisation of teens, and even prepubescent children, in our society,” and speakers at yesterday’s signing ceremony argued that porn also undermines marriages and contributes to sexual aggression.

Utah, which is a majority Mormon state, has long been one of the most socially conservative states in America.  An “adult entertainment” trade group called The Free Speech Coalition said that Utah’s declaration is an “old-fashioned” morals bill that ignores that porn watchers tend to have more progressive views on sexuality and women’s rights and that ready access to porn correlates with a decline in sex crimes.

It’s hard to see how anyone could plausibly argue that pornography is a public health crisis in the same way that, say, the Zika virus or Ebola is.  Porn isn’t randomly striking people down or causing microcephaly or other serious health conditions through mosquito bites, and if there is such a thing as “porn addiction” it sure isn’t as widespread or destructive as alcoholism or drug addiction.  Clearly, there are more serious targets of our public health spending than porn.  And there obviously are free speech concerns at issue, too, that the law has wrestled with since one Justice of the Supreme Court famously declared that he might not be able to craft a legal definition of pornography, but he knew it when he saw it.

Still, I think anyone who pooh-poohs the fact or significance of the increasing prevalence of porn — soft, hard, and even violent — in our society might be missing the point.  “Dirty books” and “dirty movies” have always been around, but they sure are a lot more accessible these days, available with a few clicks of a mouse or TV remote control unit.  Anybody who watches HBO, as we do, can’t help but notice how graphic the depiction of sexual activity and sexual situations has become, and broadcast TV isn’t far behind.

There’s a reason pornography is euphemistically called “adult entertainment.”  Parents have a legitimate interest in protecting their children from exposure to porn until the kids have a chance to learn about sex in a more neutral, less charged, less graphic way.   No one wants their kids to think that the scenarios presented in porn are a normal representation of sexual activity in a loving relationship.  That’s not old-fashioned, it’s common sense.

Pontius Pilate Probably Did It Wrong, Too

Scientists have determined that we’ve all been washing our hands the wrong way.  They say the simple soap up, vigorously rub until lather forms, then rinse method that we’ve been using isn’t very effective at killing the bacteria that collect on our hands.

A study conducted by a university in Scotland concluded that the common three-step method only reduced the “average bacterial count” on hands from “3.08 colony-forming units per milliliter to 2.88.”  The study advocates, instead, for a six-step method that involves the initial soap-up step followed by scrubbing the backs of hands, the backs of fingers, between fingers, then rotational rubbing of your thumbs, and finally the fingers on your opposite hand.  If it sounds complicated, it is:  the study confesses that only 65 percent of people who were given an instruction sheet did it correctly.  The average time to correctly complete the six-step procedure, incidentally, was 42.5 seconds.

But here’s the rub:  after doing the six-step hand fandango, there were still an average of 2.58 colony-forming units of bacteria per milliliter on the study participants’ hands.  In other words, even after you’ve vigorously scrubbed away and performed the “rotational rubbing of your thumbs” for a full 42.5 seconds, more than half of those bacteria that had been on your hands are still there, ready to form a “colony.”
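Reading the study’s numbers on a straight linear scale, as the paragraphs above do (worth flagging as an assumption: hand-hygiene studies often report log10-transformed counts, which would imply much larger reductions), the percentage removed by each method works out like this:

```python
# Percent of bacteria removed by each washing method, treating the
# reported colony-forming-unit counts as a plain linear scale.
# (An assumption: microbiology studies often report log10-transformed
# counts, which would imply much larger reductions.)

BASELINE = 3.08    # CFU/mL before washing
THREE_STEP = 2.88  # CFU/mL after the common three-step wash
SIX_STEP = 2.58    # CFU/mL after the six-step, 42.5-second wash

def percent_removed(before: float, after: float) -> float:
    """Percentage reduction from the pre-wash count."""
    return 100 * (before - after) / before

print(f"Three-step wash removes ~{percent_removed(BASELINE, THREE_STEP):.0f}%")  # ~6%
print(f"Six-step wash removes ~{percent_removed(BASELINE, SIX_STEP):.0f}%")      # ~16%
```

Either way you slice it, the great majority of the bacteria survive the wash, which is exactly the point.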

And that’s not even the worst part.  Standing in front of the sink in a public restroom washing your hands for 42.5 seconds is the functional equivalent of an eternity.  Nobody spends that much time washing their hands — not even Howard Hughes.  If you stood at a sink in a public bathroom for 42.5 seconds aggressively scouring your hands in a lathery storm, any other person who happened to be in the bathroom at the same time would conclude that you are either trying to eliminate DNA evidence after committing murder or on the verge of being committed for raging hypochondria.

So I don’t think I’m going to be spending 42.5 seconds enduring the over-the-top fragrances of hand soaps and giving my thumbs a workout in order to marginally reduce, but come nowhere close to eliminating, the bacteria hanging out on my hands.  I’ll stick with the three-step method, get out of the bathroom within a reasonable time, and just let those hardy surviving bacteria go about their colony-forming business.

Slowing The Aging Process

Mention “aging” to someone in their 50s — like me — and you’re likely to provoke a grim expression.  We feel the aging process in our muscles and bones, we get that ugly twinge after a sudden move, and we see it when we look in the mirror and notice the grey hairs, the wrinkles, and the pathetic turkey neck.

But what if aging could be slowed?  What if therapies and treatments could be developed that would decelerate the ravages of time, or stave them off altogether?

Scientists are looking into the possibility that gene therapy, hormone treatments, and other approaches might have that effect and have been using some of the new treatment concepts in experiments on animals.  Economists believe that treatments that successfully delay aging — and thereby allow people to be productive and healthy longer — could have enormous economic consequences.

Speaking as one of the aging generation, I’m all in favor of seeing whether reasonable treatments can be developed.  At the same time, however, I question whether heroic efforts should be devoted to deferring the effects of aging when there are many other public health issues that also need attention.  And a public health focus on aging makes sense only if the years that are added are healthy, sane, active, non-institutionalized years.  When you regularly visit a nursing home and see how many Americans are living their final years, you can legitimately question whether living longer is inevitably a great thing.

AIDS And Alzheimer’s

The New York Times has a thought-provoking piece contrasting the public health reaction to AIDS with the public health reaction to Alzheimer’s disease.

The article notes that this year AIDS has fallen out of the list of the top 10 causes of death in New York City — replaced by Alzheimer’s.  In fact, the article reports, research now indicates that deaths attributable to the latter disease are grossly underestimated and that it may be responsible for nearly as many deaths in one year as AIDS has been in the more than three decades since its terrible emergence.  And yet, while AIDS research remains a public health focus supported by a robust social movement, there is no similarly active movement lobbying for increased Alzheimer’s research, prevention, and treatment.  Why?

Although the article correctly points out the success of the fight against AIDS as a public health movement, it was not always that way.  In the early days of AIDS, there was a lot of denial and politicization of the underlying health issues, discussed in appalling detail in the excellent book And the Band Played On:  Politics, People, and the AIDS Epidemic, by Randy Shilts.  It wasn’t until people got past the denial and politicization and focused on the awful public health cost of AIDS that effective education, prevention, and ultimately treatment programs were developed.  The fact that the disease was so terrible in its toll, and cut down our friends and family members in the prime of their lives, helped to drive the public health effort.

With Alzheimer’s, the toll of the disease is great, but the catalyzing circumstances that energized the fight against AIDS seem to be lacking.  Alzheimer’s is an affliction primarily of the elderly, who are regarded as already in their twilight years.  It’s a painful and somewhat embarrassing disease for surviving family members to deal with, as the victim gradually loses his mental faculties and all memories of loved ones.  So far as we know, Alzheimer’s is not readily communicable, and we’ve already got facilities in place where those unfortunate souls who become debilitated can be kept and cared for while the disease does its grim and inexorable work.  Those different circumstances, perhaps, explain why Alzheimer’s simply doesn’t command the same kind of attention that AIDS received.

Or, alternatively, it may be that these factors have simply kept Alzheimer’s in the denial stage for a much longer period, and only now are people finally confronting the disease and its awful consequences, which leave formerly vibrant people empty, haunted shells of their former selves.  The aging of the Baby Boom generation no doubt will help to increase awareness and attention.  I hope so, because the clock is ticking, and the prospect of contracting Alzheimer’s should scare the hell out of us.

Mumps On Campus

The Ohio State University is reporting an outbreak of 23 cases of mumps on campus. Eighteen students and one staff member — as well as others with links to the University community — apparently have the disease.

Mumps is one of those diseases, like scarlet fever or measles, that people used to get as kids before vaccines became commonplace. I had mumps when I was a tot, and so did all of the kids in my family. I remember being tired and having a sore throat and swollen glands, but getting to eat ice cream and drink 7-Up and read Archie comic books in bed made it bearable.

We tend to think of childhood diseases as not so serious, and usually they aren’t — at least, not if you get them when you’re a kid. If you get mumps as an adult, however, it can have more serious consequences, including swelling in some tender areas for post-pubescent males. Mumps also is the kind of disease that sounds tailor-made for transmission in a college campus setting. According to the Centers for Disease Control and Prevention:

“Mumps is spread by droplets of saliva or mucus from the mouth, nose, or throat of an infected person, usually when the person coughs, sneezes, or talks. Items used by an infected person, such as soft drink cans or eating utensils, can also be contaminated with the virus, which may spread to others if those items are shared. In addition, the virus may spread when someone with mumps touches items or surfaces without washing their hands and someone else then touches the same surface and rubs their mouth or nose.”

Now, compare that description of mumps transmission to the close quarters and hygiene standards found in the off-campus residences and dorm rooms maintained by college students, and you’ll soon find yourself wondering how big an outbreak of mumps on a college campus could become. (If you’re an Ohio State basketball fan, you also find yourself hoping that all of the members of the team have been vaccinated.)

Which raises one final point: you don’t get mumps if you had it as a kid or you’ve been vaccinated. I thought vaccination for mumps was pretty universal in the United States. An outbreak of 23 cases of mumps suggests that understanding may be unfounded — which is deeply troubling. Aren’t parents getting basic vaccinations for their kids these days? If they aren’t, why not? It makes you wonder if other basic public health steps are being ignored, and what other outbreaks and consequences might lie in store for us as a result.

Polio And “Superbugs”

In Syria, more than a dozen children have fallen prey to the crippling effects of polio.

“Polio?”, you say.  “That terrible affliction that paralyzed thousands of American children each year?  But polio was eradicated by the development of the Salk vaccine.”  Yes, but a vaccine can only work if the shot is delivered.  In war-torn Syria, some children aren’t receiving their vaccinations — and the polio virus is still out there, lurking and ready to spread its infection that, for some unlucky few, will produce paralysis.

The story of the Syrian children is a reminder of the thin line of defense that protects humans from illness caused by bacteria, microbes, and viruses.  It’s a timely reminder, too, because the Centers for Disease Control and Prevention and other world health organizations are increasingly concerned about the development of “superbugs” — bacteria that have developed resistance to treatment because antibiotics are being overused.  The CDC estimates that more than 2 million Americans get antibiotic-resistant infections each year, and at least 23,000 die because drugs no longer stop their infections from spreading.  The two most dangerous “superbugs” in America are CRE bacteria, which produce deadly, raging infections, and Clostridium difficile, which produces diarrhea that kills thousands each year.  The CDC’s European health counterpart is reporting on outbreaks of other antibiotic-resistant illnesses in some European countries.

This is one of those stories that don’t get much attention because it isn’t threatening to most of us — at least, not right now.  But the spread of “superbugs,” and the overuse of antibiotics that often kill “good” bacteria that are found in every human, are an enormously important public health issue.  We need to stop the overuse of antibiotics that have contributed to the development of drug-resistant bacteria and focus on developing new vaccines and forms of treatment to fight the superbugs.  Otherwise, one day we might wake up to find that the stout antibiotic line of defense that has protected humans from all manner of deadly diseases is simply gone.

Hands Off Our Coke! (And Pepsi)

California is at it again.  It has determined that because the caramel coloring used in Coke and Pepsi includes a substance that a study has found causes cancer in mice, the soft drinks need to include a cancer warning label.  Not surprisingly, Coke and Pepsi have decided instead to change their recipes — and because it would be more costly to just change the recipe for soda sold in California, the recipes will be changed for all Coke and Pepsi products sold in the U.S.

What’s that, you say?  You haven’t noticed that the soft drink-guzzling Americans you see on the street, who have been swilling Coke and Pepsi on a daily basis for decades, have turned into tumorous monstrosities?  That’s because the study on which California’s determination is based deals with tumors in mice, not people.  What’s more, the Food and Drug Administration states that a human would need to drink more than a thousand cans of Coke and Pepsi a day to equal the dose administered to the mice in the study. Even the most slothful, couch-bound, Coke-addicted video game geek couldn’t approach such levels.

This latest action by California is another example of our regulatory state run amok. Studies, no doubt funded in part by tax dollars, test substances on rodents at ludicrous exposure levels and find increased incidence of cancer, which is not surprising because gross overexposure to just about anything — including water — can be harmful.  Then, “consumer advocacy groups” use the study results to start the drumbeat to ban the substance, advancing the dubious argument that because absurd exposure levels are associated with increased cancer incidence in mice, any exposure at any level increases the risk of cancer in humans.  Then, nanny states like California issue edicts like the one directed to Coke and Pepsi and manufacturers have to change what they are doing, thereby increasing costs and messing with products that Americans have used for years without any problem.

At some point, I hope, people will wake up to the sham nature of such “public health” findings and demand that states like California reserve their intrusive regulations for those rare cases that raise real public health issues — ones that don’t assume consumers quaff 1,000 cans of Coke a day.  Until then, hands off our Coke!

Eat Like An Egyptian

Scientists have performed X-rays and other scans on Egyptian mummies and have determined that ancient Egyptians experienced clogged arteries and heart disease, just like modern Americans do. The mummies that were examined as part of the study were of upper-class social status, which meant they ate meat and had richer-than-normal diets that were similar to those of modern Americans.

The results may mean that the modern activities which often are cited as causal factors for heart disease — such as smoking, eating processed foods, and leading sedentary lifestyles — in fact aren’t significant causes of heart disease at all. Instead, the root causes may be genetic.

H1N1 In The Air

I had to fly yesterday on business, and the dire warnings about a new, more virulent outbreak of H1N1 certainly make air travel more exciting.  Any time you are in an enclosed area with a bunch of people you are bound to hear a certain number of coughs, sneezes, and sniffles, and if you are paranoid you wonder whether any public surface you touch — like the armrest on a chair in an airport waiting lounge — has just been exposed to the drippings from a kid’s runny nose.

Most Americans seem to ignore health warnings, or at least don’t let them affect their everyday lives.  I don’t know whether many Americans are taking H1N1 more seriously, but on the plane yesterday I did see an older woman wearing one of those white masks.  She may have been trying to avoid infecting people or trying to avoid being infected.  In any case, it was an unusual, and somewhat unnerving, sight to see.