Grip Evolution

Here’s another story to add to the slew of news articles about general health trends:  human beings, on average, are getting weaker.  In this case, the indicator is grip strength — that is, how much holding and squeezing force can a person generate with just the fingers of their hand.  Recent studies have indicated that grip strength has declined significantly, even in the last 30 years.

So what, you might ask?  You’re less likely to encounter the guys who give you a bone-crushing handshake, and you don’t see people walking around flexing those hand exercisers anymore.  What’s the big deal?  The big deal is this:  grip strength is one of those inverse health indicators lurking in the human body, with lower grip strength associated with increased mortality from all causes and cardiovascular mortality in particular.  And, especially for those of us who are getting up there, grip strength is a key indicator of sarcopenia, the loss of muscle that occurs as we age, and may also indicate issues with cognitive performance.

Why is grip strength declining?  Of course, gripping is a key part of the evolution of Homo sapiens — whose distant ancestors needed a strong grip when they were swinging through trees, and whose more recent predecessors used their hands to create and then wield the tools and weapons that allowed them to survive predators and gather food.  In short, humans needed that strong grip to make it through the natural selection melee and emerge at the top of the evolutionary pyramid.  But in recent years, the need for hand strength at home or on the job has declined.  White collar workers need hand dexterity as they tap away at computers, not hand strength, and even blue collar workers now use automatic tools that don’t require the kind of personal strength that, for example, the hand wrenches of the past demanded.  Mix those factors in with a general decline in fitness and an increase in obesity, and you’ve gone a long way toward explaining why human beings increasingly are becoming a bunch of unhealthy softies.

In short, as a species humans may be losing their grip.  It’s not a positive development.

Foodie Calls

Two recent surveys have identified what is being depicted as a “new trend” on the dating scene:  the “foodie call.”  It happens when one person goes out with another person whom they really aren’t that interested in — just to get a free meal.

The two surveys of heterosexual women were conducted by Azusa Pacific University and the University of California-Merced, and the results were published in the journal of the Society for Personality and Social Psychology.  The participants were asked questions about their personalities, their views on gender roles, and their views on, and personal histories with, “foodie calls.”  In one survey, one third of the respondents admitted to going out on a date just to get a free meal, and in the second survey 23 percent of the study group admitted to a “foodie call.”  The research also found that the majority of respondents were aghast at the concept of a “foodie call” and believed it to be moderately to extremely unacceptable.

What are we to make of “foodie calls”?  Speaking as someone who enjoys a good meal from time to time, I don’t think being motivated, in whole or in part, to go out on a date to get a good meal is incredibly egregious behavior.  I also think, however, that people who go on “foodie calls” might be selling themselves short, and I wonder if they ultimately find the meals very satisfying.  Spending two or three hours with somebody you really have no interest in and making cheery chit-chat that entire time would be exhausting, and is a pretty high price to pay for some fine dining.  Meals are supposed to be a pleasant, shared experience, and having to work hard to maintain a conversation would tend to interfere with your enjoyment of the cuisine.

As for the guys who’ve paid for the “foodie calls” — well, if the person you’ve asked out starts negotiating with you about which restaurants would be acceptable destinations for the date, you might just want to be on guard.

Breaking The Bad News

On the TV show House, House’s oncologist pal Wilson was reputed to be so humane and caring when giving patients bad news about their condition that, when he was done, patients actually thanked him.  Studies indicate, however, that there aren’t a lot of Wilsons out there in the medical profession.  Instead, many doctors botch one of the most important parts of their job — giving patients truthful information about their medical condition when the diagnosis is grim.

Telling patients that they have untreatable cancer, or some other fatal disease, clearly is one of the toughest parts of a doctor’s job — and research indicates that doctors just aren’t very good at it.  Some doctors will break the bad news indirectly or use medical jargon that leaves the patient confused, others will do it with brutal directness, and still others will sugarcoat the news with treatment options.  As a result, many cancer patients aren’t well informed about their actual condition, and their prospects.  A 2016 study found that only five percent of cancer patients understood their prognoses well enough to make informed decisions about their care.

Why are doctors so inept at giving patients bad news about their condition?  Of course, it’s incredibly hard to be the bearer of bad tidings, especially when the bad news is about a fatal illness, but there’s more to it than that.  Communication skills apparently aren’t emphasized at medical schools, and many doctors see a diagnosis of an incurable disease as a kind of personal failure on their part.

It’s interesting that, in a profession so associated with the phrase “bedside manner,” so many doctors regularly mishandle what is arguably the most important part of their job and so few medical schools make sure that their graduates are equipped to handle that task in a genuine, caring, and understandable way.  I hope I never receive a devastating diagnosis, but if I do I hope it comes from a doctor who knows how to break the bad news.

Travel Guilt

If you’ve got a big trip planned for this year, should you cancel it?  Should you refrain from traveling at all, because of the impact that your share of carbon emissions from the plane flight may be having on Arctic sea ice, or rising sea levels?

That’s the question posed by a curious New York Times article earlier this week.  The author wrings his hands about the issue, caught between a desire to broaden his horizons by seeing the world and his professed guilt that his travel interests are selfish and evil because they may be affecting global climate change.  After quoting lots of statistics about the potential impact of one person’s activities, and envisioning being glared at by a hungry polar bear while pondering his contribution toward disappearing Arctic ice, the author notes that he’s still going to take a trip to Greece and Paris, but only after he’s purchased enough “carbon offsets” to “capture the annual methane emanations of a dozen cows.”

The Times article notes that, in 2016, two climatologists published a paper that concluded that there is a direct relation between carbon emissions and the melting of Arctic sea ice, and that “each additional metric ton of carbon dioxide or its equivalent — your share of the emissions on a cross-country flight one-way from New York to Los Angeles — shrinks the summer sea ice cover by 3 square meters, or 32 square feet.”  Taking a cruise isn’t the answer, either; the article says that cruise ships produce three or four times the pollution produced by jets.  Even worse, the article states that just by being an average American we’re harming and even killing fellow human beings, and quotes a determination somehow made by a University of Tennessee professor, who concluded:  “The average American causes through his/her greenhouse gas emissions the serious suffering and/or deaths of two future people.”
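For what it’s worth, the arithmetic inside that quotation at least hangs together on its own terms.  A quick back-of-the-envelope check of the unit conversion, using only the figures the article itself supplies:

$$
3\ \text{m}^2 \times \left(3.281\ \tfrac{\text{ft}}{\text{m}}\right)^{2} \approx 3 \times 10.76\ \text{ft}^2 \approx 32.3\ \text{ft}^2
$$

So the quoted “32 square feet” is simply the metric figure converted, and by the same logic a round-trip New York to Los Angeles flight (roughly two metric tons of carbon dioxide per passenger, if the article’s one-way share is doubled) would correspond to about 6 square meters of summer sea ice.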

So, should we just stay huddled in our houses with the lights turned off, so as to minimize our personal contribution to potential global catastrophe?  I won’t be doing that.  I like leisure travel, and unlike the Times writer, I’m not wracked with guilt about it.  I’m quite skeptical of any calculation that purports to show that, given all of the huge, overarching factors that can affect the Earth’s climate, such as sunspot cycles, solar flares, ocean currents, and wind systems, the activity of an “average American” can be isolated and found to have a direct, measurable impact on climate.  Science has endured a lot of black eyes lately, with research and calculations shown to be inaccurate and, in some instances, politically motivated, and I’m just not willing to accept unquestioningly that going to visit my sister-in-law in California will melt 32 square feet of Arctic sea ice.  I also question how the activities of an “average American” are calculated, or how a walk-to-work person like me compares to the carbon footprint of the “average.”

So, I guess you can call me selfish, because I do want to see more of the world and experience the wonders of faraway places.  But don’t just ask me — ask the places that travelers visit if they’d rather not receive the infusions of cash, and the jobs created, that come from being a tourist destination.  If we’re going to be doing impossibly complex calculations of benefits and harm, how about throwing the economic and cultural benefits that flow from travel into the equation?

“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.

Drawing An Unscientific Maggot Line

I have a high regard for scientists . . . generally.  But sometimes scientists don’t exactly have a solid appreciation of the sensibilities of normal human beings.

Consider, for example, this report on the work of scientists at the University of Queensland in Brisbane, Australia.  They conclude that, given the size of the world’s population, humans need to start turning to alternative sources of protein besides animal meat.  The article linked above quotes “meat science professor Dr. Louwrens Hoffman” — apparently “meat science” is a discipline that has been developed since I was in college, because otherwise that would have been a pretty darned tempting major — as saying:  “An overpopulated world is going to struggle to find enough protein unless people are willing to open their minds, and stomachs, to a much broader notion of food.”

So far, so good.  But Dr. Hoffman and his team at the University of Queensland are looking to replace beef and chicken and pork with — gulp! — maggots and locusts.  They reason that the world’s insect population is a far more sustainable source of such protein.  They also recognize that most people rebel at the notion of consuming chitinous locusts or squirmy maggots, so they are working on developing “prepared foods” that include locusts and maggots as disguised ingredients.  So far, they’ve produced a maggot sausage with promising results, and Dr. Hoffman swears that a student has developed an insect ice cream that is “very tasty.”  Who knows?  Soon you may be able to have an ice cream cone with a scoop of vanilla and a scoop of “insect.”

According to the article, there are already some insect-based products available in the U.S., such as Chirps chips and Chapul protein bars.  I haven’t had any of these items, and I haven’t noticed them flying off the shelves at the neighborhood grocery store, either.

There’s a basic repulsion issue involved in eating maggots.  With a nod to the French government’s defense strategy before World War II, you might call it The Maggot Line, and science-based arguments aren’t going to cross it.  I think the real issue with insect-based foods is whether ingredient lists on food packaging are required to accurately and clearly disclose the insect element.  If maggots can be called by their scientific names — which are Lucilia sericata and Phaenicia sericata — and jumbled in with the other scientific-sounding ingredients for prepared foods, like sodium benzoate and monosodium glutamate, then maggot sausage might stand a chance.  But if the packaging has to use plain English and disclose maggots as an ingredient, forget it.

A Heady Whiff Of Conference Room Air

It’s long been a standing joke that big office meetings — especially those that feature lengthy PowerPoint presentations — do nothing but make everyone in attendance dumber.  Now it looks like (gulp!) there’s some scientific evidence that the jest just might have more than a kernel of truth to it.

Conference room meetings involve two factors that don’t necessarily go well together:  living human beings, and closed spaces.  The human beings breathe in oxygen and exhale carbon dioxide, and the closed spaces prevent the air in the conference room from circulating.  Indeed, modern buildings are a lot more insulated and better at keeping outdoor air outside, and indoor air inside.  That means that, if you’re in a conference room meeting with lots of other people, as time goes on the carbon dioxide generated by the breathing process will accumulate and the percentage of carbon dioxide in the air will increase.

Studies have shown that breathing air with carbon dioxide concentrations that are too high — much higher than you could expect to find at even the longest, most deadly office meeting — can have clear negative effects on the brain.  The impact includes stifled interaction between different regions of the brain, reduced neuronal activity, and dilated blood vessels in the brain.  Now, scientists are starting to look at the effects of exposure to air with lower carbon dioxide concentrations, like what you might find in a closed-door meeting in a conference room, and what they’re finding indicates that the old joke just might mirror reality.  The studies show that, as the carbon dioxide levels in indoor air increase, human performance on tests designed to measure higher-end intellectual qualities like strategy and initiative declines.

So what can you do, other than avoiding large-scale meetings?  One answer is to increase the ventilation rate in modern buildings, but that’s not something that most of us can readily control.  Other options are to open a window — if you’re in one of the incredibly rare conference rooms that actually has one — or even a door.  Keeping all-hands meetings as short as possible will help, too.  And there’s always the option we used to urge teachers to adopt on a beautiful spring day — have class outside.

The bottom line is that people who work in office buildings, as many of us do, need to be sensitive to getting outside where the tools of nature — trees, plants and cool breezes — have had a chance to scrub the air and return carbon dioxide levels to normal.  It turns out that getting out of closed cubicles and into the fresh air outside isn’t just good for the soul, it’s good for the brain, too.