Selfie Psychosis

We are learning more and more about people who have a “selfie” obsession.  We know that people taking selfies are at greater risk of having serious, and even fatal, accidents because they are oblivious to their surroundings while they are taking pictures of themselves on streets or, say, at the edge of the Grand Canyon.  We’ve also seen evidence that people who take selfies are so self-absorbed that they don’t show the decency and sensitivity you typically would expect from a fellow human being.

Now new research is indicating what seems like a pretty obvious conclusion:  people who take selfies are more likely to undergo plastic surgery.  The connection is even stronger if the selfies are taken with filters, or if the posters regularly take down selfie postings that they later conclude aren’t very flattering.  Cosmetic surgeons are reporting that members of the selfie crowd are coming to their offices with selfies in which their features have been digitally altered and asking the doctor to change their appearance to match the altered image.

It shouldn’t come as a surprise, I suppose, that people who take selfies are narcissistic and are interested in changing their appearance to try to reach their own definition of personal perfection.  After all, if you spend your time constantly looking at your own pouting face, you’re bound to notice a few imperfections to be cleaned up.  The selfie-obsessed also tend to compare their selfies with the countless other selfies that appear on social media feeds and find their looks wanting.

As one of the plastic surgeons quoted in the article linked above notes, that’s not healthy behavior.  It’s the kind of behavior that those of us who don’t take selfies, and indeed don’t particularly like having our photos taken at all, just can’t understand.

But we’ll have to, because the selfie epidemic seems to be getting worse, not better.  Researchers estimate that 650 million selfies are posted every day on social media.  That’s a lot of potential plastic surgery.

Why Opposable Thumbs Exist

Why do opposable thumbs exist in humans and other primates?  Scientists generally agree that the appearance of the opposable thumb was a key evolutionary point in the development of our species.  It is what allowed primates to grip and climb and move into the trees, away from the realm of large predators looking for a meal.  Opposable thumbs also proved to be pretty handy from a toolmaking and tool-using perspective, whether the tool was a stick to be manipulated or a rudimentary axe.

All of this is true.  Curiously, however, scientists haven’t fully explored whether the opposable thumb developed in anticipation that modern humans who are too cheap to buy a nozzle for their garden hose might need the thumb to water their yard and plants on a beastly hot summer day.  Sure, the opposable thumb might not have evolved specifically for watering and hose wielding, but it sure works well for that purpose — whether you want to generate a gentle sprinkle or a high-velocity jet to reach the side of the yard beyond the length of the hose.

How do we know for sure that our distant ancestors weren’t big on watering?

Grip Evolution

Here’s another story to add to the slew of news articles about general health trends:  human beings, on average, are getting weaker.  In this case, the indicator is grip strength — that is, how much holding and squeezing force a person can generate with just the fingers of their hand.  Recent studies have indicated that grip strength has declined significantly, even in the last 30 years.

So what, you might ask?  You’re less likely to encounter the guys who give you a bone-crushing handshake, and you don’t see people walking around flexing those hand exercisers anymore.  What’s the big deal?  The big deal is this:  grip strength is one of those inverse health indicators lurking in the human body, with lower grip strength associated with increased mortality from all causes and cardiovascular mortality in particular.  And, especially for those of us who are getting up there, grip strength is a key indicator of sarcopenia, the loss of muscle that occurs as we age, and may also indicate issues with cognitive performance.

Why is grip strength declining?  Of course, gripping is a key part of the evolution of Homo sapiens — whose distant ancestors needed a strong grip when they were swinging through trees, and whose more recent predecessors used their hands to create and then wield tools and weapons that allowed them to survive predators and gather food.  In short, humans needed that strong grip to make it through the natural selection melee and emerge at the top of the evolutionary pyramid.  But in recent years, the need for hand strength at home or on the job has declined.  White-collar workers need hand dexterity as they tap away at computers, not hand strength, and even blue-collar workers now use automatic tools that don’t demand the kind of personal strength that hand wrenches of the past, for example, required.  Mix those factors in with a general decline in fitness and an increase in obesity, and you’ve gone a long way toward explaining why human beings increasingly are becoming a bunch of unhealthy softies.

In short, as a species humans may be losing their grip.  It’s not a positive development.

Foodie Calls

Two recent surveys have identified what is being depicted as a “new trend” on the dating scene:  the “foodie call.”  It happens when one person goes out with another person that they really aren’t that interested in — just to get a free meal.

The two surveys of heterosexual women were conducted by Azusa Pacific University and the University of California-Merced, and the results were published in the journal of the Society for Personality and Social Psychology.  The participants were asked questions about their personalities, their views on gender roles, and their views on, and personal histories with, “foodie calls.”  In one survey, one third of the respondents admitted to going out on a date just to get a free meal, and in the second survey 23 percent of the study group admitted to a “foodie call.”  The research also found that the majority of respondents were aghast at the concept of a “foodie call” and believed it to be moderately to extremely unacceptable.

What are we to make of “foodie calls”?  Speaking as someone who enjoys a good meal from time to time, I don’t think being motivated, in whole or in part, to go out on a date to get a good meal is incredibly egregious behavior.  I also think, however, that people who go on “foodie calls” might be selling themselves short, and I wonder if they ultimately find the meals very satisfying.  Spending two or three hours with somebody you really have no interest in and making cheery chit-chat that entire time would be exhausting, and is a pretty high price to pay for some fine dining.  Meals are supposed to be a pleasant, shared experience, and having to work hard to maintain a conversation would tend to interfere with your enjoyment of the cuisine.

As for the guys who’ve paid for the “foodie calls” — well, if the person you’ve asked out starts negotiating with you about the only restaurants that would be acceptable destinations for the date, you might just want to be on guard.

Breaking The Bad News

On the TV show House, House’s oncologist pal Wilson was reputed to be so humane and caring when giving patients bad news about their condition that, when he was done, patients actually thanked him.  Studies indicate, however, that there aren’t a lot of Wilsons out there in the medical profession.  Instead, many doctors botch one of the most important parts of their job — giving patients truthful information about their medical condition when the diagnosis is grim.

Telling patients that they have untreatable cancer, or some other fatal disease, clearly is one of the toughest parts of a doctor’s job — and research indicates that doctors just aren’t very good at it.  Some doctors will break the bad news indirectly or use medical jargon that leaves the patient confused, others will do it with brutal directness, and still others will sugarcoat the news with treatment options.  As a result, many cancer patients aren’t well informed about their actual condition, and their prospects. A 2016 study found that only five percent of cancer patients understood their prognoses well enough to make informed decisions about their care.

Why are doctors so inept at giving patients bad news about their condition?  Of course, it’s incredibly hard to be the bearer of bad tidings, especially when the bad news is about a fatal illness, but there’s more to it than that.  Communication skills apparently aren’t emphasized at medical schools, and many doctors see a diagnosis of an incurable disease as a kind of personal failure on their part.

It’s interesting that, in a profession so associated with the phrase “bedside manner,” so many doctors regularly mishandle what is arguably the most important part of their job and so few medical schools make sure that their graduates are equipped to handle that task in a genuine, caring, and understandable way.  I hope I never receive a devastating diagnosis, but if I do I hope it comes from a doctor who knows how to break the bad news.

Travel Guilt

If you’ve got a big trip planned for this year, should you cancel it?  Should you refrain from traveling at all, because of the impact that your share of carbon emissions from the plane flight may be having on Arctic sea ice, or rising sea levels?

That’s the question posed by a curious New York Times article earlier this week.  The author wrings his hands about the issue, caught between a desire to broaden his horizons by seeing the world and his professed guilt that his travel interests are selfish and evil because they may be affecting global climate change.  After quoting lots of statistics about the potential impact of one person’s activities, and envisioning being glared at by a hungry polar bear while pondering his contribution toward disappearing Arctic ice, the author notes that he’s still going to take a trip to Greece and Paris, but only after he’s purchased enough “carbon offsets” to “capture the annual methane emanations of a dozen cows.”

The Times article notes that, in 2016, two climatologists published a paper that concluded that there is a direct relationship between carbon emissions and the melting of Arctic sea ice, and “each additional metric ton of carbon dioxide or its equivalent — your share of the emissions on a cross-country flight one-way from New York to Los Angeles — shrinks the summer sea ice cover by 3 square meters, or 32 square feet.”  Taking a cruise isn’t the answer, either; the article says that cruise ships produce three or four times the pollution produced by jets.  Even worse, the article states that just by being an average American we’re harming and even killing fellow human beings, and quotes a determination somehow made by a University of Tennessee professor, who concluded:  “The average American causes through his/her greenhouse gas emissions the serious suffering and/or deaths of two future people.”
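
For what it’s worth, the arithmetic behind the quoted claim is simple enough to sketch.  Here is a minimal back-of-envelope calculation in Python, assuming the paper’s quoted figure of about 3 square meters of summer sea ice lost per metric ton of CO2, and treating one passenger’s share of a one-way cross-country flight as roughly one metric ton, as the quote implies — these are the article’s numbers, not independently verified.

```python
# Back-of-envelope sketch of the arithmetic quoted in the Times piece.
# Assumptions (taken from the quoted claim, not verified here):
#   - ~3 square meters of summer sea ice lost per metric ton of CO2-equivalent
#   - a one-way NY-to-LA flight works out to roughly 1 metric ton per passenger

SQ_METERS_PER_TON = 3.0          # quoted sensitivity: m^2 of sea ice per metric ton CO2
SQ_FEET_PER_SQ_METER = 10.7639   # unit conversion

def sea_ice_loss(tons_co2: float) -> tuple[float, float]:
    """Return the estimated summer sea-ice loss in (square meters, square feet)."""
    m2 = tons_co2 * SQ_METERS_PER_TON
    return m2, m2 * SQ_FEET_PER_SQ_METER

if __name__ == "__main__":
    # Hypothetical example: one passenger's share of a one-way cross-country
    # flight, taken to be about 1 metric ton of CO2-equivalent per the quote.
    m2, ft2 = sea_ice_loss(1.0)
    print(f"~{m2:.0f} m^2 (~{ft2:.0f} ft^2) of summer sea ice")  # ~3 m^2 (~32 ft^2)
```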

So, should we just stay huddled in our houses with the lights turned off, so as to minimize our personal contribution to potential global catastrophe?  I won’t be doing that.  I like leisure travel, and unlike the Times writer, I’m not wracked with guilt about it.  I’m quite skeptical of any calculation that purports to show that, in view of all of the huge, overarching factors that can affect the Earth’s climate, such as sunspot cycles, solar flares, ocean currents, and wind systems, the activity of an “average American” can be isolated and found to have a direct, measurable impact on climate.  Science has endured a lot of black eyes lately, with research and calculations shown to be inaccurate and, in some instances, politically motivated, and I’m just not willing to accept unquestioningly that going to visit my sister-in-law in California will melt 32 square feet of Arctic sea ice.  I also question how the activities of an “average American” are calculated, and how the carbon footprint of a walk-to-work person like me compares to that “average.”

So, I guess you can call me selfish, because I do want to see more of the world and experience the wonders of faraway places.  But don’t just ask me — ask the places that travelers visit whether they’d rather not receive the infusions of cash, and the jobs created, that come from being a tourist destination.  If we’re going to be doing impossibly complex calculations of benefits and harm, how about throwing the economic and cultural benefits that flow from travel into the equation?

“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.