Grip Evolution

Here’s another story to add to the slew of news articles about general health trends:  human beings, on average, are getting weaker.  In this case, the indicator is grip strength — that is, how much holding and squeezing force a person can generate with just the fingers of their hand.  Recent studies have indicated that grip strength has declined significantly in just the last 30 years.

So what, you might ask?  You’re less likely to encounter the guys who give you a bone-crushing handshake, and you don’t see people walking around flexing those hand exercisers anymore.  What’s the big deal?  The big deal is this:  grip strength is one of those inverse health indicators lurking in the human body, with lower grip strength associated with increased mortality from all causes and cardiovascular mortality in particular.  And, especially for those of us who are getting up there, grip strength is a key indicator of sarcopenia, the loss of muscle that occurs as we age, and may also indicate issues with cognitive performance.

Why is grip strength declining?  Of course, gripping is a key part of the evolution of Homo sapiens — whose distant ancestors needed a strong grip when they were swinging through trees, and whose more recent predecessors used their hands to create and then wield tools and weapons that allowed them to survive predators and gather food.  In short, humans needed that strong grip to make it through the natural selection melee and emerge at the top of the evolutionary pyramid.  But in recent years, the need for hand strength at home or on the job has declined.  White collar workers need hand dexterity as they tap away at computers, not hand strength, and even blue collar workers now use automatic tools that don’t require the kind of personal strength that hand wrenches of the past, for example, demanded.  Mix those factors in with a general decline in fitness and increase in obesity, and you’ve gone a long way toward explaining why human beings increasingly are becoming a bunch of unhealthy softies.

In short, as a species humans may be losing their grip.  It’s not a positive development.


Living In A Time-Free Zone

It’s June 21, which means it’s officially summer.  (Those of us in the rainy, cool Midwest may be forgiven for not recognizing that.)  June 21 also means the summer solstice has arrived and therefore, in the northern hemisphere, it’s the longest day and shortest night of the year.

Some of the northernmost cities of the globe have already been enjoying days where the sun never sets.  In Sommaroy, a Norwegian island that is north of the Arctic Circle, the sun doesn’t set for more than two months — from May 18 to July 26.  And during that period of constant daylight, the islanders don’t exactly follow conventional concepts of time.  In the early a.m. hours, when most of us are abed, Sommaroy residents are likely to be out doing activities that we associate with late morning or afternoon.  In part, that’s to compensate for the fact that, from November to January, Sommaroy doesn’t get any sunlight at all — but the practices of the islanders during this time period also recognize that standard concepts of time, set by a daily sunrise and sunset, really don’t apply when you have 24 hours of constant daylight.

Now Sommaroy residents want the Norwegian government to recognize their practices officially, and declare Sommaroy a “time-free zone” during the constant daylight period, which would allow businesses and schools to have flexibility in their hours of operation.  Visitors to Sommaroy during this period are encouraged to acknowledge the “time-free” concept by leaving their watches on the bridge that connects the island to the mainland.

Many of us live lives that are governed, to a certain extent, by the clock.  We get up, eat, work, watch TV, and go to bed on a schedule that is derived, in large part, from the rhythms established by the sun.  What would it be like to live in a place where there was constant sun — or for that matter, no sun — and therefore no standard concept of time?  Would you still follow a schedule, or would you simply sleep when you wanted, eat when you wanted, and work when you felt you had to, without regard to the tyrannical clock?

Most of us don’t have to think about that, because we don’t live in places where there is constant sunlight, or constant darkness, for any part of the year.  But if humans venture into space, and take years-long interstellar voyages or live underground on inhospitable planets and moons where sunrise and sunset are not daily occurrences, our prevailing notions of time will be put to the test.  In a way, our time-free friends on Sommaroy may be giving us a peek into what human lives might be like in the future.

Emphasis Added

Anyone who does much writing will eventually confront the question of the best way to give emphasis to a particular word or phrase in what they have written.  Maybe it’s a desire to attach special significance to part of a quote, or a need to make absolutely sure that the reader doesn’t miss a central point — but the time will come where, to be on the safe side, emphasis must be added.

So, what’s the best way to emphasize the written word?  The basic options, currently, are using underlines, italics, or boldface.  Some people then use a combination of the three to give even more emphasis.  (Back when I first started working, in the days long before social media and texting, some people used all caps to provide emphasis.  Now the all-caps look is generally perceived by the reader as screaming, and there’s very little being written about that needs that much emphasis.  What you want is for the reader’s internal voice to “think” the word being emphasized just a bit louder than the rest of the text, and not have them mentally screaming like a character in a bad teen horror movie.)

My emphasis tastes vary depending on what I’m writing.  For blog entries like this one, I prefer to use italics to give a word that special nudge.  For legal briefs, however, where case names are italicized and section headings are in bold print, I tend to use simple underlining to emphasize specific text.  That way, there’s no mixing up the message.

And I don’t like using various combinations of bold, italics, and underlining to give extra-special emphasis to certain words or passages.  For one thing, I think random mixtures of “emphasis-adders” are confusing to the reader; they suggest that there is some emphasis hierarchy that the reader hasn’t been told about, which may leave them wondering about relative emphasis rather than concentrating on what is written.  (“Let’s see — is don’t supposed to get more emphasis than don’t, or is it the other way around?”)  And using multiple combinations for some words seems to devalue the words that merit only a single emphasizer.  I think emphasis-adders should be used sparingly, and if you’ve got to use combinations you’re probably overdoing emphasis to the point where the message is being lost.  You might want to think about editing your sentences to be shorter and clearer, instead.  Plus, the use of random combinations of emphasizers makes the printed page look messy, like a riotous fruit salad.

So, my rule of thumb on adding emphasis is to stick to one — and only one — technique, and to use it sparingly.  If you write clearly, you’ll be just fine with that.

The Power Of No

We’ve all got friends who seem to be absurdly stressed, all the time.  They’re constantly harried, rushing from one important commitment to another, complaining all the while about how incredibly busy they are.  They’ve got their jobs, of course, but also a number of other activities and obligations piled up on top of their work, occupying pretty much every minute of every day.

If only they’d learned to say “no”!

Over the weekend The Guardian published an interesting article about saying no.  The article points out that people who are miserably overcommitted aren’t powerless — they can directly affect their situations by carefully considering their own interests and saying no to things that they really don’t want, or need, to do.  By declining unwanted invitations, and shedding obligations that aren’t really rewarding or essential, they free up time to do what they actually want to do with people they really like.  And, as a result, the stress level goes down and the enjoyment of life goes up.

This recommendation mirrors my own experience.  Some years ago I realized that, with work, charity involvements, and other obligations, I wasn’t enjoying much free time — on weekends, or otherwise.  I looked at what I was doing and decided I needed to lighten my load, and then I went through my commitments and decided which ones could reasonably be eliminated — and then I eliminated them.  When I did that, I felt like a weight was lifted from my shoulders and my free time was multiplied, and I’ve never regretted doing it.

I do disagree with The Guardian article in this sense:  it suggests that most of the over-busy folks are people-pleasers who feel they just have to say yes.  I’m sure there are people in that category, but I think there are two other categories at play.  One is people who want to help and make a contribution, and just find out that they can’t manage all of the obligations they’ve assumed.  The other is people who perversely like projecting to others how busy they are.  The first category just needs to understand the power of saying no.  The second category doesn’t want to say no.

“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at one point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.

Working For The Three-Day Weekend

In the distant, early days of Homo sapiens, there was no concept of “work” in the modern sense, and thus there were no holidays, either. Every day involved its many toils, from hunting and gathering to working to find shelter and water and protection against predators.

Then, as civilization developed, designated jobs became an inevitable part of the process. No city could exist without people charged with performing essential functions like laboring in the fields to bring in the crops, delivering food from the countryside, serving as scribe for Pharaoh, or building the new pyramid or ziggurat.  The concept of holidays came later still. First, there were only religious holidays or seasonal holidays, to mark the Feast Day of Set or commemorate the harvest with a day of celebration. In the medieval era, when a saint’s day arrived, the duties of the job were replaced by lengthy religious obligations and, perhaps, fasting and the ritual wearing of a hair shirt.  It wasn’t exactly a laugh riot.

As humanity advanced even more, the concept of a work week was introduced and, then, secular holidays. When some brilliant soul realized that secular holidays really didn’t have to be tied to a specific date on the calendar and instead could float — so that the holiday could combine with a normal weekend to create a three-day weekend — it was a huge step forward in human development. And when an even more enlightened individual realized that we could use those three-day weekends to bookend the summer months, so that the joys of summer could begin with a glorious three-day revel in the warmth, it marked a true pinnacle in the annals of human achievement.

As we celebrate the joys of this three-day Memorial Day weekend, let’s remember those forgotten figures of human history who came up with the ideas that led us here — and be grateful that wearing sweaty hair shirts isn’t part of the equation.

The Impending Dash Clash

The apostrophe battle has been amicably settled.

spaced-em-dash2After some sternly worded exchanges, with many grammarians and wannabe English stylists weighing in, the B.A. Jersey Girl found an authoritative source that was able to bridge the gap between our competing positions and resolve the dispute.  She discovered that Bryan Garner’s Redbook:  A Manual on Legal Style acknowledged that while many style manuals follow the rule that always requires an apostrophe s to indicate a possessive, former journalists follow the Associated Press Style Manual and don’t add an apostrophe s when the word in question ends in s.  In short, both sides have a basis for their opinion, so we shook and decided to leave that issue behind.

Alas, a new punctuation fight looms directly ahead.  The virgin battleground involves something called an “em dash”—this super-long dash that, according to some grammarians, can be used as a substitute for a parentheses, can replace appositives that contain commas, and can be used to set off a sudden change in the direction of a sentence, among other uses.  It’s called the “em dash” because the length of the dash is about the same width as a capital M.

I’m all for adding a little dash to writing, but I’m not a fan of the “em dash” because it’s too long and is used without spaces on either side.  I’m a proponent of the dash that is formed with two hyphens and a space on both sides.  I think it looks neater and more orderly, whereas the “em dash” looks like a spear that is impaling the neighboring words.  I’m a fan of space in writing, and the “em dash” makes a sentence look crowded.  I say tap the space bar, give your words space to breathe, and let the “em dash” be damned.