“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some particular point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, alongside cocaine, alcohol, and gambling, as a potential source of addiction.

Working For The Three-Day Weekend

In the distant, early days of Homo sapiens, there was no concept of “work” in the modern sense, and thus there were no holidays, either. Every day involved its many toils, from hunting and gathering to working to find shelter and water and protection against predators.

Then, as civilization developed, designated jobs became an inevitable part of the process. No city could exist without people charged with performing essential functions like laboring in the fields to bring in the crops, delivering food from the countryside, serving as scribe for Pharaoh, or building the new pyramid or ziggurat.  The concept of holidays came later still. First, there were only religious holidays or seasonal holidays, to mark the Feast Day of Set or commemorate the harvest with a day of celebration. In the medieval era, when a saint’s day arrived, the duties of the job were replaced by lengthy religious obligations and, perhaps, fasting and the ritual wearing of a hair shirt.  It wasn’t exactly a laugh riot.

As humanity advanced even more, the concept of a work week was introduced and, then, secular holidays. When some brilliant soul realized that secular holidays really didn’t have to be tied to a specific date on the calendar and instead could float — so that the holiday could combine with a normal weekend to create a three-day weekend — it was a huge step forward in human development. And when an even more enlightened individual realized that we could use those three-day weekends to bookend the summer months, so that the joys of summer could begin with a glorious three-day revel in the warmth, it marked a true pinnacle in the annals of human achievement.

As we celebrate the joys of this three-day Memorial Day weekend, let’s remember those forgotten figures of human history who came up with the ideas that led us here — and be grateful that wearing sweaty hair shirts isn’t part of the equation.

The Impending Dash Clash

The apostrophe battle has been amicably settled.

After some sternly worded exchanges, with many grammarians and wannabe English stylists weighing in, the B.A. Jersey Girl found an authoritative source that was able to bridge the gap between our competing positions and resolve the dispute.  She discovered that Bryan Garner’s Redbook:  A Manual on Legal Style acknowledged that while many style manuals follow the rule that always requires an apostrophe s to indicate a possessive, former journalists follow the Associated Press Style Manual and don’t add an apostrophe s when the word in question ends in s.  In short, both sides have a basis for their opinion, so we shook and decided to leave that issue behind.

Alas, a new punctuation fight looms directly ahead.  The virgin battleground involves something called an “em dash”—this super-long dash that, according to some grammarians, can be used as a substitute for parentheses, can set off appositives that contain commas, and can mark a sudden change in the direction of a sentence, among other uses.  It’s called the “em dash” because the dash is about the same width as a capital M.

I’m all for adding a little dash to writing, but I’m not a fan of the “em dash” because it’s too long and is used without spaces on either side.  I’m a proponent of the dash that is formed with two hyphens and a space on both sides.  I think it looks neater and more orderly, whereas the “em dash” looks like a spear that is impaling the neighboring words.  I’m a fan of space in writing, and the “em dash” makes a sentence look crowded.  I say tap the space bar, give your words space to breathe, and let the “em dash” be damned.
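For the record, anyone who shares that preference can enforce it in bulk with a few lines of Python.  This is just an illustrative sketch (the function name is my own invention); the pattern uses `\u2014`, the Unicode escape for the em dash:

```python
import re

def to_spaced_dashes(text: str) -> str:
    """Replace em dashes with a spaced double-hyphen dash."""
    # \u2014 is the em dash.  Strip any spaces already around it,
    # then reinsert " -- " so the words get room to breathe.
    return re.sub(r"\s*\u2014\s*", " -- ", text)

print(to_spaced_dashes("The em dash\u2014a spear impaling its neighbors\u2014crowds the line."))
# -> The em dash -- a spear impaling its neighbors -- crowds the line.
```

Run it over a draft and every speared pair of words comes apart with a tap of the space bar on each side.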

Old, And Still Working

America’s elderly are working at levels not seen in decades.  Is that a good thing, or a bad thing?

This year, the labor-force participation rate of retirement-age workers — that is, workers aged 65 and older — has cracked the 20 percent mark.  That’s the highest participation rate in 57 years, and twice the low-water mark set in 1985.  Since then, the participation rate has moved steadily upward, with a significant increase in recent years.

The Bloomberg article linked above suggests that many of these working elderly are doing so because they have no choice:  “Rickety social safety nets, inadequate retirement savings plans and sky high health-care costs are all conspiring to make the concept of leaving the workforce something to be more feared than desired.”  But the statistics indicate that at least some of the people who are working longer are doing so by choice, rather than by desperate need.  The share of all employees age 65 or older with at least an undergraduate degree is now 53 percent, up from 25 percent in 1985, and the inflation-adjusted income of those workers has increased to an average of $78,000, 63 percent higher than the $48,000 older folks brought home in 1985.  The increase in wages of the working elderly is better than the increase for workers below 65 during that same time period.
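That 63 percent figure checks out, allowing for rounding.  A quick calculation on the two averages reported above:

```python
# Inflation-adjusted average incomes of workers 65 and older, per the article
old_avg, new_avg = 48_000, 78_000   # 1985 vs. today, in dollars

pct_increase = (new_avg - old_avg) / old_avg * 100
print(f"{pct_increase:.1f}%")  # prints 62.5%, which the article rounds to 63 percent
```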

So why are people working longer than they used to?  To be sure, some may be doing so because they’ve got no choice — America’s retirement savings statistics are dismal.  But if that is the root cause for some significant percentage of the working elderly, why is that a bad thing?  If people haven’t saved enough, working longer to build up a retirement nest egg, and to cut down on the number of years spent living on that nest egg, is just the responsible thing to do.  We shouldn’t feel sorry for them; we should applaud them for recognizing that, when it comes to retirement planning and saving, it’s better late than never.

The more interesting and deeper trend is that the economy is welcoming these older workers and rewarding them with increasing salaries.   In short, it’s not like all of these older workers are serving as friendly, red-vested greeters at Wal-Mart.  The salary statistics indicate that the job creation in the current economy is strong, and that companies are holding on to experienced older workers rather than incentivizing them to retire.  They are recognizing that older workers have value, and still have something to contribute.  If you are an older person who likes working and wants to continue to work, that’s a very encouraging trend.


A Heady Whiff Of Conference Room Air

It’s long been a standing joke that big office meetings — especially those that feature lengthy PowerPoint presentations — do nothing but make everyone in attendance dumber.  Now it looks like (gulp!) there’s some scientific evidence that the jest just might have more than a kernel of truth to it.

Conference room meetings involve two factors that don’t necessarily go well together:  living human beings, and closed spaces.  The human beings breathe in oxygen and exhale carbon dioxide, and the closed spaces prevent the air in the conference room from circulating.  Indeed, modern buildings are a lot more insulated and better at keeping outdoor air outside, and indoor air inside.  That means that, if you’re in a conference room meeting with lots of other people, as time goes on the carbon dioxide generated by the breathing process will accumulate and the percentage of carbon dioxide in the air will increase.

Studies have shown that breathing air with carbon dioxide concentrations that are too high — much higher than you could expect to find at even the longest, most deadly office meeting — can have clear negative effects on the brain.  The impact includes stifled interaction between different regions of the brain, reduced neuronal activity, and dilated blood vessels in the brain.  Now scientists are starting to look at the effects of exposure to air with lower carbon dioxide concentrations, like what you might find in a closed-door meeting in a conference room, and what they’re finding indicates that the old joke just might mirror reality.  The studies show that, as carbon dioxide levels in indoor air increase, human performance on tests designed to measure higher-end intellectual qualities like strategy and initiative declines.
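To get a feel for the numbers, here is a rough back-of-the-envelope model in Python.  The figures are illustrative assumptions, not measurements from any study: a commonly cited exhalation rate of roughly 0.005 liters of CO2 per second per person, a sealed and well-mixed room, and a starting concentration near the outdoor baseline of about 420 ppm:

```python
def co2_after_meeting(people: int, minutes: float, room_m3: float,
                      start_ppm: float = 420.0,
                      gen_l_per_s: float = 0.005) -> float:
    """Estimate CO2 (ppm) in a sealed, well-mixed room after a meeting.

    Assumes zero ventilation, so the concentration rises linearly:
    total liters of CO2 exhaled divided by the room volume in liters.
    """
    exhaled_l = people * gen_l_per_s * minutes * 60       # liters of CO2 added
    room_l = room_m3 * 1000.0                             # room volume in liters
    return start_ppm + (exhaled_l / room_l) * 1_000_000   # fraction -> ppm

# Ten people, one hour, in a 75-cubic-meter conference room:
print(round(co2_after_meeting(10, 60, 75)))  # -> 2820
```

By this crude estimate, a ten-person, one-hour meeting in a sealed room climbs well past 2,000 ppm, squarely in the range where the research discussed above reports measurable declines in performance.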

So what can you do, other than avoiding large-scale meetings?  One answer is to increase the ventilation rate in modern buildings, but that’s not something that most of us can readily control.  Other options are to open a window — if you’re in one of the incredibly rare conference rooms that actually has one — or even a door.  Keeping all-hands meetings as short as possible will help, too.  And there’s always the option we used to urge teachers to adopt on a beautiful spring day — have class outside.

The bottom line is that people who work in office buildings, as many of us do, need to be sensitive to getting outside where the tools of nature — trees, plants and cool breezes — have had a chance to scrub the air and return carbon dioxide levels to normal.  It turns out that getting out of closed cubicles and into the fresh air outside isn’t just good for the soul, it’s good for the brain, too.

Self-Proclaiming “Bad Ass” Status

Can you properly proclaim yourself a “bad ass”?  Or, is “bad ass” status something that can only truly be conferred by others, in recognition of your record and your lifelong body of work?

This compelling issue arose because a friend at work referred to herself as a “bad ass.”  Admittedly, she did it in a carefully phrased, utterly lawyerly way — I think she may have said that she “projects a bad ass persona,” or something similar — but the implication was essentially the same.  And that raises the question of whether self-proclaiming that you are a “bad ass” is really valid.  Can you become a “bad ass” simply by buying a “bad ass” nameplate for your desk?

I think earning true “bad ass” distinction can only come from your recognition as such by third parties, and not by a personal declaration.  A “bad ass” is defined as someone who is tough, intimidating, and uncompromising.  (And wouldn’t you like to know, incidentally, how the phrase came to have that meaning, and when it was first used in that context?)  Being a “bad ass” therefore is a quality, like being deemed “smart,” that is difficult for individuals to self-assess, in part because it involves some comparison of your qualities to others.  And true personal evaluation isn’t easy for people.

Samwell Tarly, for example, desperately wants to be seen as a tough guy, and he’ll remind anyone within earshot that he once killed a white walker — but despite poor Sam’s pleading, no one is going to call him a “bad ass.”  Arya Stark, on the other hand, is recognized by one and all as a “bad ass,” without really even trying.  She has that quality and everyone knows it, and Sam doesn’t.  Indeed, the Urban Dictionary website says that the first rule of actually being a “bad ass” is that you don’t talk about being a “bad ass.”

I therefore question whether self-proclaiming that you are a “bad ass” really works.  However, I acknowledge that my friend is indeed tough and uncompromising — so I hereby declare that I consider her to be a “bad ass,” thereby conferring upon her official “bad ass” status.  Now I just need to find one of those nameplates for her desk.

Apostrophe Wars

The other day we were putting the finishing touches on a brief when an apostrophe argument arose.  We needed to indicate the possessive for an individual whose last name ended in s.  So, the question was, should it be “Mr. Jones’ car” or “Mr. Jones’s car”?

I always use the former construction, but the Jersey Girl was adamant that the second construction is the only permissible approach.  As is so often the case with grammar matters, the dispute became heated, passionate positions were staked out on both sides, voices were raised, and the Soccer Star, another member of the team on the case, heard the argument and came from a nearby office to enter the fray.  From there, the dispute escalated quickly, and if it had continued one of the participants probably would have been seen galloping away from the area with a trident lodged in his or her back.  But, because we needed to get a draft out the door, I yielded to the Jersey Girl’s resolute insistence that we must go with “Mr. Jones’s car,” and permanent injury was avoided.

Many people don’t really care about grammar, but for those who do, correct usage is a very important issue.  And one reason the question of precisely how to show that the car belongs to Mr. Jones is such a point of dispute is that there is no universally recognized right answer.  Some authorities take the position that, whenever a possessive is formed from a word ending in “s,” an “apostrophe s” must be added; others say that only an apostrophe should be used; and still others acknowledge that there is no correct answer and the key thing is to be consistent.
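For the programmers in the audience, the two camps can be captured in a few lines of Python.  The style labels are just my shorthand for the two conventions: always adding “apostrophe s,” versus the journalist’s bare apostrophe after a final s:

```python
def possessive(name: str, style: str = "always_s") -> str:
    """Form the possessive of a name under two rival conventions.

    "always_s":      always append "'s", even after a final s (Mr. Jones's).
    "ap":            append only an apostrophe when the name ends in s (Mr. Jones').
    """
    if style == "ap" and name.endswith("s"):
        return name + "'"
    return name + "'s"

print(possessive("Mr. Jones", "always_s"))  # -> Mr. Jones's
print(possessive("Mr. Jones", "ap"))        # -> Mr. Jones'
print(possessive("Ms. Smith", "ap"))        # -> Ms. Smith's
```

Note that the two styles only diverge when the name ends in s, which is exactly why the fight never broke out until Mr. Jones’s car rolled into the brief.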

I prefer the use of the apostrophe only in this situation, because I think “Mr. Jones’s car” looks clunky.  In addition, when I read and write I admittedly tend to sound things out in my head, and the Jersey Girl’s approach with its multiple back-to-back sibilants leaves me hissing like a snake.

Still, it was interesting to see how much people can care about grammar.  And there’s nothing like a good grammar fight to get the tridents flying!