Breaking The Bad News

On the TV show House, House’s oncologist pal Wilson was reputed to be so humane and caring when giving patients bad news about their condition that, when he was done, patients actually thanked him.  Studies indicate, however, that there aren’t a lot of Wilsons out there in the medical profession.  Instead, many doctors botch one of the most important parts of their job — giving patients truthful information about their medical condition when the diagnosis is grim.

Telling patients that they have untreatable cancer, or some other fatal disease, clearly is one of the toughest parts of a doctor’s job — and research indicates that doctors just aren’t very good at it.  Some doctors will break the bad news indirectly or use medical jargon that leaves the patient confused, others will do it with brutal directness, and still others will sugarcoat the news with treatment options.  As a result, many cancer patients aren’t well informed about their actual condition, and their prospects.  A 2016 study found that only five percent of cancer patients understood their prognoses well enough to make informed decisions about their care.

Why are doctors so inept at giving patients bad news about their condition?  Of course, it’s incredibly hard to be the bearer of bad tidings, especially when the bad news is about a fatal illness, but there’s more to it than that.  Communication skills apparently aren’t emphasized at medical schools, and many doctors see a diagnosis of an incurable disease as a kind of personal failure on their part.

It’s interesting that, in a profession so associated with the phrase “bedside manner,” so many doctors regularly mishandle what is arguably the most important part of their job and so few medical schools make sure that their graduates are equipped to handle that task in a genuine, caring, and understandable way.  I hope I never receive a devastating diagnosis, but if I do I hope it comes from a doctor who knows how to break the bad news.

Travel Guilt

If you’ve got a big trip planned for this year, should you cancel it?  Should you refrain from traveling at all, because of the impact that your share of carbon emissions from the plane flight may be having on Arctic sea ice, or rising sea levels?

That’s the question posed by a curious New York Times article earlier this week.  The author wrings his hands about the issue, caught between a desire to broaden his horizons by seeing the world and his professed guilt that his travel interests are selfish and evil because they may be affecting global climate change.  After quoting lots of statistics about the potential impact of one person’s activities, and envisioning being glared at by a hungry polar bear while pondering his contribution toward disappearing Arctic ice, the author notes that he’s still going to take a trip to Greece and Paris, but only after he’s purchased enough “carbon offsets” to “capture the annual methane emanations of a dozen cows.”

The Times article notes that, in 2016, two climatologists published a paper concluding that there is a direct relationship between carbon emissions and the melting of Arctic sea ice, and that “each additional metric ton of carbon dioxide or its equivalent — your share of the emissions on a cross-country flight one-way from New York to Los Angeles — shrinks the summer sea ice cover by 3 square meters, or 32 square feet.”  Taking a cruise isn’t the answer, either; the article says that cruise ships produce three or four times as much pollution as jets.  Even worse, the article states that just by being an average American we’re harming and even killing fellow human beings, and quotes a determination somehow made by a University of Tennessee professor, who concluded:  “The average American causes through his/her greenhouse gas emissions the serious suffering and/or deaths of two future people.”
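
To make the quoted figure concrete, here is a back-of-the-envelope sketch in Python.  The 3-square-meters-per-ton rate is taken from the quote above; treating one passenger’s share of a one-way cross-country flight as roughly one metric ton of CO2-equivalent follows the article’s own framing, and everything else is simple unit conversion.

```python
# Back-of-the-envelope sketch of the sea-ice figure quoted above.
# Assumption (from the article's framing, used only for illustration): one
# passenger's share of a one-way New York to Los Angeles flight is roughly
# one metric ton of CO2-equivalent.
ICE_LOSS_M2_PER_TON = 3.0   # square meters of summer sea ice lost per metric ton of CO2
SQFT_PER_M2 = 10.7639       # square feet in one square meter


def ice_loss(tons_co2: float) -> tuple[float, float]:
    """Return the estimated summer sea-ice loss in (square meters, square feet)."""
    m2 = tons_co2 * ICE_LOSS_M2_PER_TON
    return m2, m2 * SQFT_PER_M2


# One passenger's share of a one-way cross-country flight (about 1 ton):
print(ice_loss(1.0))  # -> (3.0, 32.29...), i.e. the "32 square feet" in the quote
```

Of course, whether that per-ton relationship can really be isolated from everything else acting on the climate is exactly the question raised below.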

So, should we just stay huddled in our houses with the lights turned off, so as to minimize our personal contribution to potential global catastrophe?  I won’t be doing that.  I like leisure travel, and unlike the Times writer, I’m not wracked with guilt about it.  I’m quite skeptical of any calculation that purports to isolate the activity of an “average American” and find that it has a direct, measurable impact on climate, given all of the huge, overarching factors, such as sunspot cycles, solar flares, ocean currents, and wind systems, that can affect the Earth’s climate.  Science has endured a lot of black eyes lately, with research and calculations shown to be inaccurate and, in some instances, politically motivated, and I’m just not willing to accept unquestioningly that going to visit my sister-in-law in California will melt 32 square feet of Arctic sea ice.  I also question how the activities of an “average American” are calculated, or how a walk-to-work person like me compares to the carbon footprint of the “average.”

So, I guess you can call me selfish, because I do want to see more of the world and experience the wonders of faraway places.  But don’t just ask me — ask the places that travelers visit whether they’d rather not receive the infusions of cash, and the jobs created, that come from being a tourist destination.  If we’re going to be doing impossibly complex calculations of benefits and harm, how about throwing the economic and cultural benefits that flow from travel into the equation?

“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.

Grading The “Experts”

In our modern world, we’re bombarded with the opinions of “experts.”  Virtually every news story about a development or an incident features a quote from an “expert” who interprets the matter for us and, typically, makes a prediction about what will happen.  “Experts” freely offer their forecasts on specific things — like the contents and results of the Mueller Report — and on big-picture things, like the direction of the economy or geopolitical trends.

There are so many “experts” giving so many predictions about so many things that it’s reasonable to wonder whether anyone is paying attention to how good the “experts” ultimately turn out to be at making their predictions.

The Atlantic has a fascinating article about this topic that concludes that so-called “experts” are, in fact, dismally bad at predicting the future.  That’s not a surprising conclusion for those of us who’ve been alive and paying attention, and who recall some of the confident forecasts of days gone by.  Whether it’s the “population bomb” forecasts noted in The Atlantic article, or the predictions in the ’80s that Japan would soon own the world, or the prognostications about how elections will end up or whether one party or another has that elusive “permanent majority,” recent history is littered with failed expert predictions.

Why are would-be “experts” so bad at their predictions?  The article notes that academics and others who focus on one field tend to be especially wrong in their foretelling because they typically ignore other forces at work.  They also are often so invested in their specialty, and their belief in their own evaluations, that they react to failure by doubling down on their predictions — like doomsday cult leaders who tweak their calculations after a deadline has passed to come up with a new day the world will end.  People who are less invested in the belief in their own infallibility, and who are less focused on one discipline or area of study, tend to be much better at making predictions about the future than the “experts.”

Does the consistent thread of “expert” predictive failure mean that we shouldn’t try to see ahead at what the future may bring?  Of course not.  But it does mean that we should take the dire forecasts of “experts” with a healthy dose of skepticism.  Keep that in mind the next time a talking head says we need to make some dramatic change in order to avoid certain doom.

Drawing An Unscientific Maggot Line

I have a high regard for scientists . . . generally.  But sometimes scientists don’t exactly have a solid appreciation of the sensibilities of normal human beings.

Consider, for example, this report on the work of scientists at the University of Queensland in Brisbane, Australia.  They conclude that, given the size of the world’s population, humans need to start turning to alternative sources of protein besides animal meat.  The article linked above quotes “meat science professor Dr. Louwrens Hoffman” — apparently “meat science” is a discipline that has been developed since I was in college, because otherwise that would have been a pretty darned tempting major — as saying:  “An overpopulated world is going to struggle to find enough protein unless people are willing to open their minds, and stomachs, to a much broader notion of food.”

So far, so good.  But Dr. Hoffman and his team at the University of Queensland are looking to replace beef and chicken and pork with — gulp! — maggots and locusts.  They reason that the world’s insect population is a far more sustainable source of such protein.  They also recognize that most people rebel at the notion of consuming chitinous locusts or squirmy maggots, so they are working on developing “prepared foods” that include locusts and maggots as disguised ingredients.  So far, they’ve worked on a maggot sausage with promising results, and Dr. Hoffman swears that a student has developed an insect ice cream that is “very tasty.”  Who knows?  Soon you may be able to have an ice cream cone with a scoop of vanilla and a scoop of “insect.”

According to the article, there are already some insect-based products available in the U.S., such as Chirps chips and Chapul protein bars.  I haven’t had any of these items, and I haven’t noticed them flying off the shelves at the neighborhood grocery store, either.

There’s a basic repulsion issue involved in eating maggots.  With a nod to the French government’s defense strategy before World War II, you might call it The Maggot Line, and science-based arguments aren’t going to cross it.  I think the issue with insect-based foods is whether ingredient lists on food packaging are required to accurately and clearly disclose the insect element.  If maggots can be called by their scientific names — which are Lucilia sericata and Phaenicia sericata — and jumbled in with the other scientific-sounding ingredients in prepared foods, like sodium benzoate and monosodium glutamate, then maggot sausage might stand a chance.  But if the packaging has to use plain English and disclose maggots as an ingredient, forget it.

A Heady Whiff Of Conference Room Air

It’s long been a standing joke that big office meetings — especially those that feature lengthy PowerPoint presentations — do nothing but make everyone in attendance dumber.  Now it looks like (gulp!) there’s some scientific evidence that the jest just might have more than a kernel of truth to it.

Conference room meetings involve two factors that don’t necessarily go well together:  living human beings, and closed spaces.  The human beings breathe in oxygen and exhale carbon dioxide, and the closed spaces prevent the air in the conference room from circulating.  Indeed, modern buildings are a lot better insulated than their predecessors, and better at keeping outdoor air outside and indoor air inside.  That means that, if you’re in a conference room meeting with lots of other people, as time goes on the carbon dioxide generated by the breathing process will accumulate and the percentage of carbon dioxide in the air will increase.

Studies have shown that breathing air with carbon dioxide concentrations that are too high — much higher than you could expect to find at even the longest, most deadly office meeting — can have clear negative effects on the brain.  The effects include stifled interaction between different regions of the brain, reduced neuronal activity, and dilated blood vessels in the brain.  Now, scientists are starting to look at the effects of exposure to air with lower carbon dioxide concentrations, like what you might find in a closed-door meeting in a conference room, and what they’re finding indicates that the old joke just might mirror reality.  The studies are showing that, as the carbon dioxide levels in indoor air increase, human performance on tests designed to measure higher-end intellectual qualities, like strategy and initiative, declines.
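
For a sense of how quickly a closed room can get there, here is a rough, unventilated back-of-the-envelope sketch in Python.  The per-person CO2 generation rate, room size, headcount, and starting concentration are all illustrative assumptions chosen for the example, not numbers taken from the studies.

```python
# Rough sketch: how fast carbon dioxide can build up in a sealed conference room.
# Every number below is an illustrative assumption, not a figure from the studies:
#   - a seated adult exhales roughly 0.005 liters of CO2 per second
#   - the meeting starts at a typical outdoor level of about 400 ppm
#   - there is no ventilation at all (the worst case for a closed-door meeting)


def co2_ppm(people: int, room_volume_m3: float, minutes: float,
            start_ppm: float = 400.0, co2_per_person_lps: float = 0.005) -> float:
    """Estimate the CO2 concentration (in ppm) at the end of the meeting."""
    liters_exhaled = people * co2_per_person_lps * minutes * 60.0
    room_volume_liters = room_volume_m3 * 1000.0
    return start_ppm + (liters_exhaled / room_volume_liters) * 1_000_000.0


# Ten people in a 75 cubic-meter room (roughly 5 m x 6 m x 2.5 m) for one hour:
print(round(co2_ppm(people=10, room_volume_m3=75.0, minutes=60.0)))  # about 2800 ppm
```

Real rooms leak at least some air, but even allowing for that, an hour-long all-hands meeting behind a closed door can plausibly push concentrations well above outdoor levels, which is the range the newer studies are examining.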

So what can you do, other than avoiding large-scale meetings?  One answer is to increase the ventilation rate in modern buildings, but that’s not something that most of us can readily control.  Other options are to open a window — if you’re in one of the incredibly rare conference rooms that actually has one — or even a door.  Keeping all-hands meetings as short as possible will help, too.  And there’s always the option we used to urge teachers to adopt on a beautiful spring day — have class outside.

The bottom line is that people who work in office buildings, as many of us do, need to be sensitive to getting outside where the tools of nature — trees, plants and cool breezes — have had a chance to scrub the air and return carbon dioxide levels to normal.  It turns out that getting out of closed cubicles and into the fresh air outside isn’t just good for the soul, it’s good for the brain, too.

What A Difference A Night Makes

Recently I’ve been having some irregular sleep patterns.  I’ll go to bed and fall asleep promptly, but then wake up only a few hours later, with heart pumping and mind racing. When that happens, it’s hard to fall back into the REM cycle quickly, and I’ll inevitably toss and turn for as much as an hour, fretting all the while that I’m losing out on sleep that I need and will never make up.

But last night I fell asleep as soon as my head hit the pillow, slept through the night without any nocturnal wakefulness, and arose feeling refreshed.  When I went down to make the morning coffee the birds were chirping, I unloaded the dishwasher with a happy feeling, and the coffee tasted richer and better than ever.

There’s no doubt that sleep is therapeutic on multiple fronts.  The National Institutes of Health reports that, physically, the changes in breathing, heart rate, and blood pressure that occur during a good night’s sleep help to promote cardiovascular health, and while you sleep hormones are released that repair cells and control your body’s use of energy.  And although the physical aspects of sleep are significant, the mental aspects are even more important.  Getting your 7 or 8 hours of sound sleep enhances mood, alertness, intellectual functioning, and reflexes, while chronic sleep deprivation can lead to depression and anxiety disorders.

Knowing all of this, why doesn’t the human brain always do what is necessary to allow everyone to get their share of shuteye?  Unfortunately, things don’t work that way:  stresses and concerns at work and at home can interfere with the sleep cycle, and then the lack of sleep and the irritability it produces can have a compounding effect on those stresses and concerns.

That’s one of the reasons why getting a solid night of slumber after a few nights of anxious restlessness feels so good.  You may not be making up for lost sleep, but it’s comforting to know that your mind and body are back to their normal cycles — at least, until the next round of stresses and concerns hits.