“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods when multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.

Drawing An Unscientific Maggot Line

I have a high regard for scientists . . . generally.  But sometimes scientists don’t exactly have a solid appreciation of the sensibilities of normal human beings.

Consider, for example, this report on the work of scientists at the University of Queensland in Brisbane, Australia.  They conclude that, given the world’s growing population, humans need to start turning to alternative sources of protein besides animal meat.  The article linked above quotes “meat science professor Dr. Louwrens Hoffman” — apparently “meat science” is a discipline that has been developed since I was in college, because otherwise that would have been a pretty darned tempting major — as saying:  “An overpopulated world is going to struggle to find enough protein unless people are willing to open their minds, and stomachs, to a much broader notion of food.”

So far, so good.  But Dr. Hoffman and his team at the University of Queensland are looking to replace beef and chicken and pork with — gulp! — maggots and locusts.  They reason that the world’s insect population is a far more sustainable source of supply for such protein.  They also recognize that most people rebel at the notion of consuming chitinous locusts or squirmy maggots, so they are working on developing “prepared foods” that include locusts and maggots as disguised ingredients.  So far, they’ve worked on a maggot sausage with promising results, and Dr. Hoffman swears that a student has developed an insect ice cream that is “very tasty.”  Who knows?  Soon you may be able to have an ice cream cone with a scoop of vanilla and a scoop of “insect.”

According to the article, there are already some insect-based products available in the U.S., such as Chirps chips and Chapul protein bars.  I haven’t had any of these items, and I haven’t noticed them flying off the shelves at the neighborhood grocery store, either.

There’s a basic repulsion issue involved in eating maggots.  With a nod to the French government defense strategy before World War II, you might call it The Maggot Line, and science-based arguments aren’t going to cross it.  I think the issue with insect-based foods is whether ingredient lists on food packaging are required to accurately and clearly disclose the insect element.  If maggots can be called by their scientific names — which are Lucilia sericata and Phaenicia sericata — and jumbled in with the other scientific-sounding ingredients for prepared foods, like sodium benzoate and monosodium glutamate, then maggot sausage might stand a chance.  But if the packaging has to use plain English and disclose maggots as an ingredient, forget it.

A Heady Whiff Of Conference Room Air

It’s long been a standing joke that big office meetings — especially those that feature lengthy PowerPoint presentations — do nothing but make everyone in attendance dumber.  Now it looks like (gulp!) there’s some scientific evidence that the jest just might have more than a kernel of truth to it.

Conference room meetings involve two factors that don’t necessarily go well together:  living human beings, and closed spaces.  The human beings breathe in oxygen and exhale carbon dioxide, and the closed spaces prevent the air in the conference room from circulating.  Indeed, modern buildings are a lot more insulated than they used to be, and better at keeping outdoor air outside and indoor air inside.  That means that, if you’re in a conference room meeting with lots of other people, as time goes on the carbon dioxide generated by the breathing process will accumulate and the percentage of carbon dioxide in the air will increase.
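To get a feel for how quickly that buildup can happen, here is a rough back-of-the-envelope sketch in Python.  Every number in it (room size, head count, exhalation rate, outdoor baseline) is an assumed ballpark figure chosen for illustration, not a value from any of the studies discussed below, and real conference rooms leak enough air that actual levels climb more slowly.

```python
# Rough, back-of-the-envelope estimate of CO2 buildup in an unventilated conference room.
# Every input below is an assumed ballpark figure, not a value from any particular study.

room_volume_m3 = 60.0            # assumed: a mid-sized conference room
occupants = 10                   # assumed: number of people in the meeting
co2_per_person_m3_per_hr = 0.02  # assumed: roughly 20 liters of CO2 exhaled per person per hour
outdoor_co2_ppm = 420.0          # assumed: typical outdoor baseline concentration

def co2_after(hours):
    """Estimated CO2 concentration (ppm) after the given number of hours with no ventilation."""
    added_m3 = occupants * co2_per_person_m3_per_hr * hours
    return outdoor_co2_ppm + (added_m3 / room_volume_m3) * 1_000_000

for h in (0.5, 1, 2):
    print(f"After {h} hours: about {co2_after(h):,.0f} ppm")
# Prints roughly 2,100 ppm at the half-hour mark, 3,800 ppm after one hour,
# and 7,100 ppm after a two-hour meeting, all well above the outdoor baseline.
```

Even with these loose assumptions, the takeaway is the same:  a closed room full of people doesn’t take long to drift well past outdoor carbon dioxide levels.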

Studies have shown that breathing air with carbon dioxide concentrations that are too high — much higher than you could expect to find at even the longest, most deadly office meeting — can have clear negative effects on the brain.  The effects include suppressed interaction between different regions of the brain, reduced neuronal activity, and dilated blood vessels in the brain.  Now, scientists are starting to look at the effects of exposure to air with lower carbon dioxide concentrations, like what you might find in a closed-door meeting in a conference room, and what they’re finding indicates that the old joke just might mirror reality.  The studies are showing that, as the carbon dioxide levels in indoor air increase, human performance on tests designed to measure higher-end intellectual qualities like strategy and initiative declines.

So what can you do, other than avoiding large-scale meetings?  One answer is to increase the ventilation rate in modern buildings, but that’s not something that most of us can readily control.  Other options are to open a window — if you’re in one of the incredibly rare conference rooms that actually has one — or even a door.  Keeping all-hands meetings as short as possible will help, too.  And there’s always the option we used to urge teachers to adopt on a beautiful spring day — have class outside.

The bottom line is that people who work in office buildings, as many of us do, need to be sensitive to getting outside where the tools of nature — trees, plants and cool breezes — have had a chance to scrub the air and return carbon dioxide levels to normal.  It turns out that getting out of closed cubicles and into the fresh air outside isn’t just good for the soul, it’s good for the brain, too.

What A Difference A Night Makes

Recently I’ve been having some irregular sleep patterns.  I’ll go to bed and fall asleep promptly, but then wake up only a few hours later, with heart pumping and mind racing. When that happens, it’s hard to fall back into the REM cycle quickly, and I’ll inevitably toss and turn for as much as an hour, fretting all the while that I’m losing out on sleep that I need and will never make up.

But last night I fell asleep as soon as my head hit the pillow, slept through the night without any nocturnal wakefulness, and arose feeling refreshed.  When I went down to make the morning coffee the birds were chirping, I unloaded the dishwasher with a happy feeling, and the coffee tasted richer and better than ever.

There’s no doubt that sleep is therapeutic on multiple fronts.  The National Institutes of Health reports that, physically, the changes in breathing, heart rate, and blood pressure that occur during a good night’s sleep help to promote cardiovascular health, and while you sleep hormones are released that repair cells and control your body’s use of energy.  And although the physical aspects of sleep are significant, the mental aspects are even more important.  Getting your 7 or 8 hours of sound sleep enhances mood, alertness, intellectual functioning, and reflexes, while chronic sleep deprivation can lead to depression and anxiety disorders.

Knowing all of this, why doesn’t the human brain always do what is necessary to allow everyone to get their share of shuteye?  Unfortunately, things don’t work that way.  Stresses and concerns at work and at home can interfere with the sleep cycle, and then the lack of sleep and the irritability it produces can have a compounding effect on those stresses and concerns.

That’s one of the reasons why getting a solid night of slumber time after a few nights of anxious restlessness feels so good.  You may not be making up for lost sleep, but it’s comforting to know that your mind and body are back to their normal cycles — at least, until the next round of stresses and concerns hit.

Living In The Matrix

I thought The Matrix was a terrific movie.  I liked the sequel, too.  (The last film in the trilogy, eh, not so much.)

But I had no idea that reputable scientists were seriously considering the central premise of The Matrix — that what we think of as the real world is in fact a huge computer simulation run by machines and designed and policed to enslave humanity.  In fact, a scientist named Rizwan Virk has written a book, entitled The Simulation Hypothesis, about that possibility.

The Matrix concept is gaining traction for several reasons.  One is that computer technology, and games-playing technology, apparently is developing to the point where sophisticated multi-player, on-line games are routine and it’s becoming harder and harder to distinguish reality from simulation.  (I say “apparently” because I’m not a gamer — that is, unless I’m really trapped in a computer simulation and playing, unwittingly, just by living my life.)  If our technology is developing in that direction, the argument goes, isn’t it possible that we are living in a more advanced simulation created by a more advanced computer system developed by a more advanced civilization?

And there’s also a weird statistical argument for the simulation hypothesis that goes like this:  once a civilization creates computers that are powerful enough to create plausible simulations for millions or billions of players, it’s comparatively easy to create entirely new, realistic settings for entirely new simulated players that are all artificial intelligence.  Crossing that technological-capability threshold means that trillions of AI creations could be living in games — making it statistically likely that you’re an AI creation rather than a flesh-and-blood being.
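To see how the arithmetic behind that claim works, here is a minimal sketch.  Both head counts are invented purely for illustration; they are not estimates from Virk’s book.

```python
# Toy illustration of the simulation argument's arithmetic.
# Both counts below are invented for illustration, not estimates from The Simulation Hypothesis.

biological_minds = 8e9    # assumed: roughly today's human population
simulated_minds = 1e12    # assumed: "trillions" of AI players spread across many simulations

# If you could equally well be any one of these minds, the odds that
# you happen to be one of the biological ones are vanishingly small.
p_biological = biological_minds / (biological_minds + simulated_minds)
print(f"Chance of being flesh-and-blood: {p_biological:.2%}")  # about 0.79%
```

The whole argument hinges on whether simulated minds really would outnumber biological ones by that kind of margin, which is exactly the technological-capability threshold described above.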

And here’s an even weirder concept:  if we’re all players in a video game, maybe our scores are being kept somewhere for some purpose that we don’t quite know yet, and won’t know until our own experience in the simulation ends.  It would help to know the rules of the game, wouldn’t it?

Are we living in a simulation?  I don’t see how you can prove or disprove that, from our perspective as potential players in an ultra-advanced game created by an ancient alien civilization.  But I do know this:  if that is our reality, I’m glad the programmers have finally allowed the weather to warm up a bit.

The Day The Dinosaurs Died

You’ve probably read about how a massive asteroid strike ended the era of the dinosaurs and caused their ultimate mass extinction.  The geological evidence indicates that, 66 million years ago, the asteroid hit on the Yucatan peninsula of modern Mexico and produced massive earthquakes, volcanic eruptions, tidal waves, and forest fires.  The strike threw up a dense plume of dust and debris that turned the world dark and wiped out 99 percent of life on Earth.  Thanks to that asteroid strike, the Cretaceous period ended with a bang and the way was clear for mammals — and human beings — to take the dinosaurs’ place at the top of the food chain.

What was it like on the day, 66 million years ago, when the asteroid struck the Earth with such terrible force?  Robert DePalma, a doctoral student at the University of Kansas, has found compelling evidence of what happened on that momentous day, and this week he published his findings in the journal Proceedings of the National Academy of Sciences.  In 2012, looking at a site called Tanis, in the Hell Creek geological formation in North Dakota, DePalma found layers of perfectly preserved animal and fish fossils at the precise boundary between the Cretaceous period and the Tertiary period that followed it — the very day when the asteroid struck the Yucatan.

The geological evidence shows that the asteroid strike created a magnitude 10 or 11 earthquake that generated seismic waves that reached out thousands of miles.  In prehistoric North Dakota, which like much of the North American continent was covered by an inland sea, the seismic waves produced a water surge that threw fish onto shores to suffocate — producing the layers of fish and animals that DePalma found.  At the same time, molten material was hurled into the atmosphere.  In the geological formation, DePalma found bone, teeth, and hatchling remains of many dinosaur groups, including an intact dinosaur egg complete with embryo — indicating that the dinosaurs survived that fateful day, although their ultimate day of reckoning was coming.

In an article in the New Yorker, DePalma describes his find as “like finding the Holy Grail clutched in the bony fingers of Jimmy Hoffa, sitting on top of the Lost Ark.”  Thanks to him, we now know a lot more about the day that the ground buckled and snapped, the waters surged, the skies were lit with fire, and the world changed forever.

Studying Stonehenge

When I took a trip to England right after I graduated from college, one of the coolest places I visited was Stonehenge.  There was a strong air of ancient mystery lurking among the massive stones arranged in a circle on Salisbury Plain.  You couldn’t help but walk among them and wonder where the enormous stones came from, who put them there, how in the world they got there — and what their mysterious purpose actually was.

Now scientists have answered the first question, at least in part:  many of the smaller stones at the Stonehenge site came from ancient quarries in the Preseli Hills of Wales, and they were consciously mined and taken to Stonehenge, not deposited on Salisbury Plain by glaciers.  Scientists used tools that allowed them to test the chemical composition of rocks in the quarry and match it to the composition of the rocks at Stonehenge.  The tests are so precise that scientists were able to determine that the Stonehenge stones came from quarries in the northern part of the hills rather than the southern part — a finding that is significant, because it means that the stones were probably transported to Salisbury Plain over land, rather than floated there on rivers.  The scientists also found mining tools at the quarries that date back to 3000 B.C., when the first stage of Stonehenge was built.

So now we know that, 5000 years ago, human beings mined large stones from Wales and then somehow dragged them 150 miles away, where they were arranged in circles that seem to be related in some way to the summer solstice.  But we don’t know why ancient humans would undertake such an enormous task, or how they accomplished it.  Unless someone invents a time machine, the answers to those questions probably will forever remain an unsolvable mystery — which is one reason why Stonehenge is so cool.