Apollo’s Lasting Legacy

As we commemorate the 50th anniversary of the Apollo 11 Moon landing — which happened on July 20, 1969 — we’ve seen a lot of interesting articles about the Apollo program and NASA’s lunar missions, including a fascinating Smithsonian article about Apollo 11 specifically.  Popular Mechanics has also reprinted a 25-year-old interview with Buzz Aldrin about why he went to the Moon, and why he thinks we should go back.

One of the most intriguing pieces I’ve seen was a UPI article that sought to identify products and technologies that can be attributed to the Apollo program and that are still in use today.  (That means that “Space Food Sticks,” an awful-tasting product from my youth that quickly went out of production, doesn’t qualify.)  The UPI writer found that Apollo’s legacy goes beyond Tang, Velcro, and computer chips.  Products such as the “Dustbuster” hand-held vacuum cleaner, high-performance athletic shoes, communications headsets, credit card swiping machines, and even the “memory foam” in your mattress all trace their roots back to developments that occurred during the Apollo program.

These technological advances are important, of course, and show what can happen when you hire a bunch of really smart, creative, highly motivated engineers and problem-solvers, give them a mission and adequate funding, and establish a meaningful deadline to achieve the goal.  Technological developments are a pretty predictable result of such an effort, which is one reason why I think the United States should end the decades-long drought in manned missions beyond Earth orbit and get back into the manned space arena in a significant way — whether through government programs, through partnerships with the private companies that are focused on space, or through some other creative means.

But new technology and techniques are not, perhaps, the best reason to go back into space.  For those of us who grew up during the ’60s space program days, and dreamed about being an astronaut like the courageous adventurers of our youth, there will always be a part of our make-up that is interested in space, and science, and the stars.  Perhaps it would be impossible to fully recreate the conditions that made the early astronauts celebrity-heroes in those innocent days, but wouldn’t it nevertheless be valuable to give the current generation of young people role models who are smart, well-educated, selfless, and brave, and encourage those young people to dream about discovery and scientific advancement?

The technological legacy of the Apollo program is impressive, but I think the real legacy is aspirational — something that touched us deeply and leaves even 60-somethings like me still keenly interested in space and hoping that one day, perhaps, I’ll follow in Neil Armstrong’s footsteps and be able to visit the Moon.  The real legacy tells you something about the power of a dream.  We should give the children of today, and tomorrow, the chance to experience such dreams again.

Selfie Psychosis

We are learning more and more about people who have a “selfie” obsession.  We know that people taking selfies are at greater risk of having serious, and even fatal, accidents because they are oblivious to their surroundings while they are taking pictures of themselves on streets or, say, at the edge of the Grand Canyon.  We’ve also seen evidence that people who take selfies are so self-absorbed that they don’t show the decency and sensitivity you typically would expect from a fellow human being.

Now new research is indicating what seems like a pretty obvious conclusion:  people who take selfies are more likely to undergo plastic surgery.  The connection is even stronger if the selfies are taken with filters, or if the posters regularly take down selfie postings that they later conclude aren’t very flattering.  Cosmetic surgeons are reporting that members of the selfie crowd are coming to their offices with selfies in which their features have been digitally altered and asking the doctors to change their appearance to match the altered image.

It shouldn’t come as a surprise, I suppose, that people who take selfies are narcissistic and are interested in changing their appearance to try to reach their own definition of personal perfection.  After all, if you spend your time constantly looking at your own pouting face, you’re bound to notice a few imperfections to be cleaned up.  The selfie-obsessed also tend to compare their selfies with the countless other selfies that appear on social media feeds and find their looks wanting.

As one of the plastic surgeons quoted in the article linked above notes, that’s not healthy behavior.  It’s the kind of behavior that those of us who don’t take selfies, and indeed don’t particularly like to have our photos taken at all, just can’t understand.

But we’ll have to, because the selfie epidemic seems to be getting worse, not better.  Researchers estimate that 650 million selfies are posted every day on social media.  That’s a lot of potential plastic surgery.

Why Opposable Thumbs Exist

Why do opposable thumbs exist in humans and other primates?  Scientists generally agree that the appearance of the opposable thumb was a key evolutionary point in the development of our species.  It is what allowed primates to grip and climb and move into the trees, away from the realm of large predators looking for a meal.  Opposable thumbs also proved to be pretty handy from a toolmaking and tool-using perspective, whether the tool was a stick to be manipulated or a rudimentary axe.

All of this is true.  Curiously, however, scientists haven’t fully explored whether the opposable thumb developed in anticipation that modern humans who are too cheap to buy a nozzle for their garden hose might need the thumb to water their yards and plants on a beastly hot summer day.  Sure, the opposable thumb might not have evolved specifically for watering and hose-wielding, but it sure works well for that purpose — whether you want to generate a gentle sprinkle or a high-velocity jet to reach the side of the yard beyond the length of the hose.

How do we know for sure that our distant ancestors weren’t big on watering?

Grip Evolution

Here’s another story to add to the slew of news articles about general health trends:  human beings, on average, are getting weaker.  In this case, the indicator is grip strength — that is, how much holding and squeezing force a person can generate with just the fingers of their hand.  Recent studies have indicated that grip strength has declined significantly, even in the last 30 years.

So what, you might ask?  You’re less likely to encounter the guys who give you a bone-crushing handshake, and you don’t see people walking around flexing those hand exercisers anymore.  What’s the big deal?  The big deal is this:  grip strength is one of those telltale health indicators lurking in the human body, with lower grip strength associated with increased mortality from all causes, and cardiovascular mortality in particular.  And, especially for those of us who are getting up there, grip strength is a key indicator of sarcopenia, the loss of muscle that occurs as we age, and may also signal issues with cognitive performance.

Why is grip strength declining?  Of course, gripping is a key part of the evolution of Homo sapiens — whose distant ancestors needed a strong grip when they were swinging through trees, and whose more recent predecessors used their hands to create and then wield the tools and weapons that allowed them to survive predators and gather food.  In short, humans needed that strong grip to make it through the natural selection melee and emerge at the top of the evolutionary pyramid.  But in recent years, the need for hand strength at home or on the job has declined.  White collar workers need hand dexterity as they tap away at computers, not hand strength, and even blue collar workers now use power tools that don’t demand the kind of personal strength that the hand wrenches of the past, for example, required.  Mix those factors in with a general decline in fitness and an increase in obesity, and you’ve gone a long way toward explaining why human beings increasingly are becoming a bunch of unhealthy softies.

In short, as a species humans may be losing their grip.  It’s not a positive development.

Foodie Calls

Two recent surveys have identified what is being depicted as a “new trend” on the dating scene:  the “foodie call.”  It happens when one person goes out with another person they really aren’t that interested in — just to get a free meal.

The two surveys of heterosexual women were conducted by Azusa Pacific University and the University of California-Merced, and the results were published in the journal of the Society for Personality and Social Psychology.  The participants were asked questions about their personalities, their views on gender roles, and their views on, and personal histories with, “foodie calls.”  In one survey, one-third of the respondents admitted to going out on a date just to get a free meal, and in the second survey 23 percent of the study group admitted to a “foodie call.”  The research also found that the majority of respondents were aghast at the concept of a “foodie call” and believed it to be moderately to extremely unacceptable.

What are we to make of “foodie calls”?  Speaking as someone who enjoys a good meal from time to time, I don’t think being motivated, in whole or in part, to go out on a date to get a good meal is incredibly egregious behavior.  I also think, however, that people who go on “foodie calls” might be selling themselves short, and I wonder if they ultimately find the meals very satisfying.  Spending two or three hours with somebody you really have no interest in, and making cheery chit-chat that entire time, would be exhausting, and a pretty high price to pay for some fine dining.  Meals are supposed to be a pleasant, shared experience, and having to work hard to maintain a conversation would tend to interfere with your enjoyment of the cuisine.

As for the guys who’ve paid for the “foodie calls” — well, if the person you’ve asked out starts negotiating with you about the only restaurants that would be acceptable destinations for the date, you might just want to be on guard.

Breaking The Bad News

On the TV show House, House’s oncologist pal Wilson was reputed to be so humane and caring when giving patients bad news about their condition that, when he was done, patients actually thanked him.  Studies indicate, however, that there aren’t a lot of Wilsons out there in the medical profession.  Instead, many doctors botch one of the most important parts of their job — giving patients truthful information about their medical condition when the diagnosis is grim.

Telling patients that they have untreatable cancer, or some other fatal disease, clearly is one of the toughest parts of a doctor’s job — and research indicates that doctors just aren’t very good at it.  Some doctors will break the bad news indirectly or use medical jargon that leaves the patient confused, others will do it with brutal directness, and still others will sugarcoat the news with treatment options.  As a result, many cancer patients aren’t well informed about their actual condition, and their prospects.  A 2016 study found that only five percent of cancer patients understood their prognoses well enough to make informed decisions about their care.

Why are doctors so inept at giving patients bad news about their condition?  Of course, it’s incredibly hard to be the bearer of bad tidings, especially when the bad news is about a fatal illness, but there’s more to it than that.  Communications skills apparently aren’t emphasized at medical schools, and many doctors see a diagnosis of an incurable disease as a kind of personal failure on their part.

It’s interesting that, in a profession so associated with the phrase “bedside manner,” so many doctors regularly mishandle what is arguably the most important part of their job and so few medical schools make sure that their graduates are equipped to handle that task in a genuine, caring, and understandable way.  I hope I never receive a devastating diagnosis, but if I do I hope it comes from a doctor who knows how to break the bad news.

Travel Guilt

If you’ve got a big trip planned for this year, should you cancel it?  Should you refrain from traveling at all, because of the impact that your share of carbon emissions from the plane flight may be having on Arctic sea ice, or rising sea levels?

That’s the question posed by a curious New York Times article earlier this week.  The author wrings his hands about the issue, caught between a desire to broaden his horizons by seeing the world and his professed guilt that his travel interests are selfish and evil because they may be affecting global climate change.  After quoting lots of statistics about the potential impact of one person’s activities, and envisioning being glared at by a hungry polar bear while pondering his contribution toward disappearing Arctic ice, the author notes that he’s still going to take a trip to Greece and Paris, but only after he’s purchased enough “carbon offsets” to “capture the annual methane emanations of a dozen cows.”

The Times article notes that, in 2016, two climatologists published a paper that concluded that there is a direct relation between carbon emissions and the melting of Arctic sea ice, and that “each additional metric ton of carbon dioxide or its equivalent — your share of the emissions on a cross-country flight one-way from New York to Los Angeles — shrinks the summer sea ice cover by 3 square meters, or 32 square feet.”  Taking a cruise isn’t the answer, either; the article says that cruise ships produce three or four times the pollution produced by jets.  Even worse, the article states that just by being an average American we’re harming and even killing fellow human beings, and quotes a determination somehow made by a University of Tennessee professor, who concluded:  “The average American causes through his/her greenhouse gas emissions the serious suffering and/or deaths of two future people.”
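
For what it’s worth, the quoted claim boils down to simple linear arithmetic.  Here’s a minimal sketch of it in Python, assuming the paper’s figure of 3 square meters of summer sea ice lost per metric ton of CO2-equivalent and the article’s rough equation of one metric ton with a passenger’s share of a one-way cross-country flight (both numbers come from the quote above, and neither is independently verified here):

```python
# A minimal sketch of the arithmetic quoted above, not an endorsement of it.
# Assumptions (taken from the quote, not verified independently):
#   - summer sea ice loss scales linearly at 3 m^2 per metric ton of CO2e
#   - a one-way New York to Los Angeles flight works out to roughly
#     1 metric ton of CO2e per passenger

M2_PER_TON = 3.0        # square meters of summer sea ice lost per metric ton CO2e
SQFT_PER_M2 = 10.7639   # square feet per square meter

def ice_loss(tons_co2e: float) -> tuple[float, float]:
    """Return the claimed summer sea ice loss in (square meters, square feet)."""
    m2 = tons_co2e * M2_PER_TON
    return m2, m2 * SQFT_PER_M2

print(ice_loss(1.0))  # one-way flight share: (3.0, 32.29...) -- the "32 square feet"
print(ice_loss(2.0))  # a round trip doubles it: (6.0, 64.58...)
```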

So, should we just stay huddled in our houses with the lights turned off, so as to minimize our personal contribution to potential global catastrophe?  I won’t be doing that.  I like leisure travel, and unlike the Times writer, I’m not wracked with guilt about it.  I’m quite skeptical of any calculation that purports to show that the activity of an “average American” can be isolated from all of the huge, overarching factors that can affect the Earth’s climate, such as sunspot cycles, solar flares, ocean currents, and wind systems, and found to have a direct, measurable impact on climate.  Science has endured a lot of black eyes lately, with research and calculations shown to be inaccurate and, in some instances, politically motivated, and I’m just not willing to accept unquestioningly that going to visit my sister-in-law in California will melt 32 square feet of Arctic sea ice.  I also question how the activities of an “average American” are calculated, or how the carbon footprint of a walk-to-work person like me compares to that “average.”

So, I guess you can call me selfish, because I do want to see more of the world and experience the wonders of faraway places.  But don’t just ask me — ask the places that travelers visit whether they’d rather not receive the infusions of cash, and the jobs created, that come from being a tourist destination.  If we’re going to be doing impossibly complex calculations of benefits and harm, how about throwing the economic and cultural benefits that flow from travel into the equation?