The Scientists Are Wrong Again

If you’ve read anything about the early days of Earth, you’ve read about how our home planet had a thick, heavy atmosphere, filled with greenhouse gases that trapped heat and allowed early plants and other organisms to take root and ultimately thrive.  That was the prevailing scientific consensus.

Recent studies of gas bubbles trapped in a lava flow that occurred 2.7 billion years ago show that the atmosphere of the early Earth was much, much thinner than scientists had assumed — less than half as thick, in fact, as our current atmosphere.  That’s a surprise, because it is clear that the sun was weaker in those days, and a thick atmosphere was thought to be needed to make the Earth warm enough to support life.  With a thin atmosphere, Earth must have been a daunting place for the early photosynthetic life forms — but somehow those hardy little creatures survived anyway and started the process that ultimately led to . . . us.

It’s long been clear that the environment of planet Earth is ever-changing, but the studies of the gas bubbles indicate that the changes over time are even more significant than was suspected.  That’s an interesting discovery, but it might also have practical consequences in our search for life forms on other moons and planets.  Scientists searching for extraterrestrial life tend to look for planets that are “Earth-like” in the sense of modern-day Earth — but it turns out our own planet wasn’t very “Earth-like” at all in its early days.  The range of places that could support life is therefore likely much wider than previously suspected.

So scientists are wrong about the early atmosphere on Earth, just as they have been wrong about countless other things during the long and rich history of science.  That’s the great thing about science — inaccuracy and failed hypotheses are just an inevitable part of the process.  In this case, the scientific error also happens to tell us something useful and gratifying about just how tenacious life forms can be.

A Germophobe’s Analysis Of The Relative Health Advantages Of Fist Bumps Over Handshakes

It seems as though scientists are always trying to get us to change our time-honored habits.  Now they want us to reject handshake greetings in favor of “fist bumps,” because a study has shown that a firm handshake transmits far more germs than a quick knuckle clash.

In the study, a scientist stuck his gloved hand into a vat of bacteria, let it dry, and then shook hands, fist-bumped, or high-fived other participants and measured how many germs ended up on their gloves.  (Apparently the scientists didn’t think the “bro shake” or the “down low” were sufficiently common to warrant testing.)  The results showed that handshakes transmitted 10 times more bacteria than fist bumps and twice as many as a palm-smacking high five.

Am I the only person relieved that the scientists who developed this particular study didn’t decide to also examine the germ transmission of hugs and kisses, and thereby avoided sticking their faces, lips, and entire bodies into vats of bacteria?

No one will be surprised that physical contact with humans involves potential germ transmission.  Of course, contact with just about anything outside of a sealed white-room environment involves potential germ transmission.  Do these scientists ever use a public restroom or take a crowded subway train and have to hang onto a pole?  Unless you want to be a recluse, germ transmission is just something we accept in modern life.

And, in the professional world — at least for a 50-something guy like me — there really aren’t any viable alternatives to a handshake.  I’m not going to be high-fiving opposing counsel when they arrive for a deposition, and in many situations advancing toward someone with your hand clenched into a fist could be misconstrued and provoke more immediate and painful health consequences than a little germ transmission.

If we’re really that concerned about public germ transmission, why not start a campaign to avoid hand contact altogether and encourage everyone to use the Fonzie thumbs-up sign, the double finger-point, or something equally ludicrous?  I’ll just accept the germ-infested reality of the modern world and stick to handshakes, thank you very much.

The Lab Coat Factor

Lately I’ve noticed that more and more products — from gasoline to rewards cards to patent medicines — are being advertised by people in lab coats.

Somebody must have done a marketing study about this and determined that Americans just trust people in lab coats.  How else to explain why companies that are trying to decide how to clothe the human mannequins that appear on billboards and point-of-purchase ads would pick lab coats as opposed to, say, a minister’s collar, a nurse’s uniform, or the loud sportscoat and gold-buckled loafers of a used car salesman?  If my assumption is correct, why would people be more trusting of a shill just because he’s clad in a lab coat?  Is it because a lab coat suggests intelligence and precision?  Or is it because lab coats have quasi-medical connotations, and people trust their doctors?  I’ve known scientists and lab workers, and they were decent human beings — but not measurably more honest or credible than people in other lines of work.

Often, the lab coat seems to have nothing to do with the product or service being sold.  Consider the Shell rewards card ad that I saw when I fueled up my car today, a picture of which accompanies this post.  It features a nerdy-looking guy in a lab coat gesturing toward the card.  I guess he’s supposed to be a fun-loving Shell fuel technician . . . but why would anyone rely on a lab worker to provide them with guidance about smart financial decisions?  Lab workers may be adept with Bunsen burners, but that doesn’t mean they know bupkis about whether a payment card is a good deal or a rip-off.

Every now and then, scientists smash atoms together and discover a new element.  The new elements then go through an accreditation process before they become part of the periodic table that is grimly familiar to everyone who hated having to memorize the elements in their high school chemistry class.

The problems really arise, however, when the time comes to name the new elements.  The sad fact is, scientists suck at coming up with good names.  The latest two proposed names, for example, are Flerovium and Livermorium.  Basically, scientists just take the name of a person or place, add “ium” at the end, and that’s it.

That uninspired convention was used for most of the recent additions to the periodic table.  The boring names for newer elements — Ytterbium?  Lutetium?  Mendelevium? — stand in sharp contrast to the pithy, lyrical names of the older elements, like gold, silver, tin, and mercury.  No one is going to write a song called “Heart of Ytterbium” or pen a holiday standard called “Mendelevium Bells.”  It must be maddening for high school kids to try to pronounce, much less remember, all of these “iums.”

The new names are not only hopelessly unmemorable, but they also don’t tell you anything about the element itself.  The name “lead” connotes the heaviness of that ponderous metal.  In that regard, “Livermorium” is a missed opportunity.  That substance is formed by smashing calcium ions into the element curium and quickly decays into Flerovium.  How about a name that reflects the element’s short life — like Ephemerite?

I hereby offer to help the scientific community in developing better names — and thereby advance the cause of beleaguered high school chemistry students everywhere.