Ludwig’s Locks

Ludwig van Beethoven was a musical genius who was almost as well known during his lifetime for his health problems as for his titanic, soaring symphonies and his beautiful piano works. Beethoven famously suffered from progressive hearing problems that eventually produced functional deafness–requiring him to produce his later compositions in his head, without actually hearing the music he was creating–but his health problems went beyond hearing loss. Beethoven experienced chronic gastric issues for years, and when he died in 1827, after having been bed-ridden for months, he was afflicted by jaundice, liver disease, swollen limbs, and breathing problems. His health problems were so great that Beethoven wrote out a testament asking that his conditions be studied and shared after his death.

Two hundred years later, scientists have heeded those wishes and tried to figure out what was wrong with Beethoven. They took an interesting approach–identifying locks of the composer’s hair that had been cut from his head in the seven years before his death and preserved ever since, and then using DNA analysis. The team started with eight samples that purported to be Beethoven’s hair, and found that two of the hair clippings weren’t his and another was too damaged to use. One of the five remaining samples had been initially provided by Beethoven himself to a pianist friend, and analysis showed that all of the hair in the samples came from the same European male of Germanic ancestry.

The DNA analysis did not reveal the causes of Beethoven’s deafness or his gut issues, but did indicate that he was suffering from hepatitis B and had genetic risk factors for liver disease that may have been exacerbated by the composer’s alcohol consumption habits–which a close friend wrote included drinking at least a liter of wine with lunch every day. The genetic analysis also determined that one of Beethoven’s ancestors was the product of an extramarital affair.

I’ve ceased to be amazed by the wonders of modern DNA analysis and what it is capable of achieving. To me, the most surprising aspect of this story is that five legitimate clippings of hair from Beethoven’s head survived for two centuries. It makes you wonder how many people were given locks of Beethoven’s hair in the first place. Ludwig van’s barber must have been a very popular guy.

ZZZs, If You Please

I’m a big believer in the benefits of a good night’s sleep. Humans obviously have a physical and mental need for sleep–as anyone who has pulled a college or job all-nighter can attest–and studies show that sleep increases mental sharpness, aids the functioning of the hormone system, and reduces stress, among many other values. A good night’s sleep also can provide helpful perspective on issues or problems. There’s a reason why people who are trying to make an important decision say that they “want to sleep on it.”

A recent study also shows that there is an association between sleeping well and avoiding depression. The annual Sleep in America poll conducted by the National Sleep Foundation (who knew there was such an organization?) found that more than 90 percent of adults who report that they sleep well also were free of depressive symptoms, whereas two thirds of adults who aren’t happy with their sleeping had significant levels of depressive symptoms.

There’s an obvious chicken-and-egg issue at play here: does a good night’s sleep help to ward off significant depression, or do people who are troubled by depression or anxiety have trouble sleeping as an essential part of the condition? Nevertheless, the correlation is worth noting. The proven, positive impact of sleep on mental acuity and stress reduction, and the fresh perspective sleep can bring, may also affect depressive thoughts.

Adults are supposed to get between seven and nine hours of sleep a day. If you’re feeling blue, you might want to examine your sleep habits and see whether a few extra hours in the Land of Nod helps you to feel better.

Mighty Forces, Ever At Work

For about 100 million years, all of the land masses of Earth combined to form one huge supercontinent that we now call Pangea (or Pangaea). When Pangea existed, the current continents fit neatly together, like the pieces of a jigsaw puzzle, as shown in the illustration above. The east coasts of North America and South America were wedged up against Africa, and it would have been possible to take a delightful driving trip to anywhere in the world–such as heading due east from Columbus, Ohio across the northern rim of Africa and then up through Eurasia to the eastern tip of Siberia. Careful packing would have been a must for that journey!

Pangea was just one of several supercontinents that have existed–and will exist again–in the long history of Earth. Like its predecessors, Pangea broke apart thanks to the ceaseless grinding movements of the tectonic plates that make up Earth’s crust. Those movements caused North and South America to drift gradually westward, creating the Atlantic Ocean, and Africa and Antarctica to fall away to the south and Australia to head east.

The best current evidence of the impact of the ongoing churning of the Earth’s crust is found in Africa, which geologists have determined is slowly splitting apart. The Somalian plate in east Africa is shearing off from the Nubian plate on which the rest of Africa is situated. Each year, they move a few millimeters farther away from each other–meaning that in a few million years a considerable gap will open, a new ocean will be created that fills the gap, and there will be a new, large island off the east coast of a smaller Africa.

In an even longer period of time, the tectonic plate movements will push North America all the way west to Japan and the east coast of Asia, forming a new supercontinent that scientists have already dubbed “Amasia.” At that point, it will be possible to take a cool driving trip from Columbus due west to the Great Wall of China–with a stop at the Corn Palace and other attractions along the way, of course.

The Unknown OCD Legacy

I’m reading an interesting biography of Noah Webster called The Forgotten Founding Father, by Joshua Kendall. Webster was an educator who developed a classic book on spelling that American schools used for generations, a lawyer, and a relentless champion of the need to establish a unique American identity and culture. His passion caused him to tackle the monumental project of creating an American dictionary–the American Dictionary Of The English Language, which was first published in 1828. Webster’s dictionary is one of the principal reasons why Americans and British have been called people separated by a common language; he thought English spelling rules were unnecessarily complex and was responsible, among other things, for eliminating the u in the American version of humour/humor and colour/color.

Kendall believes that Webster, an odd personality who once wrote “I am not formed for society,” suffered from what is now known as obsessive-compulsive personality disorder, a close relative of the better-known obsessive-compulsive disorder (“OCD”). The condition “causes an extensive preoccupation with perfectionism, organization and control” that can produce rigidity and an inability to compromise, and therefore interfere with maintaining interpersonal relationships. You can see indications of the condition in this description of what Noah Webster did to create his dictionary:

“In total it took twenty-eight years to complete. To evaluate the etymology of words, Webster learned twenty-six languages, including Old English (Anglo-Saxon), Greek, Hebrew and Latin. Webster completed his dictionary during his year abroad in 1825 in Paris, France, and at the University of Cambridge. His book contained seventy thousand words, of which twelve thousand had never appeared in a published dictionary before.”

We don’t know for sure whether Noah Webster had a diagnosable case of OCD, because the condition wasn’t generally recognized until well after his death. Webster therefore is one of those figures for whom historians look for clues to determine whether OCD was likely. A British history of OCD suggests that other notable people who may have had the condition include Charles Darwin, Dr. Samuel Johnson, and Martin Luther; some believe that Albert Einstein, Marie Curie, and Sir Isaac Newton also had forms of OCD. The list of potential sufferers from the condition makes you wonder how many literary, scientific, and cultural advances occurred because an individual became fixated on a particular project or idea and engaged in a single-minded pursuit of it, to the exclusion of normal human interaction and behavior.

Interestingly, one word that was historically used to describe some of the symptoms of what we now call OCD was “scrupulosity”–and it was one of the 70,000 words Noah Webster defined in his dictionary. The 1828 edition of his dictionary defined it as follows:

“1. The quality or state of being scrupulous; doubt; doubtfulness respecting some difficult point, or proceeding from the difficulty or delicacy of determining how to act; hence, the caution or tenderness arising from the fear of doing wrong or offending.

The first sacrilege is looked upon with some horror; but when they have once made the breach, their scrupulosity soon retires.

2. Nicety of doubt; or nice regard to exactness and propriety.

So careful, even to scrupulosity were they to keep their sabbath.

3. Niceness; preciseness.”

I wonder if Noah Webster had a flash of self-awareness when he wrote that definition?

The Personal Cost Of Gridlock

No one–not even the most hardened California or New York City commuter–likes sitting in traffic. It’s frustrating, and annoying, and a colossal waste of time. But what does it mean for you, physically, if you are spending hours every day personally experiencing gridlock?

A recent study from the University of British Columbia suggests that sitting in traffic and breathing in diesel fumes and exhaust is bad for your brain. The study indicates that exposure to traffic pollution produces altered brain network connectivity in humans, and that signs of decreased brain function can start to appear after as little as two hours of exposure.

The UBC study exposed 25 test subjects to either diesel exhaust or filtered air, then used MRI technology to measure brain activity. The results showed that people exposed to diesel exhaust exhibited less activity in the parts of the brain that are involved with internal thoughts and memories. Fortunately, the effects were temporary, and the brain activity of the people exposed to the diesel exhaust returned to normal once the exposure ended. What hasn’t been tested yet, however, is whether consistent, daily exposure might cause more lasting damage to brain connections.

For some years now I’ve lived close enough to work to walk, and I have been very happy to avoid a long daily commute, sitting in traffic, and the stresses those activities produce. The UBC study just provides further confirmation that prolonged daily exposure to snarled traffic isn’t a good thing. If you have a tough commute, the UBC researchers suggest keeping the windows rolled up and making sure your air filter is a good one. And if you’re a cyclist or a pedestrian, they urge finding a route that keeps you away from diesel exhaust.

We all share a common interest in maintaining our remaining brain connections, such as they are, at peak functionality.

Core Dynamics

I freely admit that I don’t spend a lot of time thinking about the Earth’s core. Living as I do on the thin outer crust of our planetary home, my focus is on the surface I inhabit and the atmosphere beyond, not on what’s happening miles below my feet.

That’s too bad, because the Earth’s inner core seems to be an interesting, and apparently somewhat quixotic, place. (It’s also the subject of some pretty cool science book-type graphics, like the one above.)

We don’t really know a lot about the Earth’s inner core, because of course no one has visited it. Based on a 1930s study of seismic waves and later confirming data, scientists believe that the inner core is a solid ball of iron and nickel. That solid core is covered by a sheath of liquid iron and other elements, and the interaction between the solid center and its liquid shell creates our planet’s magnetic field. But here’s the weird part: Because the solid inner core is separated from the rest of Earth by that liquid coating, the inner core can spin at its own pace, like a ball bearing covered by a thick layer of hot oil–without regard for what the rest of the planet is doing.

Some scientists have believed for years that the inner core rotates at a faster rate than the rest of the planet (called super-rotation), but there is a lively, ongoing debate about that. The debate has been spurred by some recent findings that the super-rotation has stopped, and that the core might now be spinning at a pace slower than the rest of the planet. If the pace of inner core spin has in fact changed, no one knows exactly why, or what causes the core rotation to slow down or speed up. Only the curious physical forces influencing that inner planet of solid metal and its interaction with its superheated liquid iron coating know for sure.

And there’s another cool element to all of this: the preferred scientific method for studying the inner core is . . . earthquakes. When earthquakes occur, seismic waves pass through the planet, and the data acquired about their variations in speed and direction can equip scientists to draw inferences about what’s happening deep inside our planet. One article described the seismic waves as serving like a kind of geological x-ray. So you can be sure that, the next time an earthquake rattles the cupboards out in California or outer Mongolia, some scientists will be eagerly monitoring the seismic waves that result, looking for more clues about Earth’s quixotic inner core.

Identifying The First Writing

Historians generally accept that the first writing, using cuneiform script, was developed in ancient Sumer, in the region of modern-day Iraq, sometime around 3300 B.C., and that the first hieroglyphics were created in Egypt soon thereafter. In short, the prevailing view is that spoken language existed for thousands of years before written language was invented.

The consensus among historians and archaeologists is that the invention of writing began with pictures representing objects, and then the savvy Sumerians realized that they could use symbols to represent sounds in their spoken language–which is the basic concept underlying cuneiform script. The symbols in cuneiform and hieroglyphics became easily recognizable as a form of writing when the ancients began creating clay tablets and papyrus scrolls and covering them with the symbols.

But how do we know for sure that there weren’t even earlier forms of writing–forms that use symbols that are obscure to us in the modern day, and aren’t seen as obvious attempts at writing because they don’t, for example, appear to be used for record-keeping? That’s a question that scientists and historians are considering in connection with the beautiful cave paintings of Lascaux, which are believed to have been created about 20,000 years ago–long before the first cuneiform appeared in Sumer. The cave paintings include dots and dashes and geometric signs, along with the striking and colorful representations of ancient animals and hunting scenes. Could those apparently intentional, non-representational markings have been some accepted form of written communication, like a prehistoric Morse code? That question has generated a lively, ongoing scientific debate, with some researchers arguing yes while others are skeptical.

Of course, absent a new discovery of a Stone Age Rosetta Stone, we’ll probably never know for sure if the cave wall symbols are writing, and if so what they are meant to represent. But I suspect that the concept of writing came to early humans long before the ancient Sumerians invented cuneiform. Humans are communicating creatures, and if the creators of the Lascaux cave art used painting to communicate, as they clearly did, is it really so surprising that they might take the logical next step and use symbols, too?

The Unknown Ancestor In Our Human Family

The Earth of about 80,000 years ago must have been a pretty interesting place. That’s the point in time when our direct human ancestors left the African continent and began to spread across the face of the globe. As they spread, they encountered hominid cousins–cousins so closely related that, from time to time, our direct ancestors were able to interbreed with them and produce live, fertile offspring who, in turn, produced other children who entered into the ancient human genetic mix.

We know all of this because of the work performed in the human genome project, which is hard at work analyzing human DNA and tracing it back to its sources. The human genome project has shown that our DNA includes elements from Neanderthals, like the thoughtful fellow pictured above, and Denisovans. Now the project has identified a third, previously unknown, and as-yet-unnamed ancestor species that left an imprint on our DNA. The unknown species might be a product of separate interbreeding between Neanderthals and Denisovans or might be an entirely separate species.

Neanderthals, Denisovans, and the third species are now extinct, but they live on in their fractional contributions to our DNA, with most modern Europeans and Asians carrying a tiny part of Neanderthal DNA and most Melanesians and Australian Aboriginals carrying slightly larger amounts of Denisovan DNA. Researchers are trying to figure out what meaningful impact–if any–this “archaic DNA” has on the appearance, immune systems, and other characteristics of humans. That’s a complicated process, and the fact that we’ve now identified and welcomed to the human family another, previously unknown ancestor just makes the puzzle more challenging.

Making Oxygen On Mars

Thanks to the renewed interest in space exploration and improvements in rocketry technology developed by companies like SpaceX, we’re inching closer to the point where we might actually land human beings on the surface of Mars. But if we’re going to stay there for any meaningful length of time, we’ve got another challenge that we’ll need to overcome: the human visitors will need to breathe, and that means coming up with a reliable way to create a lot of oxygen in the pointedly carbon dioxide-rich, oxygen-poor Martian environment.

Fortunately for the future explorers of Mars, it looks like the big brains at MIT have come up with a solution. They created a lunchbox-sized device called the Mars Oxygen In-Situ Resource Utilization Experiment, or “MOXIE,” that was taken to Mars as part of NASA’s Perseverance rover mission and has been on the surface of Mars since the mission landed in February 2021. The underlying concept of MOXIE was to use the carbon dioxide on Mars to create oxygen–which is a lot cheaper than trying to cart oxygen all the way from Earth.

MOXIE sucks in the thin Martian air, filters and pressurizes it, then moves it through the Solid OXide Electrolyzer (SOXE), an instrument created by OxEon Energy. SOXE splits the carbon dioxide-rich air into oxygen ions and carbon monoxide, isolates the oxygen ions, and combines them into O2, the kind of oxygen humans breathe. Once MOXIE confirms that breathable oxygen has been created, it releases the oxygen and carbon monoxide into the Martian atmosphere.
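For readers curious about the underlying chemistry, the net reaction in this kind of carbon dioxide electrolysis can be written, in simplified form, as:

2 CO2 → 2 CO + O2

In other words, every two molecules of carbon dioxide drawn in from the Martian air yield two molecules of carbon monoxide, which are vented, and one molecule of breathable oxygen.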

MOXIE has been powered up on multiple occasions since its arrival, during different Martian seasons and at different times of the day and night, to see whether it works in different conditions and different temperatures. So far, MOXIE has worked under all conditions except dawn and dusk, when the temperature is changing rapidly, but the MIT team believes it has a solution to that problem. The little lunchbox-sized device creates six grams of oxygen per hour, which is about the same amount as a small tree on Earth.

When we get to the point of sending a human mission to Mars, the plan would be to send much bigger versions of MOXIE to the Red Planet ahead of the human mission, power them up, and let them generate a repository of oxygen that would supply the needs of both the human visitors and the rocket that would take the humans back home to Earth. Pretty cool!

The Perils Of Picking

Kids learn that they aren’t supposed to pick their noses at an early age, when their horrified mothers tell them it’s disgusting, ill-mannered behavior and they should stop doing it–right now! A recent study suggests that there is another potential reason to heed your mother’s edict: there could be a connection between damage to the internal tissues of the nostrils and dementia.

The study looked at the transmission of bacteria in mice and focused on Chlamydia pneumoniae, a form of bacteria that is common to mice and humans. That species of bacteria not only can cause pneumonia, as its name suggests, it also is found in many human brains that are afflicted with late-onset dementia. The study determined that the bacteria can travel quickly up the olfactory nerve that connects the mouse nasal cavity and brain to reach and infect the central nervous system and brain cells. The study also found that when the bacteria reached the mouse brains, the mice responded by creating amyloid-beta proteins to combat the infection–and amyloid beta protein clumps are typically found in humans suffering from Alzheimer’s Disease.

Moreover, the study showed that when there is damage to the nasal epithelium, the delicate membrane at the roof of the nasal cavity that separates it from the brain, the nerve infections get worse. And that’s where nose-picking–which can damage that protective layer of tissue between the nasal cavity and the brain–enters the picture.

We have a lot to learn about the causes of dementia, and studies of mice obviously don’t necessarily translate to humans. But if it’s even remotely possible that you can reduce your chances of developing dementia by refraining from self-inflicted nostril probes, it’s yet another reason to heed your Mom’s advice and keep your fingers away from your nose.

Redefining “Personal Hygiene”

Basic acts of personal hygiene have been in the headlines lately. First, an Iranian hermit described as “the world’s dirtiest man,” who hadn’t bathed in 60 years because he believed soap and water would make him sick, died recently at age 94. Someone dying at age 94 wouldn’t be especially noteworthy–except that the media reports of his death emphasized that the man, pictured above, became sick only after nearby villagers persuaded him to finally go for a wash-up, and he unfortunately went downhill after that.

Now a supposed “hygiene expert” has endorsed the approach of the hermit, contending that you not only don’t need to take a shower every morning, you don’t need to shower, period. Professor Sally Bloomfield of the London School of Hygiene and Tropical Medicine says daily bathing is “really not important” to hygiene and only became de rigueur to avoid offensive personal aromas. She says too much effort in applying soap and water could strip the human body of microorganisms that perform important functions and leave your skin dried and cracked, besides. According to the Professor, the only time you should shower off is before going into a swimming pool, because immersion could cause the microbes on our bodies to be transferred to a fellow swimmer–a concept that doesn’t exactly make me eager to go for a dip.

It’s hard to imagine what it must have been like to be within nostril range of someone who hadn’t bathed in 60 years and looked as filthy as the Iranian hermit did in the above photo. It’s also hard to imagine what the working world would be like if people who worked in physical labor jobs, or who worked in close proximity to others, stopped performing their daily ablutions. It’s even harder to imagine that anyone whose mother drilled in the notion that cleanliness is next to godliness and that daily washing, including behind the ears, is essential if you want to assume a place in polite society, could ever retreat from hopping into a morning shower for a good, hot scrub.

In short, Dial soap used the phrase “aren’t you glad you use Dial? Don’t you wish everybody did?” for a reason. Our mothers were right: odor avoidance and regular dirt removal are important parts of real personal hygiene and the general social compact. Let’s all resist the temptation to go full hermit, shall we?

An End To Nightmares

Could technology bring about an end to recurrent nightmares? Scientists think they may have found a way to redirect the sleeping brain away from those disturbing bad dreams that cause the frightened sleeper to awake with a start, with their heart hammering.

The development came during a study of people, estimated to be as many as four percent of all adults, who experience nightmares at the “clinically significant” level. Nightmare issues are deemed “clinically significant” when they occur more than once per week and cause other symptoms, like general anxiety and daytime fatigue.

The study divided 36 participants into two groups. One group received imagery rehearsal therapy (“IRT”), an existing form of treatment where they were instructed to recount their bad dreams, develop alternative, happier endings to the dreams, and then rehearse those happy endings during the hours when they were awake.

The other participants received IRT treatment, with a twist: as they envisioned more positive scenarios for their nightmares, a major piano chord was played every ten seconds, in an attempt to have the happier endings associated with the sound. The participants then went to bed wearing headbands that were capable of detecting when the sleeper had entered the rapid eye movement (“REM”) phase, when dreaming occurs, and playing the major chord associated with positive outcomes. The sound evidently helped to generate the positive outcomes, because while both groups saw a decrease in nightmares, the results were significantly better for the headband-wearing group, both immediately during the treatment and for months thereafter.

My dreams are mostly a confused rehash of things that happened during the day, as if my unconscious brain is trying to sort diverse experiences and inputs into a narrative–and since the experiences and sensations aren’t logically connected, the dream ends up making no sense. Fortunately, I don’t have recurrent nightmares, other than the “I’ve got an exam and I didn’t prepare” dream that I still get occasionally, decades after my schooling ended. I can imagine, however, that people who do experience nightmares at the clinically significant level will welcome a therapy that works. Wearing a headband and listening to piano chords would be a small price to pay to avoid waking up in terror. And the results also provide interesting insight into the power of music and its impact on the unconscious brain.

Fake Smiles And True Feelings

People have thought about fake smiles for a long time–probably for about as long as human beings have walked upright and the act of smiling became associated with happiness. They are curious about how to distinguish a fake smile from a real one, and why people fake smiles in the first place. Researchers have even examined whether working in a job where you are supposed to give a cheery smile to even unpleasant customers for your entire shift is likely to make you drink more at the end of the work day. (Spoiler: it looks like it does.)

But what about fake smiles outside the workplace, where you don’t have to give that grimace grin for eight hours while interacting with jerky customers? Does forcing a smile make you feel happier? This question has been the subject of scientific debate for so long that even Charles Darwin weighed in on the topic. In The Expression of the Emotions in Man and Animals, Darwin argued: “Even the simulation of an emotion tends to arouse it in our minds”–but different studies over the years have produced different results.

Recently researchers decided to test the hypothesis, again, with a study of 3,800 people from 19 countries who were asked to respond to different prompts with a smile or a neutral expression, and then rate their happiness. The prompts were disguised, and mixed in with other facial expression requirements and even math problems, so participants presumably didn’t know that they were involved in testing whether a fake smile actually produced a happier perspective. The results suggest that faking a smile does, in fact, tend to make the fake smiler feel incrementally happier, at least in the short term.

So old Chuck Darwin apparently is right again, and forcing a grin will cause momentary changes in attitude–at least so long as keeping that fake smile on your face isn’t one of the requirements for your job at the neighborhood coffee shop.

The DART Hits The Bullseye (II)

When we last checked in on the NASA Double Asteroid Redirection Test (“DART”) probe, the golf cart-sized spacecraft had successfully smashed into Dimorphos, the asteroid circling its big brother Didymos. What wasn’t clear at that point was whether the successful navigation of the DART into Dimorphos had changed the trajectory of the asteroid.

Now we know: the DART not only hit the bullseye, it successfully changed the trajectory of the asteroid and exceeded expectations in doing so. Mission planners hoped that the DART would be able to change the length of time it takes Dimorphos to circle Didymos by 10 minutes, and tests reveal that the collision with the DART changed the orbit by 32 minutes.

The success of the DART is a big moment in developing a planetary defense to a potentially catastrophic asteroid strike. NASA Administrator Bill Nelson observed: “This mission shows that NASA is trying to be ready for whatever the universe throws at us. NASA has proven we are serious as a defender of the planet. This is a watershed moment for planetary defense and all of humanity, demonstrating commitment from NASA’s exceptional team and partners from around the world.”

Thanks to the DART, we are no longer entirely at the mercy of the asteroids and meteors hurtling around our solar system. It’s not only cool, it’s great news for the future of Homo sapiens and the other species that share planet Earth with us.

Analyzing Healthy Weight

What’s the “right” weight? It’s a question that doctors and their patients have wrestled with for years, and it’s clear that the standards are changing as human diet, nutrition, activity level, and general health are changing. Humans during the 1400s, being subject to periodic famines, plagues, and disease that stunted their growth, and engaging in day-long physical labor to put modest amounts of food on the table, probably looked a lot different from modern Americans. Even in the last century, the standards have changed. Consider, for example, that the average G.I. in World War II was about 5′ 8″ and weighed about 150 pounds. These days, you don’t see many 150-pound men in the average American city.

So what’s the “right” weight now, in an era of relative food abundance and modern medical treatments for human disease, where many people work at sedentary desk jobs?

For years, the accepted method for determining healthy weight has been the body mass index. The BMI is simple: it takes your weight in kilograms and divides it by the square of your height in meters. The target zone for a healthy you is a BMI between 18.5 and 24.9. Now there is a debate about whether the BMI is really an effective tool, because it doesn’t consider where human fat cells have accumulated. That’s important, because the location of fat cells matters to human health and is related to conditions like diabetes, heart disease, and some forms of cancer. Abdominal fat–that “stubborn belly fat” that clickbait articles claim you can melt away with some “weird trick” or special drink–is more unhealthy than fat that accumulates around the hips, and “visceral fat,” the abdominal fat that builds up around the internal organs, is especially harmful.

As a result, some researchers are urging that use of the BMI be replaced by a focus on the waist-to-hip ratio. The waist-to-hip ratio is easy to use, too–you apply a tape measure to your waistline and your hips, and determine the ratio between them. Lower waist-to-hip ratios mean lower abdominal fat accumulation. And a recent study found that the waist-to-hip ratio was a better predictor of early mortality than the BMI.
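To make the arithmetic concrete, here is a minimal sketch of both calculations in Python. The formulas are the standard ones described above; the example waist and hip measurements are hypothetical numbers chosen purely for illustration, and the cutoff shown is the commonly cited BMI range, not medical advice.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by the square of height in meters."""
    return weight_kg / height_m ** 2

def waist_to_hip_ratio(waist: float, hip: float) -> float:
    """Waist circumference divided by hip circumference (any consistent units)."""
    return waist / hip

# The average World War II G.I. mentioned above: about 5'8" (1.73 m) and 150 pounds (68 kg)
print(round(bmi(68, 1.73), 1))               # ~22.7, inside the 18.5-24.9 "healthy" BMI range
print(round(waist_to_hip_ratio(81, 97), 2))  # ~0.84, using a hypothetical 81 cm waist and 97 cm hips
```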

There’s no doubt that losing excess weight is helpful to overall health; your hips, knees, and ankles will thank you. But the distribution of weight also matters. We’ll probably never avoid the scale at the doctor’s office, but the predictive value of the waist-to-hip ratio may mean your doctor will be taking out a tape measure, too, at your next exam.