Ancient Surgery (And Post-Operative Care)

One of the most tantalizing aspects of human history is how little we know about our ancient forebears. Once you go back more than 5,000 or 10,000 years, to the period before written records and the age of surviving structures like the Sphinx and the pyramids of Egypt, there is little specific evidence of what humans did or how they lived. And the great length of that unknown period of human prehistory, stretching back tens of thousands of years, dwarfs the length of the historical record.

We tend to assume that our prehistoric ancestors were crude, ignorant people who hunted, ate, reproduced, and lived short, dangerous, violent lives. Every once in a while, however, scientists uncover something that challenges that assumption. The latest evidence that the ancients were more knowledgeable and more capable than we might have thought comes from a cave in Borneo, where scientists unearthed a 31,000-year-old skeleton of a young adult. The remarkable feature of the skeleton was that the bones revealed a successful amputation of the individual’s ankle–and that the patient then lived for years afterward.

Successful amputations require significant medical knowledge. Practitioners must know about the structure of bones, blood vessels, and muscle tissue, where and how to cut to remove the ruined bone and flesh, the need to leave flaps of skin to cover the remaining exposed bone, and how to close the wound, stop the bleeding, and avoid infection. Before this recent discovery, the oldest known evidence of an amputation dated to 7,000 years ago in France. The Borneo discovery pushes that medical knowledge back to a point more than 20,000 years earlier, and indicates that, in at least some areas, ancient humans were much more medically sophisticated than we believed. It makes you wonder: if Borneo communities had knowledgeable doctors 31,000 years ago, what other medical knowledge did they possess, and for that matter how sophisticated were their scientific, religious, philosophical, and political beliefs?

There is another, equally compelling conclusion to be drawn from the Borneo discovery. The wound healed, and the patient, who scientists believed was a child when the injury occurred, lived for years afterward. Given the rugged local terrain, like that shown in the photo above, surviving with only one working leg would have been impossible without the help of caregivers–and in all likelihood the entire tribe or local community. That necessary reality confirms that our ancestors weren’t thoughtless savages, but were decent, generous people who took care of each other. That conclusion also makes me feel better about our species.

Redefining Death

Yahoo has published an interesting article about an ongoing debate that most of us are blissfully unaware of: how do you define, as a legal matter, who is dead? The debate is heated, and is occurring in the context of discussions about rewriting the Uniform Determination of Death Act (“UDDA”). UDDA, which has been around since 1981, is one of many uniform laws that were drafted by the Uniform Law Commission and submitted to the 50 states in an effort to achieve standardized approaches to common issues, like what constitutes a contract for the sale of goods. In most instances, the work of the Uniform Law Commission addresses uncontroversial topics where reaching consensus is not difficult.

Redefining death has turned out to be an exception.

Determining who is legally dead is one of those areas where advances in medicine have affected legal issues. For many centuries, doctors determined death by listening for a heartbeat or taking a pulse and pushing a mirror under the patient’s nose to see whether breathing was occurring. Medical technology developed over recent decades has allowed machines to substitute for the heart and lungs, however, and other inventions have allowed us to examine human brain activity, which means the focus has shifted to the brain. If there is no brain activity, but a human being continues to breathe and other bodily functions continue with the help of machines, is that person alive or dead? How do we know if the cessation of brain activity is permanent? Should brain activity be controlling, or should the activities of other anatomical parts that affect body activity, like glands and the hypothalamus, be considered? And another relatively recent medical advance–organ transplants–also is playing a role in the redefinition process. Essential organs can only be removed from a patient who is dead, so having a clear understanding of what that means is crucial to the organ transplant system.

The original UDDA was adopted by some states, but not others, and the rules defining death in different countries are even more muddled. The Uniform Law Commission is working to rewrite UDDA, and thereby redefine what legally constitutes death, against the backdrop of the medical issues and developments as well as some high-profile cases that have raised issues about when the end of life occurs. It’s a topic that touches upon medicine, law, philosophy, ethics, and religion–and, as with everything else in our modern era, politics. When UDDA was first proposed and adopted by states in the 1980s, it was not viewed as a controversial topic. Does anyone seriously believe that a rewrite of the statute would be viewed as apolitical in 2023, when it is expected to be rolled out to each of the 50 states, Puerto Rico, and Washington, D.C. for consideration?

You’d like to think that we can reach agreement on basic principles, like when someone is legally dead. The rewrite of UDDA will test that proposition.

My Doctor’s Questionnaire

My doctor is one of those incredibly capable health care professionals who is always acquiring information in order to provide the best possible medical advice.  He uses the information obtained from a questionnaire as deftly as surgeons use a scalpel or GPs use a rubber tomahawk on your knee to test reflexes.

Recently, though, I’ve noticed a change in the tenor of the questionnaires I’m getting from my doctor.  No longer are they just focused on allergies, or muscle strains, or my diet, or how much exercise I’m getting.  Now the questions seem a lot more, uh, pointed.  In my most recent visit, the very first page of the questionnaire I was given to complete was the “Duke Activity Status Index.”

“Can you take care of yourself (eating, dressing, bathing or using the toilet)?”

“Can you walk indoors such as around your house?”

“Can you walk a block or two on level ground?”

“Can you climb a flight of stairs or walk up a hill?”

Hey, wait a second!  Exactly what kind of questionnaire is this, anyway?  Why are the busybody nerds at Duke wondering about whether I can walk a single block on level ground, or eat without assistance?

I’m guessing the “Duke Activity Status Index” is not given to 25-year-old patients.

And then the very next page in the questionnaire packet is the “Burns Depression Checklist,” and one of its questions is:  “Poor self-image:  Do you think you’re looking old or unattractive?”

Well, to be honest with you, I really wasn’t focused on the subject until I started to read this questionnaire!

The End Of “Drilling And Filling”

Here’s another example of the miracles of modern medicine:  scientists have discovered a drug that appears to encourage damaged teeth to regenerate — a development that could bring an end to the practice of drilling out cavities and filling them.

The drug is called Tideglusib.  Not only is it self-evidently unpronounceable, it also has the effect of stimulating and activating stem cells within the pulpy center of teeth, promoting the generation of the hard material that makes up most of our teeth, called dentin — as anyone who has carefully read the tooth diagrams and tooth charts at the dentist’s office will recall.  Scientists tested the drug on mice, and found that applying the drug to cavities in the teeth of mice, using a biodegradable sponge, caused the tooth being treated to regenerate enough dentin to close the cavity.  (Wait a second:  mice get cavities, too?  They must not be very attentive to brushing and flossing.)

The next step will be to test the drug on humans, but the signs are encouraging that we may be on the verge of a new approach to dentistry.  Speaking as someone who practiced terrible dental hygiene as a callow youth and often found myself sitting in the dentist’s chair, mouth agape, listening to the whine of the drill and hoping it didn’t strike a nerve, I think an approach that lets teeth regenerate naturally would be terrific.  And, for those of us who have dental fillings that date back to the days of Beatlemania, the regeneration of natural teeth would have the advantage of avoiding visits to the dentist because old fillings are finally cracking or breaking and need to be replaced, too.


Say Hello To Your New Organ

Scientists have determined that there is officially a new organ in the human body, which now will be enshrined within our starting lineup of stomach, lungs, heart, kidneys, and the other slimy, wriggly bags of glop pulsing along inside our skin suits.

The new organ — called the mesentery — isn’t “new” in the sense that it only popped into the human body in 2016.  It’s always been there, between your intestines and your abdomen, helping to advance the human digestive system.  In fact, Leonardo da Vinci, who found time to weigh in on anatomy between completing paintings and designing machines that never got built, considered it to be an organ, but later medical types decided that the mesentery instead should be viewed as a number of distinct structures.  However, recent tests confirmed that the distinct structures function together, which means that old Leo was right and puts the mesentery squarely into the “organ” category.  Gray’s Anatomy, the ultimate medical textbook, has had to be amended to make sure that the mesentery is properly categorized, and scientists and doctors hope that the changed classification will allow the mesentery to be more fully studied and, perhaps, lead to the development of better surgical approaches and treatments of disease.

The mesentery may be an ugly conglomeration of tissue that looks like something that has washed up on a beach and sat there for a while, but it performs two important functions.  First, it provides a conduit for blood vessels, nerves, and the lymphatic system to reach from the rest of the human body down to the intestines.  And second, it allows the intestines to be linked to the abdominal wall without being directly attached to the wall.

As one doctor noted, in describing this second function:  “It is unlikely that [the intestine] would be able to contract and relax along its entire length if it were directly in contact [with the abdominal wall]. [The mesentery] maintains the intestine in a particular conformation, ‘hitched up,’ so that when you stand up or walk about, it doesn’t collapse into the pelvis and not function.”

An important function?  I’ll say!  Given the role of the intestines, we obviously all should be gratefully thanking the mesentery for allowing us to answer nature’s call without having to “hitch up” and rearrange our innards afterwards.  I’m glad the mesentery is finally getting its just acknowledgement.

Pinfoot

I’ve now got steel pins in the bones of the middle three toes of my left foot. It sounds pretty painful, and it is. In fact, it hurts like hell.

Curiously, I didn’t really focus on this aspect of the surgery before going under the knife. I guess I thought it would be like having a dental implant, or some other painless miracle of modern medicine. It isn’t. When you’re drilling holes in bones and inserting metal rods, it’s going to hurt. The fact that the pins protrude from my toes and have little yellow plastic balls at the tip, like some kind of doll pin, just adds insult to injury. And I’ll probably never use the word “pinpoint” again without an inward shudder.

My painful, pinful experience also helps to explain the back story and motivation of that Pinhead horror movie character from the Hellraiser series. I’ve only got pins in three toes; that poor bastard had pins in every square inch of his head. No wonder the guy was always in such a foul mood! Just imagine how murderous he would have been if he had little yellow balls on the end of each pin, too.

Colonoscopy Number 4

I just got home from my fourth colonoscopy. My father had colon cancer years ago. As a result, my primary care physician, who’s a big believer in the power of genetics and preventative medicine, hustled to get me in for one immediately. I had my first colonoscopy at age 40, and I’ve had one every five years since.

Most medical procedures aren’t pleasant, and a colonoscopy is no different. Nobody wants to lie on a table, their keister flapping in the breeze, while a doctor inserts a flexible camera device up where the sun don’t shine and then probes around in their intestinal tract looking for evidence of cancer. At least for the procedure, though, you’re knocked out.

The worst part of the process is the preparation, when you drink a foul-smelling concoction and then spend a lot of time sitting in the smallest room in the house, waiting for nature to take its course — again, and again, and again. By the time you get to the surgical center for the procedure, your intestines clean as a whistle but feeling somewhat overexercised, you change into a gown and are whisked into a small operating room at one of those pocket hospitals. You awaken in the recovery room, get a quick report, and head out on your way.

As a veteran of four colonoscopies, I can report that they have gotten easier. The clean-out fluid has improved dramatically. The first time I did it, they gave me a gallon of foul-tasting glop that was mixed with over-the-top pineapple flavoring in an effort to mask the awful taste of the glop. It didn’t work. Instead, the pineapple somehow had a catalytic reaction with the glop and formed the most disgusting, smelly sludge you could possibly imagine in your most disturbing, fevered nightmare. Drinking it was almost impossible. Now you drink less of the fluid, it doesn’t have ridiculous flavorings that would ruin your enjoyment of pineapples or grapes forever, and you split your consumption between the night before and the morning of the procedure. As for the procedure itself, it gets quicker and quicker.

We do a lot of things to try to stay healthy. I’m glad having to drink the appalling faux-pineapply laxative is no longer one of them.

The Value Of Vitamins

This week the Annals of Internal Medicine published an editorial about the growing use of vitamin supplements in America that may come as a surprise to many Americans.

Entitled Enough is Enough:  Stop Wasting Money on Vitamins and Mineral Supplements, the strongly worded editorial summarizes three articles and the results of a number of large-scale studies that produced “sobering evidence of no benefit or possible harm.”  The editorial’s concluding paragraph states:  “In conclusion, β-carotene, vitamin E, and possibly high doses of vitamin A supplements are harmful.  Other antioxidants, folic acid and B vitamins, and multivitamin and mineral supplements are ineffective for preventing mortality or morbidity due to major chronic diseases.”

America has become a nation of pill-poppers.  About half of Americans take some kind of dietary supplement, and Americans spend $12 billion a year on vitamins alone and $30 billion for all dietary supplements.  The notion that the vitamin supplements Americans are swallowing in record numbers are ineffective — or even harmful — may shock people. Of course, whether Americans learn of the editorial and the results of the studies, and then whether they stop taking the vitamins and dietary supplements, is anybody’s guess; one vitamin user interviewed by CBS said she would keep slugging down the pills anyway.

Why are Americans so committed to vitamins and supplements?  Some people blame the aggressive marketing of the products, but I think the root cause lies in two other factors.  First, for years Americans have been bombarded with stories about studies that conclude that something is good or bad — be it cyclamates, red dye #2, or something else.  These studies, I think, have conditioned people to believe that taking one substance, or avoiding another, could have significant health benefits.  If a “medical study” shows that avoiding something has a material effect on health, why is it so outlandish to believe that taking another substance — or a combination of substances — might have a similar beneficial effect?  The context created by the onslaught of “medical studies” establishes fertile ground for hawking vitamins and supplements.

Second, people clearly hope that a magic little pill or two can make up for their lack of exercise, poor diet, or other questionable lifestyle choices.  Like Fox Mulder on The X-Files, they want to believe — but unlike Mulder, they lack any true skepticism.  If they skip a walk and eat a quart of ice cream but take a vitamin or “fat-burning” concoction, they can rationalize that they are doing something positive about their health.  They simply don’t want to get the advice offered by one of the authors of the Annals of Internal Medicine articles:  “fruits, vegetables, nuts, beans, low fat dairy, things like that . . . exercising would probably be a better use of the money.”

And that’s probably why the Annals of Internal Medicine editorial won’t have much impact.  Believers believe, and hard advice and facts usually don’t get in the way.

Polio And “Superbugs”

In Syria, more than a dozen children have fallen prey to the crippling effects of polio.

“Polio?” you say.  “That terrible affliction that paralyzed thousands of American children each year?  But polio was eradicated by the development of the Salk vaccine.”  Yes, but a vaccine can only work if the shot is delivered.  In war-torn Syria, some children aren’t receiving their vaccinations — and the polio virus is still out there, lurking and ready to spread its infection that, for some unlucky few, will produce paralysis.

The story of the Syrian children is a reminder of the thin line of defense that protects humans from illness caused by bacteria, microbes, and viruses.  It’s a timely reminder, too, because the Centers for Disease Control and Prevention and other world health organizations are increasingly concerned about the development of “superbugs” — bacteria that have developed resistance to treatment because antibiotics are being overused.  The CDC estimates that more than 2 million Americans get antibiotic-resistant infections each year, and at least 23,000 die because drugs no longer stop their infections from spreading.  The two most dangerous “superbugs” in America are CRE bacteria, which produce deadly, raging infections, and Clostridium difficile, which produces diarrhea that kills thousands each year.  The CDC’s European health counterpart is reporting on outbreaks of other antibiotic-resistant illnesses in some European countries.

This is one of those stories that don’t get much attention because it isn’t threatening to most of us — at least, not right now.  But the spread of “superbugs,” and the overuse of antibiotics that often kill “good” bacteria that are found in every human, are enormously important public health issues.  We need to stop the overuse of antibiotics that has contributed to the development of drug-resistant bacteria and focus on developing new vaccines and forms of treatment to fight the superbugs.  Otherwise, one day we might wake up to find that the stout antibiotic line of defense that has protected humans from all manner of deadly diseases is simply gone.

Happily Bionic

How many people do you know who have an artificial hip, or knee, or some other body part?  If you are like me, you know many such people.  They used to walk with tortured gaits, wincing as they favored their “bad knee” or “bad hip.”  Then they went under the knife, endured rehabilitation, and now are happily pain-free and advocates of joint replacements.

Such operations are not without risk, of course.  They involve major surgery.  Dr. Science, who is having both knees replaced, explained the procedure:  the surgeon slices the leg open, uses a whining bone saw to cut through the tibia and femur, removes the unattached knee, replaces it with the artificial knee, and then securely anchors the new knee to the bones above and below.  When you’ve had such a significant operation, you’re going to need lots of recuperation time.  And, of course, artificial knees and hips can fail, and most in any case have a limited life span — so if you’re young enough, you might need to undergo another operation in 12, or 15, or 18 years.

Still, my friends who’ve had successful joint replacements swear by their new bionic body parts.  Their failing knees and hips forced them to endure intense, constant pain for years.  Now, that pain is gone, and they can scarcely believe how wonderful it is to walk, or climb stairs, or sit without feeling like they’re being stabbed by demons.  Is it any wonder, then, that our bionic friends are among the loudest proponents of such surgery?

Without fanfare, we are living through the bionic revolution in medicine, where high-tech, full-scale replacements of joints have become commonplace and we peacefully coexist with friends who fall within the technical definition of cyborgs.

The Value Of A New Face

When I was a kid, surgeons performed the first human heart transplant.  People were amazed, and it was a story and topic for discussion for days.

Now, of course, heart transplants happen with boring regularity, and we cease to be astonished by the advances in the medical sciences.  Whether it is non-invasive surgeries that allow athletes to bounce back within days from procedures that used to require months of recuperation, or drug therapies that can control formerly deadly diseases, or the implantation of devices to regulate heartbeats and stimulate nerve activity, medical miracles have become commonplace.

In my view, we should retain a bit of our awe at what doctors can do.  Consider this heartwarming story of a man who became a recluse for 15 years after his face was horribly disfigured in a gun accident that tore away his lips and nose.  In a marathon surgery, doctors replaced his jaw, teeth, and tongue and made him look like a normal human being.  He can now brush his teeth and shave and has regained his sense of smell.  More importantly, I imagine, he’s got his life back, and will no longer be too embarrassed to venture outside into the world like everyone else.

Amazing!  Just amazing, and wonderful for this poor man and everyone else who has suffered disfiguring injury.

Roman Medicine

Medical texts from the days of ancient Greece and Rome were consulted by physicians in the western world for hundreds of years, well into the Middle Ages.  Now examination of medicine chests found on a long-lost shipwreck is giving us a more tangible glimpse of how the ancients actually practiced medicine.

The wooden boxes were found on a ship that sank off the coast of Tuscany around 130 B.C.  They contain pills made of vegetables, herbs, plants, nuts, and clay, as well as a mortar and pestle and other devices that suggest that a doctor was on board.  The pills were kept in vials that were so well sealed they have been preserved for more than 2,000 years and can now be tested using DNA sequencing technology.  Experts believe the pills were used to treat sailors for dysentery and diarrhea.

The technology of ancient civilizations — which were able to seal containers against the intrusion of sea water for two millennia — continues to amaze, and one wonders what other discoveries may be lurking under the ocean waters, waiting to be discovered.  And, the modern world being what it is, don’t be surprised to see the “all-natural Roman cure” for diarrhea coming soon to an herbal medicine store and a late-night TV screen near you.

Drug-Addled

The latest story about the circumstances of Michael Jackson’s death is sad, but also symptomatic of how modern medical practices often seem to be extraordinarily reliant on prescribing drugs as the cure for every ill. The amount of medication Jackson apparently received is astonishing.

Can’t sleep? We’ll give you a drug, and if that doesn’t work we’ll give you another, and if that doesn’t work, we’ll try another. We’ve become accustomed to a world where there is a claimed wonder drug for every physical and mental problem. With the emphasis by patients and doctors alike on immediate, drug-induced relief from non-life-threatening conditions like insomnia, is it any wonder that there are instances of wretched excess?

Coming Soon To A Location Near You

H1N1 seems to be spreading throughout the U.S., which is what you would expect of an airborne condition in this time of easy, constant travel. It’s even invaded the ivy-covered walls of Northwestern University.

I don’t think people are panicked about the so-called “swine flu,” but I do think people are aware of it. Earlier this week I was in a conference room with an obviously ill person who sneezed, coughed, and sniffled throughout a day-long deposition. It was a bit unnerving. I’m sure everyone else in the room also was wondering whether the germs they were being exposed to were those of the normal flu, or the more recent, more virulent Mexican import.

Not Scary . . . Scary

In modern America, we are bombarded with news articles, couched in frightening terms, about claimed risks. Stories like those about flesh-eating bacteria, or flammable children’s nightwear, or the chance that a kid playing baseball might get hit in the chest between heartbeats are routinely found in the news media. Most of these claimed risks are minor. Moreover, the drumbeat of alarmist rhetoric has made many Americans jaded about such warnings.

A new disease that has jumped from species to species and that is passed by airborne particles or casual contact, on the other hand — now that is scary. If you doubt that, read And the Band Played On, by Randy Shilts, about the early days of AIDS, or any book about the Spanish Flu pandemic after the end of World War I. The WHO is right to urge strong action and raise concerns in response to the outbreak of swine flu in Mexico.