Rethinking Alzheimer’s

Alzheimer’s disease has been a known condition since 1906, when it was first described by the German psychiatrist Alois Alzheimer, and it has been the focus of intense attention and research for decades.  It ranks as one of the top causes of death in the United States and is the third leading cause of death among people 60 and older, behind only heart disease and cancer.

So, after more than a hundred years, why haven’t we figured out how to treat this dread and deadly disease that robs people of their minds and personalities and leaves them empty shells of their former selves?  Why, for example, have doctors and drug companies been able to develop effective treatments for HIV and AIDS, but not Alzheimer’s?

It’s not that the scientific and medical community isn’t trying — but identifying the real cause of Alzheimer’s, and then devising a meaningful treatment, is proving to be an incredibly elusive challenge.  A brain with Alzheimer’s is like a car crash with no witnesses, where the accident reconstruction expert tries to find clues in the physical evidence.  Do those skid marks indicate that the driver was going too fast, or that the driver was distracted, or did something like a deer unexpectedly dart onto the road while the driver was paying attention?  In the case of Alzheimer’s, the brain is mangled, distorted, and physically changed, both chemically and structurally.  Are those changes what caused the disease, or are they mere byproducts of the active agent that does the real harm?

For more than a quarter century, Alzheimer’s researchers and drug companies have been focusing on the “amyloid hypothesis,” which posits that an increase in amyloid deposits causes the disease, and have worked to develop drugs that target amyloid.  The hypothesis arose because Alzheimer’s patients have an unusual buildup of amyloid in their brains, amyloid buildups have been found to be harmful in other bodily organs, and people with a family history of Alzheimer’s have been found to have mutations in the genes responsible for amyloid production.  With this kind of evidence, it’s not surprising that amyloid production has been the focus of treatment efforts.

Unfortunately, though, the trials of drugs that address amyloid production haven’t been successful — and after repeated failures, some scientists are wondering whether the amyloid hypothesis should be scrapped and the disease examined afresh.  The amyloid hypothesis remains the prevailing view, but a minority of researchers think that the focus on amyloid buildup is like trying to close the barn door after the livestock have already escaped.  And they wonder whether the amyloid hypothesis has become so entrenched among researchers who have invested so much time and money in amyloid-based treatments that work on alternative approaches is being undercut.

It’s a classic test for the scientific method.  Over the years, there have been countless examples of prevailing views on medical or physical problems being overturned in favor of new approaches that turned out to accurately identify cause and effect.  The scientific method is supposed to objectively find the right answers.  For Alzheimer’s disease, maybe it is just a matter of tweaking how to develop the right treatment for the amyloid buildup — or maybe it’s something else entirely.

Those of us who have dealt with Alzheimer’s in our families hope the scientific and medical community will put aside preconceived notions, dispassionately assess the evidence, and explore every avenue for developing a successful treatment.  This disease is just too devastating to go unaddressed.

Alzheimer’s Isn’t Funny

Last week there were reports that Will Ferrell was pursuing a new movie in which he would portray Ronald Reagan.  The project was pitched as a comedy set during Reagan’s second term, in which he is depicted as already in the grip of Alzheimer’s disease and an intern is charged with convincing Reagan that he is an actor portraying the President.  After an outcry about the insensitivity of the concept from Reagan’s children and others, one of Ferrell’s representatives said the actor wasn’t going to do the movie.

I get why the Reagan children reacted as they did, and I think Ferrell was wise to back away from the project.  The reality is that Alzheimer’s disease really isn’t very funny.  Sure, many people who have had to deal with a family member with the disease have probably shaken their heads and had a rueful laugh about a particular episode that demonstrates how the ill person has changed — whether by repeating themselves, by not knowing a friend or family member, or by showing radical changes in personality as the disease ravages their brain.  But it’s defensive humor, designed to help you cope with the realization that a person you know and love is falling into a black pit from which they will never emerge, and that there’s absolutely nothing you can do about it.

I’ve read several memoirs written by children who’ve cared for parents with Alzheimer’s or dementia.  When the books share a “humorous” anecdote, as they sometimes do, it’s uncomfortable reading, because the victim of the disease is inevitably the butt of the humor — because they’ve forgotten where they are, or have taken a shower with their pants on, or have used a word they would never have said before in polite company.  It’s not really funny at all.  It’s tragic, and it’s not fair to the person whose intellect, personality, and consciousness are being irreversibly stripped away, bit by bit, until only an unfamiliar shell remains.  They can’t help themselves.

I suppose a hard-bitten, cynical Hollywood agent might think a script about an intern deceiving a character in the grip of Alzheimer’s was a laugh riot, but only if that agent didn’t know anyone who had experienced the disease.  These days, there aren’t many people who fall into that category, and those who have been touched aren’t going to go watch a “comedy” that reminds them of the devastation the disease inflicted.  And if such a movie ever gets made, how many members of the audience are going to erupt in belly laughs about the lead character’s painful confusion?

My guess is that most people who watched such a movie would leave with the same fervent vow found among people who have dealt with Alzheimer’s in their families.  It goes like this: “Please don’t let me ever, ever get Alzheimer’s.”

Ordinary Forgetfulness, Or Alzheimer’s?

The Neal side of our family, unfortunately, has a growing history of dementia and Alzheimer’s disease.  Mom and Grandma Neal had dementia, Uncle Gilbert had Alzheimer’s, and my great-aunt, whom another relative described as “crazy as a bedbug” when I was a kid, had mental problems so debilitating that she was put into a care facility at about the time she reached retirement age.

When you’ve got such a history in the family, and have seen what these terrible degenerative brain diseases can do to bright, kind, loving people, you can’t help but wonder whether a gene lurking somewhere in your DNA will ultimately send you down that same dark street.  You also pause at every instance of forgetfulness and ask yourself whether it is a sign that the dreaded downhill slide has begun.

It’s important to remember that an infallible memory is not part of the normal human condition.  With the richness of daily experience flooding our brains with new memories during every waking moment, it’s entirely normal not to remember every incident or person from the past with perfect clarity.  And the memory failure that most frequently causes people to question whether they’re losing it — the mental block that leaves you temporarily unable to recall a name or a word — is commonplace in healthy, average humans.  Other normal issues include the tendency to forget facts or events over time, absent-mindedness, and having memories colored by bias, experience, or mood.

Fortunately, too, there are tests that can help doctors distinguish between these ordinary conditions and the onset of dementia or Alzheimer’s.  They range from simple screenings of cognitive functioning that a family doctor can administer in a few minutes as part of an annual exam to intense and extensive neuropsychological examinations that involve multiple days of evaluation.

The existence of such tests raises an interesting question.  Aging Americans are routinely poked, prodded, and scanned for heart disease, cancers and other bodily ailments.  Even though, for many of us, the prospect of being diagnosed with Alzheimer’s is as dreaded as any finding of a debilitating physical disease, there seems to be less of a focus on early detection and treatment of degenerative mental diseases.  With recent studies showing that significant percentages of older Americans are afflicted with dementia, shouldn’t that approach change?  Why shouldn’t a short cognitive screening test be as much a part of the annual physical as the rubber-gloved prostate probe?

Dying With Dementia

Several recent studies about dementia among America’s aged are profoundly disturbing — especially for those of us who aspire to live to a ripe old age.

One study, by the Alzheimer’s Association, concludes that one in three seniors dies with Alzheimer’s disease or some other form of dementia.  The dementia does not necessarily directly cause death, but it contributes to an earlier demise because the senior forgets to take her medication, or is unable to recognize symptoms that should lead to prompt treatment.  Another study, led by an economist from the RAND Corporation, concludes that 15 percent of Americans over age 71 — about 3.8 million people — have dementia, and that that number will increase to 9.1 million by 2040.  The study also found that the direct health care costs for dementia patients, at nursing homes and other care facilities, total $109 billion, and the costs of care are expected to increase dramatically.

As a society, we must worry about how we are going to pay for such care, but as individuals we worry about becoming one of those statistics.  If you’ve been around someone with dementia, you realize it is an awful way to go.  So many of the afflicted appear to be perpetually frightened, or angry, or both.  They don’t recognize family members, or understand when people are trying to help them.  The disease works terrible, fundamental changes in their personalities and characters, turning the quick-minded former executive into a simpleton or the happy, encouraging aunt into a bitter font of hateful, deeply wounding comments.

So much of life’s joy and richness comes from our interaction with spouses, children, and loved ones; what must it be like to be stripped of those pleasures, left to cope with strangers with only a dim understanding of who you are and why you are there?  It’s a depressing, terrifying prospect.