Rethinking Alzheimer’s

Alzheimer’s disease has been a known condition since it was first described in 1906 by the German psychiatrist Alois Alzheimer, and it has been the focus of intense attention and research for decades.  It ranks as one of the top causes of death in the United States and is the third leading cause of death among people 60 and older, just behind heart disease and cancer.

So, after more than a hundred years, why haven’t we figured out how to treat this dread and deadly disease that robs people of their minds and personalities and leaves them empty shells of their former selves?  Why, for example, have doctors and drug companies been able to develop effective treatments for HIV and AIDS, but not Alzheimer’s?

It’s not that the scientific and medical community isn’t trying — but identifying the real cause of Alzheimer’s, and then devising a meaningful treatment, is proving to be an incredibly elusive challenge.  A brain with Alzheimer’s is like a car crash with no witnesses, where the accident reconstruction expert tries to find clues from the physical evidence.  Do those skid marks indicate that the driver was going too fast, or that the driver was distracted, or was the driver paying attention when something like a deer unexpectedly came onto the road?  In the case of Alzheimer’s, the brain is mangled, distorted, and physically changed, both chemically and structurally.  Are those changes what caused the disease, or are they mere byproducts of the active agent that does the real harm?

For more than a quarter century, Alzheimer’s researchers and drug companies have focused on the “amyloid hypothesis,” which posits that a buildup of amyloid deposits causes the disease, and have worked to develop drugs that target amyloid.  The hypothesis arose because Alzheimer’s patients have an unusual buildup of amyloid in their brains, amyloid buildups have been found to be harmful in other organs, and people with a family history of Alzheimer’s have been found to carry mutations in the genes responsible for amyloid production.  With this kind of evidence, it’s not surprising that amyloid production has been the focus of treatment efforts.

Unfortunately, though, trials of drugs that target amyloid production haven’t been successful — and after repeated failures, some scientists are wondering whether the amyloid hypothesis should be scrapped and the disease examined afresh.  The amyloid hypothesis remains the prevailing view, but a minority of researchers think that the focus on amyloid buildup is like trying to close the barn door after the livestock have already escaped.  And they wonder whether the hypothesis has become so entrenched among researchers who have invested so much time and money in amyloid-based treatments that work on alternative approaches is being undercut.

It’s a classic test for the scientific method.  Over the years, there have been countless examples of prevailing views on medical or scientific problems being overturned in favor of new approaches that turned out to accurately identify cause and effect.  The scientific method is supposed to objectively find the right answers.  For Alzheimer’s disease, maybe it is just a matter of tweaking how we develop the right treatment for the amyloid buildup — or maybe it’s something else entirely.

Those of us who have dealt with Alzheimer’s in our families hope the scientific and medical community will put aside preconceived notions, dispassionately assess the evidence, and explore every avenue for developing a successful treatment.  This disease is just too devastating to go unaddressed.

Changing Over Time

Here’s some welcome, but not especially surprising, news:  scientists have concluded that our personalities change over time.

The University of Edinburgh did an interesting study that confirms what should be obvious — people in their teenage years are a lot different from those same people in old age.  The study looked at data compiled about the personality and character traits of people who were evaluated in 1947, at age 14, as part of the Scottish Mental Survey, and then tried to track those same people down years later, when they hit age 77, to evaluate them again.  The study looked at personal qualities like self-confidence, perseverance, stability of moods, conscientiousness, originality, and desire to excel, and found very little correlation between the 14-year-olds and the 77-year-olds on conscientiousness and stability of moods, and no correlation on the others.

Any study of personality and character traits is not going to be as precise as, say, measuring the flow of neutrinos, because of observer bias.  The University of Edinburgh results, for example, rely on teacher assessments of the 14-year-olds — it’s not hard to imagine that your gym teacher might have a different take on your self-confidence than your English teacher did — while the 77-year-olds rated themselves and identified a close friend or family member to complete the survey.  I imagine, however, that by age 77 most people are going to drop the posturing and evaluate themselves pretty honestly.

So life, and time, change you.  No surprise there!  It would be weird indeed if a lifetime of experiences, good and bad, didn’t actually alter the way you reacted to other people and the world at large.  I carry around memories from my 14-year-old self, but other than that I don’t really feel a great connection to that awkward, tubby, dreamy, self-absorbed person on the verge of high school — which is kind of a relief, really.  I imagine that if most of us met our 14-year-old selves, we’d find it fascinating, but then conclude that we really weren’t all that likable back then, and give our parents, siblings, and friends a lot more credit for putting up with us.

The key, of course, is to change for the better.  It’s a worthy goal.

A Limit To Aging

It’s no secret that average life expectancy for men and women has been steadily increasing for years.  With advances in medicine, science, disease control, and other factors that affect mortality, it’s now commonplace for people to live well into their 80s and 90s, and more people than ever are hitting triple digits.

But if you read the occasional stories about the acknowledged oldest person in the world, you note that the maximums don’t seem to be advancing.  You see the report on the oldest person being presented with a birthday cake with more than 110 candles, and a few months later you read that that person has gone to the great beyond and a new “oldest person in the world” has taken on that designation.

This leads scientists to wonder whether there’s a natural limit to life expectancy in humans.  One recent study, which explored a mass of human mortality information, has concluded that the human life span is naturally limited to a maximum of about 115 years, and that it would be exceptionally rare for any human to hit 125.  The study noted that only one human, a French woman named Jeanne Calment who died in 1997 at the age of 122, has even come close.

Some scientists pooh-pooh this conclusion, noting that the current crop of super-old codgers may have had their life expectancies affected by malnutrition or childhood diseases that have since been eradicated, and that up-and-coming generations of people who have not been exposed to such life-affecting circumstances may easily break through the 115 or even the 125 barrier.  Others argue that extreme old age logically should have genetic limits, as the lives of different species of animals seemingly do.  And, of course, it’s possible that new advances in medicine — such as finding a cure for cancer or the development of readily available artificial organs — could have an impact.

For now, though, I guess we’ll just have to settle for going toes up in the prime of life at the age of 115.  That’s bitterly disappointing for those of us who want desperately to see the year 2100, but at least having a presumed end date of 115 will add some welcome structure to our retirement planning.

Slowing Down

I’m sorry to report that our dog Kasey seems to be slowing down.  That’s OK — it’s what happens to old dogs, and to old people, too.  But it also makes us sad.

We first noticed it because Kasey is now having trouble jumping onto couches and chairs.  In the old days, she could spring onto just about anything from a standing position.  Then, it took a running start, but she made it.  Now, she just puts her front paws on the seat and looks around beseechingly for a friendly face who might give her a lift up to one of her accustomed spots.

There are other signs as well.  She limps from time to time, and she doesn’t seem to like long walks quite as much, and she doesn’t strain at the leash like she used to.  Her head is turning white.  Her eating habits have become more erratic.  She’s more content to sit in the backyard in a cool, quiet spot.  And she’s had a few of those unfortunate “accidents” around the house.

When you notice these kinds of things, the antennae go up and you begin looking for more indications of health problems.  So far, though, we haven’t had to deal with any of those — aside from Kasey’s awful teeth, which seem to be more a product of bad care when she was little than of advancing age.

We don’t know how old Kasey is, because she was a fully grown rescue dog when we first met her at the Erie County Humane Society.  We guess that she’s 14 or so, but she’s a smaller dog, and they are supposed to live longer.  We’re hoping that’s true.

In the meantime, Kish is watching Kasey like a hawk, keeping an eye out for gimpiness or apparent bowel problems, so we can get ol’ Kase to the vet at the first sign of trouble.  Kish’s careful observation of Kasey for signs of aging is a bit unnerving, though.  Now that I’ve passed 59, I’m squarely in the zone of scrutiny, too.

Alzheimer’s Isn’t Funny

Last week there were reports that Will Ferrell was pursuing a new movie in which he would portray Ronald Reagan.  The project was pitched as a comedy set during Reagan’s second term, in which he is depicted as already in the grip of Alzheimer’s disease and an intern is charged with convincing Reagan that he is an actor portraying the President.  After an outcry about the insensitivity of the concept from Reagan’s children and others, one of Ferrell’s representatives said the actor wasn’t going to do the movie.

I get why the Reagan children reacted as they did, and I think Ferrell was wise to back away from the project.  The reality is that Alzheimer’s disease isn’t very funny.  Sure, many people who have had to deal with a family member with the disease probably have shaken their heads and had a rueful laugh about a particular episode that demonstrates how the ill person has changed — whether by repeating themselves, or by not recognizing a friend or family member, or by showing radical changes in personality as the disease ravages their brain — but it’s defensive humor, designed to help you cope with the realization that a person you know and love is falling into a black pit from which they will never emerge, and that there’s absolutely nothing you can do about it.

I’ve read several memoirs written by children who’ve cared for parents with Alzheimer’s or dementia.  When the books share a “humorous” anecdote, as they sometimes do, it’s uncomfortable reading, because the victim of the disease is inevitably the butt of the humor — because they’ve forgotten where they are, or have taken a shower with their pants on, or have used a word that they would never have said before in polite company.  It’s not really funny at all.  It’s tragic, and it’s not fair to the person whose intellect and personality and consciousness are being irreversibly stripped away, bit by bit, until only an unfamiliar shell remains.  They can’t help themselves.

I suppose a hard-bitten, cynical Hollywood agent might think a script about an intern deceiving a character in the grip of Alzheimer’s was a laugh riot, but only if that agent didn’t know anyone who had experienced the disease.  These days, there aren’t many people who fall into that category, and those who have been touched aren’t going to go watch a “comedy” that reminds them of the devastation the disease inflicted.  And if such a movie ever gets made, how many members of the audience are going to erupt in belly laughs about the lead character’s painful confusion?

My guess is that most people who watched such a movie would leave with the same fervent vow found among people who have dealt with Alzheimer’s in their families.  It goes like this: “Please don’t let me ever, ever get Alzheimer’s.”

Birthday Wishes

Today is my birthday.

It’s great to live in modern times because, among other things, it’s easier to wish people a happy birthday, in more forms and through more channels, than ever before.  I’ve received grossly inappropriate, unforgivably ageist cards from family and friends, Facebook congratulations from pals old and new, a post from UJ with a picture of us as toddlers, text message birthday greetings, and nice emails from clients and colleagues.  It’s been great to be the target of so many good wishes.

I’ve even received happy birthday emails from my optometrist, my periodontist, and the American Red Cross.  I suppose there’s a kind of message there, too.

Password Obscenity Roulette

Hackers seem to be everywhere these days, all at once.  For the IT guys among us, that means tinkering with firewalls and new defensive software and systems vulnerability checks and incident response plans and all of the other technical gibberish that makes IT guys such deadly bores at a party.  The rest of us can only groan in grim anticipation, because we know that we’re going to be asked to change our passwords . . . again.

One of the great challenges of modern life is remembering all of the different “passwords” that we must inevitably use to access our various electronic devices, internet accounts, and computer access points.  Unfortunately, we can’t use the kind of passwords Allen Ludden would recognize.  In fact, a password can’t be a properly spelled word at all.  For it to be a “strong” password, it’s got to include a weird combination of capitalized and lower-case letters, numbers substituting for letters, and random characters, like ampersands and pound signs and question marks.  The result often looks like the sanitized representation of cursing that you might see from the Sarge in a Beetle Bailey cartoon — minus only the lightning bolts.  (@#%*$^@#!)  In a way, that’s pretty appropriate.
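
If you boiled those rules down into code, they’d amount to a short checklist.  Here is a minimal Python sketch of a typical policy; the eight-character minimum and the particular character classes are my own assumptions, since every site enforces its own variation:

    import string

    def looks_strong(password: str) -> bool:
        """Rough check against a typical 'strong password' policy (assumed thresholds)."""
        if len(password) < 8:  # assumed minimum length
            return False
        has_upper = any(c.isupper() for c in password)
        has_lower = any(c.islower() for c in password)
        has_digit = any(c.isdigit() for c in password)
        has_symbol = any(c in string.punctuation for c in password)
        return has_upper and has_lower and has_digit and has_symbol

    print(looks_strong("password"))     # False: a word Allen Ludden would recognize
    print(looks_strong("suB5t!tu+ed"))  # True: mixed case, a digit, and symbols

Naturally, every character class a check like this demands is one more thing for a human to forget, which is where the trouble starts.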

Of course, all of these suB5t!tu+ed characters, plus the fact that you need different passwords for different devices and accounts, plus the fact that passwords now must be changed much more frequently, make it impossible for the average human being to remember the passwords in the first place.  How many of us sit down at a computer or pick up our tablet and idly wonder for a moment what the &*%$# the password is?  And there’s the new year/check writing phenomenon to deal with, too.  When a new year comes, how long does it take you to stop automatically writing the old year in the date, because you’d been doing that for the past 365 days?  I had to change my iPhone password several weeks ago, and I still reflexively type in the old password every time I’m prompted, until I dimly realize that I’ve changed it and it’s time to key in the new one — if I can remember it.

There’s a positive aspect to this.  We’re all getting older, and people who deal with aging say that if you want to stay mentally sharp as the joints creak and the brain cells croak, you need to play word games or solve puzzles.  Well, this generation has got that covered.  We don’t need silly games, because we’ve got frustrating passwords.