Designing And Decorating For Dementia

Many people are familiar with the concept of “child-proofing” a house. When a baby is on the way, the parents-to-be will go through their home to try to make it as baby-safe as possible. That means doing things like putting inserts into electrical outlets, moving breakable items out of reach of curious toddler hands, and locking cabinets or drawers that contain cleaning supplies, sharp items, or other things that little kids shouldn’t touch.

Now, many Americans are putting the same concepts into play in another context: caring for elderly parents or spouses who are dealing with dementia. The goal is to design and decorate your home in a way that is as safe, helpful, calming, and supportive as possible.

For example, experts in the field note that people with Alzheimer’s often experience anxiety, so decorating in soothing colors, like shades of blue, can help. Because forgetfulness and confusion are symptoms, labeling things like dresser drawers to identify the contents can help the individual feel more self-sufficient. And safety devices, like smoke alarms that can detect when a stovetop burner has been left on by a forgetful senior, are a must.

Vision and spatial orientation issues also can be a problem, so creating color contrasts will allow the person to, say, find the handle to a cabinet more quickly. Picking out plates that make it easier for a vision-challenged person to see the food is useful, too. Other ideas include adopting lighting that helps with alertness during the day and calmness at night and putting out family photos that might trigger happy memories. There also are products that use spoken-word technology designed to help people who are struggling to read.

Caring for someone who is experiencing the early stages of Alzheimer’s or other forms of dementia can be exhausting and emotionally challenging. Anything you can do to make what is inevitably a difficult process a bit easier, for both the afflicted and the caregiver, is bound to help.

The Perils Of Picking

Kids learn that they aren’t supposed to pick their noses at an early age, when their horrified mothers tell them it’s disgusting, ill-mannered behavior and they should stop doing it–right now! A recent study suggests that there is another potential reason to heed your mother’s edict: there could be a connection between damage to the internal tissues of the nostrils and dementia.

The study looked at the transmission of bacteria in mice and focused on Chlamydia pneumoniae, a form of bacteria that is common to mice and humans. That species of bacteria not only can cause pneumonia, as its name suggests, but also is found in many human brains afflicted with late-onset dementia. The study determined that the bacteria can travel quickly up the olfactory nerve that connects the mouse nasal cavity and brain to reach and infect the central nervous system and brain cells. The study also found that when the bacteria reached the mouse brains, the mice responded by creating amyloid-beta proteins to combat the infection–and amyloid-beta protein clumps are typically found in humans suffering from Alzheimer’s Disease.

Moreover, the study showed that when there is damage to the nasal epithelium, the delicate membrane at the roof of the nasal cavity that separates it from the brain, the nerve infections get worse. And that’s where nose-picking–which can damage that protective layer of tissue between the nasal cavity and the brain–enters the picture.

We have a lot to learn about the causes of dementia, and studies of mice obviously don’t necessarily translate to humans. But if it’s even remotely possible that you can reduce your chances of developing dementia by refraining from self-inflicted nostril probes, it’s yet another reason to heed your Mom’s advice and keep your fingers away from your nose.

Dementia Diagnoses

A recent metastudy conducted by the University of Michigan shows a sharp increase in the diagnosis of dementia among older adults. The study examined 3.5 million individuals over the age of 67 who died between 2004 and 2017 and specifically focused on the bills their providers had submitted to the Medicare system–and any diagnosis of dementia that was provided in connection with the bills.

The study found that, in 2004, 35 percent of the invoices submitted for specific patients contained some mention of dementia, and by 2017 that number had risen to 47 percent. A similar increase was shown when researchers limited the data to individuals where providers had submitted two or more bills that referenced dementia, with multiple mentions increasing from 25 percent of patients in 2004 to 39 percent in 2017.

So, is the condition of dementia increasing, or is there some other cause? There are two reasons to suspect that alternative causes for the increase in diagnosis may be responsible. First, Medicare billing practices changed between 2004 and 2017 to allow providers to identify more diagnoses on their requests for payment, and second, during that period there was increased emphasis on dementia and its treatment, including the adoption of the National Plan to address Alzheimer’s Disease. With billing practices allowing for more diagnoses and heightened sensitivity to signs of dementia, it is not surprising that the number of diagnostic mentions has gone up.

Whatever the cause for the increase in formal diagnosis, it’s clear that many elderly Americans suffer from at least some symptoms of dementia–and I also suspect that people are a lot more open about it than used to be the case. The U of M metastudy showing the prevalence of this dreaded condition may be of comfort to the family members who have a loved one who is sinking into the depths of dementia: they are not alone.

Instant Recall

Let’s say that Key to the Highway by Derek and the Dominos is one of your favorite songs, as it is one of mine.  How long would it take you to hear the first few notes and recognize that it’s being played on the radio?

According to some recent research, the answer is a mere 0.1 to 0.3 seconds.  That’s virtually instantaneous.

The research focused on pupil dilation and certain brain activity that was triggered by hearing a favorite, familiar song and compared it to the reaction to listening to unfamiliar tunes. The study determined that hearing even a fraction of a second of a favorite song caused pupil dilation and brain activity related to memory retrieval — which would then cause you to immediately remember every note and every lyric.  One of the researchers noted that “[t]hese findings point to very fast temporal circuitry and are consistent with the deep hold that highly familiar pieces of music have on our memory.”

Why do researchers care about the brain’s reaction to familiar music?  Because the deeply engrained neural pathways that are associated with music might be a way to reach, and ultimately treat, dementia patients who are losing other forms of brain function.

The human brain is a pretty amazing thing, and its immediate recall of music is one compelling aspect of its functioning.  But here’s the thing the researchers didn’t consider:  immediate recall isn’t limited to favorite music.  In fact, it’s provoked by familiar music, whether it’s a tune you’d happily binge listen to or whether it’s a piece of music that you wish you could carve out of your synapses.  If I mention the Green Acres theme song, and you then think of the first few guitar notes for that song, I guarantee that every bit of the song will promptly come to mind, whether you want it to or not.  (Sorry about that!)  And isn’t it a bit disturbing to think that, if you eventually lose your marbles some day far in the future, one of the last things to go will be the tale of the Douglases and their “land, spreading out so far and wide”?

Rethinking Alzheimer’s

Alzheimer’s disease has been a known condition since it was discovered, in 1906, by the German doctor Alois Alzheimer, and it has been the focus of lots of attention and research for decades.  It ranks as one of the top causes of death in the United States and is the third leading cause of death among people 60 and older, just behind heart disease and cancer.

So, after more than a hundred years, why haven’t we figured out how to treat this dread and deadly disease that robs people of their minds and personalities and leaves them empty shells of their former selves?  Why, for example, have doctors and drug companies been able to develop effective treatments for HIV and AIDS, but not Alzheimer’s?

It’s not that the scientific and medical community isn’t trying — but identifying the real cause of Alzheimer’s, and then devising a meaningful treatment, is proving to be an incredibly elusive challenge.  A brain with Alzheimer’s is like a car crash with no witnesses, where the accident reconstruction expert tries to find clues from the physical evidence.  Do those skid marks indicate that the driver was going too fast, or do they suggest that the driver was distracted, or was the driver paying attention when something like a deer unexpectedly came onto the road?  In the case of Alzheimer’s the brain is mangled and distorted and physically changed, both chemically and structurally.  Are those changes what caused the disease, or are they mere byproducts of the active agent that does the real harm?

For more than a quarter century, Alzheimer’s researchers and drug companies have been focusing on the “amyloid hypothesis,” which posits that an increase in amyloid deposits causes the disease, and have worked to develop drugs to target amyloid.  The hypothesis was devised because Alzheimer’s patients have an unusual buildup of amyloid in their brains, amyloid buildups have been found to be harmful in other bodily organs, and people with a genetic history of Alzheimer’s in their families also have been found to have mutations in the genes responsible for amyloid production.  With this kind of evidence, it’s not surprising that amyloid production has been the focus of treatment efforts.

Unfortunately, though, the trials of drugs that address amyloid production haven’t been successful — and after repeated failures, some scientists are wondering whether the amyloid hypothesis should be scrapped, and the disease should be examined afresh.  The amyloid hypothesis remains the prevailing view, but a minority of researchers think that the focus on amyloid buildup is like trying to close the barn door after the livestock have already escaped.  And they wonder whether the amyloid hypothesis has become so entrenched among the many people who have invested time and money in developing amyloid-based treatments that work on alternative approaches is being undercut.

It’s a classic test for the scientific method.  Over the years, there have been countless instances where prevailing views on medical, or physical, problems were overturned in favor of new approaches that turned out to accurately identify cause and effect.  The scientific method is supposed to objectively find the right answers.  For Alzheimer’s disease, maybe it is just a matter of tweaking how to develop the right treatment for the amyloid build-up — or maybe it’s something else entirely.

Those of us who have dealt with Alzheimer’s in our families hope that the scientific and medical community will put aside preconceived notions, dispassionately assess the evidence, and explore every avenue for developing a successful treatment.  This disease is just too devastating to go unaddressed.

Alzheimer’s Isn’t Funny

Last week there were reports that Will Ferrell was pursuing a new movie in which he would portray Ronald Reagan.  The project was pitched as a comedy set during Reagan’s second term, in which he is depicted as already in the grip of Alzheimer’s disease and an intern is charged with convincing Reagan that he is an actor portraying the President.  After an outcry about the insensitivity of the concept from Reagan’s children and others, one of Ferrell’s representatives said the actor wasn’t going to do the movie.

I get why the Reagan children reacted as they did, and I think Ferrell was wise to back away from the project.  The reality is that Alzheimer’s disease really isn’t very funny.  Sure, many people who have had to deal with a family member with the disease probably have shaken their heads and had a rueful laugh about a particular episode that demonstrates how the ill person has changed — whether by repeating themselves, or by not knowing a friend or family member, or by showing radical changes to their personality as the disease ravages their brain — but it’s defensive humor, designed to help you cope with the realization that a person you know and love is falling into a black pit from which they will never emerge, and there’s absolutely nothing you can do about it.

I’ve read several memoirs written by children who’ve cared for parents with Alzheimer’s or dementia.  When the books share a “humorous” anecdote, as they sometimes do, it’s uncomfortable reading because the victim of the disease is inevitably the butt of the humor — because they’ve forgotten where they are, or have taken a shower with their pants on, or have used a word that they would never have said before in polite company.  It’s not really funny at all.  It’s tragic, and it’s not fair to the person whose intellect and personality and consciousness are being irreversibly stripped away, bit by bit, until only an unfamiliar shell remains.  They can’t help themselves.

I suppose a hard-bitten, cynical Hollywood agent might think a script about an intern deceiving a character in the grip of Alzheimer’s was a laugh riot, but only if that agent didn’t know anyone who had experienced the disease.  These days, there aren’t many people who fall into that category, and those who have been touched aren’t going to go watch a “comedy” that reminds them of the devastation the disease inflicted.  And if such a movie ever gets made, how many members of the audience are going to erupt in belly laughs about the lead character’s painful confusion?

My guess is that most people who watched such a movie would leave with the same fervent vow found among people who have dealt with Alzheimer’s in their families.  It goes like this: “Please don’t let me ever, ever get Alzheimer’s.”

Ordinary Forgetfulness, Or Alzheimer’s?

The Neal side of our family, unfortunately, has a history of dementia and Alzheimer’s disease that has been growing lately.  Mom and Grandma Neal had dementia, Uncle Gilbert had Alzheimer’s, and my great-aunt, whom another relative described as “crazy as a bedbug” when I was a kid, had mental problems so debilitating that she was put into a care facility at about the time she reached retirement age.

When you’ve got such a history in the family, and have seen what these terrible degenerative brain diseases can do to bright, kind, loving people, you can’t help but wonder if there is a gene lurking somewhere in your DNA mix that will ultimately turn you down that same dark street.  And you also pause at every instance of forgetfulness and ask yourself whether it is a sign that the dreaded downhill slide has begun.

It’s important to remember that an infallible memory is not part of the normal human condition.  With the richness of daily experience flooding our brains with new memories during every waking moment, it’s entirely normal to not remember every incident or person from the past with perfect clarity.  And the memory failure that most frequently causes people to question whether they’re losing it — the mental block that leaves you temporarily unable to recall a name, or a word — is commonplace in healthy, average humans.  Other normal issues include the tendency to forget facts or events over time, absent-mindedness, and having a memory influenced by bias, experiences or mood.

Fortunately, too, there are tests that can help doctors distinguish between these ordinary conditions and the onset of dementia or Alzheimer’s.  The tests range from simple screening tests of cognitive functioning that can be given by a family doctor as part of an annual exam and completed in a few minutes to intense and extensive neuropsychological examinations that involve multiple days of evaluation.

The existence of such tests raises an interesting question.  Aging Americans are routinely poked, prodded, and scanned for heart disease, cancers and other bodily ailments.  Even though, for many of us, the prospect of being diagnosed with Alzheimer’s is as dreaded as any finding of a debilitating physical disease, there seems to be less of a focus on early detection and treatment of degenerative mental diseases.  With recent studies showing that significant percentages of older Americans are afflicted with dementia, shouldn’t that approach change?  Why shouldn’t a short cognitive screening test be as much a part of the annual physical as the rubber-gloved prostate probe?

Dying With Dementia

Several recent studies about dementia among America’s aged are profoundly disturbing — especially for those of us who aspire to live to a ripe old age.

One study, by the Alzheimer’s Association, concludes that one in three elderly Americans dies with Alzheimer’s disease or some other form of dementia.  The dementia does not necessarily directly cause death, but it does contribute to an earlier demise because the senior forgets to take her medication, or is unable to recognize symptoms that should lead to prompt treatment.  Another study, led by an economist from the RAND Corporation, concludes that 15 percent of Americans over age 71 — about 3.8 million people — have dementia, and that number will increase to 9.1 million by 2040.  The study also found that the direct health care costs for dementia patients, at nursing homes and other care facilities, total $109 billion, and the costs of care also are expected to increase dramatically.

As a society, we must worry about how we are going to pay for such care, but as individuals we worry about becoming one of those statistics.  If you’ve been around someone with dementia, you realize it is an awful way to go.  So many of the afflicted appear to be perpetually frightened, or angry, or both.  They don’t recognize family members, or understand when people are trying to help them.  The disease works terrible, fundamental changes to their personalities and characters, turning the quick-minded former executive into a simpleton or the happy, encouraging aunt into a bitter font of hateful, deeply wounding comments.

So much of life’s joy and richness comes from our interaction with spouses, children, and loved ones; what must it be like to be stripped of those pleasures, left to cope with strangers with only a dim understanding of who you are and why you are there?  It’s a depressing, terrifying prospect.