Designing And Decorating For Dementia

Many people are familiar with the concept of “child-proofing” a house. When a baby is on the way, the parents-to-be will go through their home to try to make it as baby-safe as possible. That means doing things like putting inserts into electrical outlets, moving breakable items out of reach of curious toddler hands, and locking cabinets or drawers that contain cleaning supplies, sharp items, or other things that little kids shouldn’t touch.

Now, many Americans are putting the same concepts into play in another context: caring for elderly parents or spouses who are dealing with dementia. The goal is to design and decorate your home in a way that is as safe, helpful, calming, and supportive as possible.

For example, experts in the field note that people with Alzheimer’s often experience anxiety, so decorating in soothing colors, like shades of blue, can help. Because forgetfulness and confusion are symptoms, labeling things like dresser drawers to identify the contents can help the individual feel more self-sufficient. And safety devices, like alarms that can detect when a stovetop burner has been left on by a forgetful senior, are a must.

Vision and spatial orientation issues also can be a problem, so creating color contrasts will allow the person to, say, find the handle to a cabinet more quickly. Picking out plates that make it easier for a vision-challenged person to see the food is useful, too. Other ideas include adopting lighting that helps with alertness during the day and calmness at night, and putting out family photos that might trigger happy memories. There also are products that use spoken-word technology designed to help people who are struggling to read.

Caring for someone who is experiencing the early stages of Alzheimer’s or other forms of dementia can be exhausting and emotionally challenging. Anything you can do to make what is inevitably a difficult process a bit easier, for both the afflicted and the caregiver, is bound to help.

The Perils Of Picking

Kids learn that they aren’t supposed to pick their noses at an early age, when their horrified mothers tell them it’s disgusting, ill-mannered behavior and they should stop doing it–right now! A recent study suggests that there is another potential reason to heed your mother’s edict: there could be a connection between damage to the internal tissues of the nostrils and dementia.

The study looked at the transmission of bacteria in mice and focused on Chlamydia pneumoniae, a form of bacteria that is common to mice and humans. That species of bacteria not only can cause pneumonia, as its name suggests, but is also found in many human brains afflicted with late-onset dementia. The study determined that the bacteria can travel quickly up the olfactory nerve that connects the mouse nasal cavity and brain to reach and infect the central nervous system and brain cells. The study also found that when the bacteria reached the mouse brains, the mice responded by creating amyloid-beta proteins to combat the infection–and amyloid-beta protein clumps are typically found in humans suffering from Alzheimer’s disease.

Moreover, the study showed that when there is damage to the nasal epithelium, the delicate membrane at the roof of the nasal cavity that separates it from the brain, the nerve infections get worse. And that’s where nose-picking–which can damage that protective layer of tissue between the nasal cavity and the brain–enters the picture.

We have a lot to learn about the causes of dementia, and studies of mice obviously don’t necessarily translate to humans. But if it’s even remotely possible that you can reduce your chances of developing dementia by refraining from self-inflicted nostril probes, it’s yet another reason to heed your mom’s advice and keep your fingers away from your nose.

Dementia Diagnoses

A recent metastudy conducted by the University of Michigan shows a sharp increase in the diagnosis of dementia among older adults. The study examined 3.5 million individuals over the age of 67 who died between 2004 and 2017 and specifically focused on the bills their providers had submitted to the Medicare system–and any diagnosis of dementia that was provided in connection with the bills.

The study found that, in 2004, 35 percent of those patients had at least one bill that mentioned dementia, and by 2017 that number had risen to 47 percent. A similar increase was shown when researchers limited the data to individuals whose providers had submitted two or more bills that referenced dementia, with multiple mentions increasing from 25 percent of patients in 2004 to 39 percent in 2017.

So, is dementia itself becoming more common, or is something else driving the numbers? There are two reasons to suspect that alternative causes for the increase in diagnoses may be responsible. First, Medicare billing practices changed between 2004 and 2017 to allow providers to identify more diagnoses on their requests for payment. Second, over that period there was increased emphasis on dementia and its treatment, including the adoption of the National Plan to Address Alzheimer’s Disease. With billing practices allowing for more diagnoses and heightened sensitivity to signs of dementia, it is not surprising that the number of diagnostic mentions has gone up.

Whatever the cause of the increase in formal diagnoses, it’s clear that many elderly Americans suffer from at least some symptoms of dementia–and I also suspect that people are a lot more open about it than used to be the case. The U of M metastudy showing the prevalence of this dreaded condition may be of some comfort to family members watching a loved one sink into the depths of dementia: they are not alone.

Countering The Cabal

One of the admirable things about modern science is its inherent skepticism.  Scientists are supposed to be constantly challenging accepted ideas, developing hypotheses, and designing experiments to try to disprove the hypotheses — all in the name of gathering data, advancing our knowledge and developing new ways to analyze or address problems.  Whether it is physics, or biology, or the treatment of disease, the “scientific method” has reliably produced enormous gains in our understanding and huge advances in numerous fields.

But what if scientists stopped behaving as skeptical scientists?  What if, instead, scientists came to believe so deeply in a particular theory that they became zealous advocates for that theory — almost as if they were adherents to a religious belief, rather than dispassionate, objective scientists?

That’s the sad story that this article tells about research into Alzheimer’s disease, which affects nearly 6 million Americans and one in 10 people 65 and older.  Unlike other areas of medical research where great strides have been made — think of the rapid developments in the treatment of HIV and AIDS, for example — research into Alzheimer’s disease has not produced much progress.  Some of that may be attributable to the fact that the human brain is complicated, but many observers now say the absence of significant gains is due, at least in part, to what they call “the cabal”:  a group of influential researchers and related individuals who believed so fervently in a particular theory about Alzheimer’s that they thwarted research into other approaches to the disease.

The particular theory is that a substance called beta-amyloid accumulates in the brain, creating neuron-killing clumps that cause Alzheimer’s.  It quickly became so accepted in the Alzheimer’s world that scientists, venture capitalists, scientific journals, and research funding entities wouldn’t support or publish work on alternative theories — even though exploring alternatives is exactly what the scientific method teaches.  One observer quoted in the article linked above said:  “Things shifted from a scientific inquiry into an almost religious belief system, where people stopped being skeptical or even questioning.”  That’s a pretty chilling indictment, because it’s directly contrary to what is actually supposed to happen.

Notwithstanding the impact of the claimed “cabal,” some alternative hypotheses that appear to be promising have been developed, and some small trials of potential treatments have occurred.  Still, it’s clear that not much progress has been made in treating dementia over the past few decades, and many people now believe that the near-universal acceptance of the beta-amyloid theory is at least partly to blame.  It’s a disturbing, cautionary tale about the bad things that can happen when scientists stop acting like scientists.

Alzheimer’s Isn’t Funny

Last week there were reports that Will Ferrell was pursuing a new movie in which he would portray Ronald Reagan.  The project was pitched as a comedy set during Reagan’s second term, in which he is depicted as already in the grip of Alzheimer’s disease and an intern is charged with convincing Reagan that he is an actor portraying the President.  After an outcry about the insensitivity of the concept from Reagan’s children and others, one of Ferrell’s representatives said the actor wasn’t going to do the movie.

I get why the Reagan children reacted as they did, and I think Ferrell was wise to back away from the project.  The reality is that Alzheimer’s disease really isn’t very funny.  Sure, many people who have had to deal with a family member with the disease probably have shaken their heads and had a rueful laugh about a particular episode that demonstrates how the ill person has changed — whether by repeating themselves, or by not knowing a friend or family member, or by showing radical changes to their personality as the disease ravages their brain.  But it’s defensive humor, designed to help you cope with the realization that a person you know and love is falling into a black pit from which they will never emerge, and there’s absolutely nothing you can do about it.

I’ve read several memoirs written by children who’ve cared for parents with Alzheimer’s or dementia.  When the books share a “humorous” anecdote, as they sometimes do, it’s uncomfortable reading because the victim of the disease is inevitably the butt of the humor — because they’ve forgotten where they are, or have taken a shower with their pants on, or have used a word that they would never have said before in polite company.  It’s not really funny at all.  It’s tragic, and it’s not fair to the person whose intellect and personality and consciousness are being irreversibly stripped away, bit by bit, until only an unfamiliar shell remains.  They can’t help themselves.

I suppose a hard-bitten, cynical Hollywood agent might think a script about an intern deceiving a character in the grip of Alzheimer’s was a laugh riot, but only if that agent didn’t know anyone who had experienced the disease.  These days, there aren’t many people who fall into that category, and those who have been touched aren’t going to go watch a “comedy” that reminds them of the devastation the disease inflicted.  And if such a movie ever gets made, how many members of the audience are going to erupt in belly laughs about the lead character’s painful confusion?

My guess is that most people who watched such a movie would leave with the same fervent vow found among people who have dealt with Alzheimer’s in their families.  It goes like this: “Please don’t let me ever, ever get Alzheimer’s.”