
Tuesday, September 26, 2017

Getting Out While the Getting's Good




By Dena Davis







Dr. Davis is currently at Lehigh University. She previously taught at Cleveland-Marshall College of Law (Cleveland State University) and Central Michigan University. She received her doctorate in religion from the University of Iowa and her J.D. from the University of Virginia. Her specialty is bioethics, with a particular focus on the ethics of genetic medicine and genetic research. Her latest book is Genetic Dilemmas: Reproductive Technology, Parental Choices, and Children’s Futures (2nd edition, Oxford University Press, 2010). Dr. Davis has been a Fulbright scholar in India, Italy, Israel, Indonesia, and Sweden. She serves on the Central Institutional Review Board of the National Cancer Institute and is a member of the NIH Embryonic Stem Cell Eligibility Working Group.





A number of times in the last two years I have been invited to speak about Alzheimer’s disease (AD). The venues have all been academic but have nonetheless differed widely: South Carolina and New York City; bioethicists; physicians; undergraduates; hospital staff. I always begin by inviting people to participate in a thought experiment. I tell them that I am going to describe two people and then ask them which of the two they would prefer to be. (These people are actually my parents, but I don’t tell them that.) I first describe “M,” who remains cognitively intact and lives independently until his death from an aneurysm at 87. Then I describe “F,” who died at 99 after a ten-year decline into Alzheimer’s disease. (I usually give a few details, such as when F was no longer able to live independently, when she became incontinent, and when she no longer recognized family and friends.)



At this point, I hold my breath. I am about to ask my audience to choose whether they would prefer to be M or F, but the rest of my presentation relies on the assumption that most people will choose M. What if they don’t? At a recent conference in South Carolina, I almost funked it, made nervous by my own stereotypes about the South, and also because the previous speaker had given a heart-warming presentation of elderly people with dementia responding to music and clowns. At Emory University, my talk was preceded by a tremendously appealing presentation from a gentleman with a family history of Alzheimer’s. He spoke movingly of his aunt’s life and death with the disease; dare I suggest that most of us would prefer to die before we become symptomatic?






A portrait of a man with dementia. (Image courtesy of Pixabay.)

Yet despite the different venues, the response is always the same: virtually everyone in the room would prefer to die suddenly rather than live a decade longer with dementia. I know that there is a big gap between preferring to die before dementia and taking that death into one’s own hands. Nonetheless, I know that I am not the only person who plans to “get out while the getting’s good”; I intend to attempt to end my life before dementia robs me of the ability to act.





Achieving this goal, however, requires overcoming some daunting obstacles. One of the most difficult is finding what I call the sweet spot: not ending one’s life so early that one misses out on some good years, but also not putting it off so late that one misses the window of opportunity and slides into dementia. There are conflicting accounts of whether one is still able to make and carry out a plan in the early stages of dementia, but that is a risk I am not willing to take. We do know that two of the earliest signs of dementia are loss of executive function and lack of awareness of one’s own disability. In other words, one may be unaware of one’s increasing dementia and unable to make a plan and carry it through. Here are two vignettes from the beginning years of my mother’s life with dementia, when she was still living independently, driving, paying bills, and legally capacitated:





My mother returns from an outing with another elderly woman. “I feel so sorry for Angela! She doesn’t know what’s what. She keeps repeating the same question over and over again. It’s very irritating.” My mother exhibits no awareness that she herself asks the same questions over and over again. 




My mother decides to get a cat. She mentions this to my brother and me; in fact, she mentions it every time we speak in our almost daily phone conversations. “Great idea, Mom, you should definitely get a cat. You know where the animal shelter is. Get a friend to go with you.” Next day, next week, next month: “I am going to get a cat.” Finally, I asked my son, on his next visit to his grandmother, to take the initiative and go with her to get a cat. They did, and they brought back Raven, who was a great success. My mother had a reasonable wish to do an easy and appropriate thing, well within her capacities, but there was a disconnect between her wish and her ability to act on it, almost like pushing down on the gas pedal and discovering that the cable has been cut.










Although my mother had spoken often during her life of her intention to commit suicide rather than live with dementia, she had left it too late and missed the window of opportunity. But what would have been the right time? My mother did not begin to experience dementia until she was nearly 90, so had she arbitrarily decided to end her life at 85 (a point at which half of all Americans have some form of dementia), she would have lost some good years.



Where is the sweet spot?



This problem is brilliantly portrayed in Lisa Genova’s best-selling novel, Still Alice.1 Alice is a successful academic, at the top of her game, when she is diagnosed with early-onset AD. She knows that research and teaching will soon be beyond her, but she is hoping for a few more years of enjoying her family and the mundane pleasures of an ice cream cone or a walk in the park. On the other hand, she is protective of her dignity and is determined not to end her life with a protracted decline into dementia. She crafts a strategy: she programs her smartphone to buzz her every week with a simple quiz, and when she is no longer able to respond appropriately to questions about the date or the names of her daughters, she will be directed to open a folder on her computer in which she has left a letter, written by Alice now to her later, demented self.

However, Alice fails to realize when she begins to fail the quiz, and eventually she leaves the phone in the freezer, which ruins it. But one day, aimlessly clicking through files on her computer, she finds the letter she had written to her later self. The letter opens with words of love and reassurance, and then directs Alice to go upstairs to her bedroom, find a bottle marked “Alice” at the back of her nightstand drawer, take all the pills in the bottle with a big glass of water, get into bed, and go to sleep. The letter warns Alice not to discuss this with anyone: just do it. Alice wants to comply, but as she walks up the stairs to her bedroom, she forgets her purpose. She goes downstairs again to read the letter, remembers her purpose, but forgets again as she climbs the stairs. She wishes she could print out the letter to bring it with her, but she no longer remembers how to work the printer. Eventually, she is distracted by her husband’s voice, and forgets the whole thing.







Although I doubt we will ever see a perfect solution to the “sweet spot” problem, the last decade has seen progress in a number of areas that can help individuals assess their background risk for Alzheimer’s and the imminence of its approach. First, it is now possible to use direct-to-consumer genetic testing, such as 23andMe, to test oneself for an important genetic variable that influences one’s risk of getting Alzheimer’s: APOE. Although even having two copies of the APOE e4 variant does not doom one to the disease, it substantially raises the likelihood. Those who inherit one copy of the e4 form have a three-fold higher risk of developing AD than those without the e4 form, while those who inherit two copies have an 8- to 12-fold higher risk.2 People who have had themselves tested and discovered a higher-than-average risk might wish to take further steps to monitor for any signs of the disease. Intriguingly, Alzheimer’s is now seen as a “3-stage disease,” the first stage of which occurs even before symptoms develop, perhaps decades before.3 It is increasingly possible to identify people for whom the disease process has begun but who are not yet symptomatic. Even better, it might be possible to track the progression of the disease, so as to end one’s life as close as possible to the last “good” moment.



Current efforts to diagnose AD in the presymptomatic stage are driven by two scientific goals. First, finding the disease at the earliest possible stage identifies appropriate patients for treatments that could slow or perhaps even reverse its course. This is crucial because there is a general consensus that the reason there are no effective medications for Alzheimer’s is that by the time the disease produces symptoms, it is already too late. Second, presymptomatic diagnosis is a key building block for medical research that seeks to find and test those at high risk for the disease.



Presymptomatic testing spans a wide range of approaches: neuroimaging to track volume loss and cerebral blood flow in the brain, measurement of amyloid concentrations in the cerebrospinal fluid, PET scans, blood tests, and noninvasive tests of episodic memory.4 Other possibilities include motion sensors and “smart carpets” that diagnose impending dementia from changes in gait.5 These monitoring systems are part of a general movement to use technological surveillance to help people age “successfully” at home, but there is no reason why a savvy and determined person could not make use of them to direct information only to herself.







The degree of certainty one needs before acting is, obviously, a matter for each individual to decide. Each of us strikes a different balance between more years of life and the value of not becoming demented. People have been making these kinds of judgments for years, usually in the face of uncertainty. For decades, women have been asked to weigh the risk of having a child with Down syndrome against the risk of losing a pregnancy through amniocentesis. People with cancer balance the possible benefits of various treatments (many of them experimental) against the possibility of side effects that can include cardiac damage and even secondary cancers. This is no different. Death is irreversible, but so is dementia. And once one has started down the dementia road, it is too late to turn back.





References




1. Lisa Genova, Still Alice (New York: Pocket Books, 2009).





2. Alzheimer’s Association, In Brief for Professionals: My Mother Has Alzheimer’s Disease: Am I Next? https://www.alz.org/health-care-professionals/documents/InBrief_GeneticLink.pdf (accessed August 9, 2017).





3. US Food and Drug Administration, Guidance for Industry: Alzheimer’s Disease: Developing Drugs for the Treatment of Early-Stage Disease: Draft Guidance (for comment purposes only), February 2013. http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM338287.pdf (accessed August 10, 2017).





4. Dena S. Davis, “Alzheimer disease and pre-emptive suicide,” Journal of Medical Ethics 2014;40:543-549.





5. Pam Belluck, “Footprints to Cognitive Decline and Alzheimer’s Are Seen in Gait,” New York Times, July 16, 2012.







Want to cite this post?




Davis, D. (2017). Getting Out While the Getting's Good. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2017/09/getting-out-while-gettings-good.html

