Autonomy on Endless Trial

The Promise and Perils of Predictive Testing for Alzheimer's Disease

May 24, 2016 • 39:05

What if your brain showed signs of Alzheimer's disease decades before any symptoms occurred? Would you want to know?

Alzheimer’s disease, which “can’t be slowed, stopped, or prevented,” is the sixth leading cause of death in the United States, according to the Alzheimer’s Association. Among older Americans, Alzheimer’s is more feared than any other disease, including cancer, stroke, heart disease, and diabetes. Research suggests that the molecular changes of Alzheimer’s disease and other dementias may occur up to two decades before symptoms appear.

Clinicians can clarify diagnoses of Alzheimer’s disease using molecular “biomarkers” detected through techniques such as lumbar punctures or molecular brain scans. Someday, these techniques could be applied to cognitively normal people to predict whether they’ll develop the disease. But should they?

Professional societies have cautioned against this use, given the lack of proven treatments to prevent Alzheimer’s disease in cognitively normal individuals who test positive. Many of us would value knowing this health information, either in its own right or to help us plan for our futures. But society hasn’t caught up to living with a brain at risk. Laws prohibit employment and insurance discrimination based on our genetic information; however, these laws don’t apply to molecular biomarkers. Those who seek predictive testing may also face serious unintended consequences from receiving this information.