By Jessie Ginsberg
Jessie Ginsberg is a second-year student in the Master of Arts in Bioethics program and a third-year law student at Emory University.
A father stood at the door of his local Minneapolis Target, fuming and demanding to speak to the store manager. Waving coupons for maternity clothes and nursing furniture at the manager, the father exclaimed, “My daughter got this in the mail! She’s still in high school, and you’re sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant?”
Target was not trying to get her pregnant. Unbeknownst to the father, his daughter was due in August.
In his February 16, 2012 New York Times article, “How Companies Learn Your Secrets,” Charles Duhigg reported on this Minneapolis father and daughter and on how companies like Target use marketing analytics teams to build algorithms that anticipate consumers’ current and future needs. Drawing on data from prior purchases, coupon use, submitted surveys, opened Target emails, and demographics, a team of analysts renders each consumer’s decision patterns into neatly packaged data sets tailored to predict future buying choices.
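Duhigg reported that Target’s analysts assigned shoppers a “pregnancy prediction” score based on purchases of a couple dozen telltale products. The real model is proprietary, so the toy Python sketch below, with invented products and weights, is only meant to illustrate the general shape of this kind of purchase-history scoring:

```python
# Toy illustration of purchase-history scoring. The product list and
# weights are invented for this example; Target's actual model is
# proprietary and far more sophisticated.

# Hypothetical weights: how strongly each purchase signals the outcome.
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.30,
    "calcium supplement": 0.25,
    "zinc supplement": 0.20,
    "oversized handbag": 0.15,
    "cotton balls (bulk)": 0.10,
}

def pregnancy_score(purchase_history):
    """Sum the weights of signal products found in a purchase history,
    capped at 1.0. A real system would use a fitted statistical model,
    not a hand-tuned lookup table."""
    score = sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchase_history)
    return min(score, 1.0)

history = ["unscented lotion", "calcium supplement", "paper towels"]
print(f"score: {pregnancy_score(history):.2f}")  # score: 0.55
```

The unsettling part is not any single purchase but the pattern: each item is innocuous on its own, and only the aggregate becomes revealing.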
Flash forward to 2017, a time when online stores like Amazon dominate the market and cell phones are reservoirs of personal information, storing intimate details ranging from your location to your desired body weight to your mood. Data analysis algorithms, meanwhile, are more sophisticated than ever, gobbling up volumes of information to generate highly specific profiles of current and potential consumers. Feeding an algorithm everything from social media activity to Internet searches to data collected by smartphone applications unlocks a goldmine of sensitive information: the proclivities, thought processes, self-perception, habits, emotional state, political affiliations, obligations, health status, and triggers of each consumer. We must then ask ourselves: in the age of Big Data, can we expect mental privacy? That is, in a society replete with widespread data collection about individuals, what safeguards protect the use and analysis of information gathered from our virtual presence?
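To make that aggregation concrete, consider a minimal, purely hypothetical sketch of how signals harvested from separate sources might be merged into a single consumer profile. Every data source, field name, and value below is invented for illustration:

```python
# Purely hypothetical sketch of cross-source profile assembly.
# All field names and example values are invented.
from dataclasses import dataclass, field

@dataclass
class ConsumerProfile:
    user_id: str
    interests: set = field(default_factory=set)         # from social media
    search_topics: set = field(default_factory=set)     # from search history
    health_signals: dict = field(default_factory=dict)  # from phone apps

def merge_sources(user_id, social, searches, app_data):
    """Fold independently collected streams into one profile."""
    profile = ConsumerProfile(user_id)
    profile.interests.update(social)
    profile.search_topics.update(searches)
    profile.health_signals.update(app_data)
    return profile

profile = merge_sources(
    "user-123",
    social={"running", "parenting forums"},
    searches={"morning sickness remedies"},
    app_data={"target_weight_lbs": 140, "avg_mood": "anxious"},
)
print(profile)
```

Even in this toy, the privacy worry is visible: no single stream looks especially sensitive, but the merged profile does.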
In addition to the information deliberately submitted to our phones and computers, we must also worry about the data we unwittingly supply. Take, for example, the brain training program Lumosity. Over the past 10 years, this website has lured over 70 million subscribers with promises that its product will “bring better brain health,” delay conditions like Alzheimer’s and dementia, and help players “learn faster” and “be sharper.” Though Lumosity and similar companies like LearningRx were sued by the Federal Trade Commission for deceptive advertising and must now offer a disclaimer about the lack of scientific support for their products, has the damage already been done?
Image courtesy of Pixabay.
More troubling than a brain training company’s use of unsubstantiated claims to tap into consumer fears of losing mental acuity for financial gain is that the information collected by these programs may serve as yet another puzzle piece for big data firms. Not only can applications and search engine histories provide a robust portfolio of what an individual consciously purchases and searches; brain training websites can provide deeper insights into how individuals reason and analyze information. In their article “Internet-Based Brain Training Games, Citizen Scientists, and Big Data: Ethical Issues in Unprecedented Virtual Territories,” Dr. Purcell and Dr. Rommelfanger express this concern: brain training program (BTP) data “are being interpreted as current demonstrations of existing behaviors and predispositions, and not just correlations or future predictions of human cognitive capacity and performance. Yet, the vulnerability of cognitive performance data collected from BTPs has been overlooked, and we believe the rapid consumption of such games warrants a sense of immediacy to safeguarding these data” (Purcell & Rommelfanger 2015, 357). The article goes on to question how data collected through brain training programs will be “secured, interpreted, and used in the near and long term given evolving security threats and rapidly advancing methods of data analysis” (Purcell & Rommelfanger 2015, 357).
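Purcell and Rommelfanger’s point is that gameplay itself is behavioral data. As a purely hypothetical illustration (no real BTP’s telemetry schema is assumed here), a few lines of Python show how raw per-trial game logs could be distilled into cognitive performance features of exactly the kind that would interest a data broker:

```python
# Hypothetical reduction of brain-training telemetry to cognitive
# features. The log format and feature names are invented for
# illustration; they are not any real product's schema.
from statistics import mean, stdev

# One record per game trial: reaction time (ms) and whether correct.
trials = [
    {"reaction_ms": 412, "correct": True},
    {"reaction_ms": 530, "correct": False},
    {"reaction_ms": 388, "correct": True},
    {"reaction_ms": 605, "correct": True},
]

features = {
    "mean_reaction_ms": mean(t["reaction_ms"] for t in trials),
    "reaction_variability": stdev(t["reaction_ms"] for t in trials),
    "accuracy": sum(t["correct"] for t in trials) / len(trials),
}
print(features)
```

Unlike a shopping cart, these features describe not what a person buys but how quickly and reliably they think, which is precisely why the authors argue such data deserve special safeguards.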
Even more worrisome is the lack of protections currently afforded to those who turn to websites and phone applications for help coping with mental health issues. According to a 2014 article, “Mental Health Apps: Innovations, Risks and Ethical Considerations,” research shows that a majority of young adults with mental health problems do not seek professional help, despite the existence of effective psychological and pharmacological treatments (Giota & Kleftaras 2014, 20). Instead, many of these individuals turn to mental health websites and phone applications, which “are youth-friendly, easily accessible and flexible to use” (Giota & Kleftaras 2014, 20). Applications such as Mobile Therapy and MyCompass collect and monitor data ranging from lifestyle information, such as food consumption, exercise, and eating habits, to mood, energy levels, and requests for psychological treatments to reduce anxiety, depression, and stress (Proudfoot et al. 2013). Alarmingly, users of these programs are not guaranteed absolute protection from the developers: current legal mechanisms in the United States do not fully prevent developers from selling personal health information submitted to apps to third-party marketers and advertisers.
Justice Allen E. Broussard of the Supreme Court of California declared in a 1986 opinion, “If there is a quintessential zone of human privacy it is the mind” (Long Beach City Emps. Ass'n. v. City of Long Beach). Indeed, with the advent of cell phones, the widespread use of the Internet, data analysts, and complex algorithms that predict future behavior, our claim to privacy is waning. Until laws and regulations are designed to protect information collected from phone applications and Internet use, it is crucial that consumers become fully aware of just how much of themselves they share when engaging in Internet and phone activity.
References
Giota, K.G. and Kleftaras, G. 2014. Mental Health Apps: Innovations, Risks and Ethical Considerations. E-Health Telecommunication Systems and Networks, 3, 19-23.
Long Beach City Emps. Ass'n. v. City of Long Beach, 719 P.2d 660, 663 (Cal. 1986).
Proudfoot, J., Clarke, J., Birch, M.R., Whitton, A.E., Parker, G., Manicavasagar, V., et al. 2013. Impact of a Mobile Phone and Web Program on Symptom and Functional Outcomes for People with Mild-to-Moderate Depression, Anxiety and Stress: A Randomised Controlled Trial. BMC Psychiatry, 13, 312.
Purcell, R.H. and Rommelfanger, K.S. 2015. Internet-Based Brain Training Games, Citizen Scientists, and Big Data: Ethical Issues in Unprecedented Virtual Territories. Neuron, 86(2), 356-359.
Want to cite this post?
Ginsberg, J. (2017). Mental Privacy in the Age of Big Data. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2017/06/mental-privacy-in-age-of-big-data.html