By Somnath Das
Somnath Das recently graduated from Emory University, where he majored in Neuroscience and Chemistry. He will be attending medical school at Thomas Jefferson University starting in the Fall of 2017. The son of two Indian immigrants, he developed an interest in healthcare after observing how his extended family turned to India's healthcare system for relief from chronic illnesses. Somnath's interest in medicine currently focuses on understanding the social construction of health and healthcare delivery. Studying Neuroethics has allowed him to combine his love for neuroscience, his interest in medicine, and his wish to help others into a multidisciplinary, rewarding practice of scholarship that to this day enriches how he views both developing neurotechnologies and the world around him.
----
Humans in the 21st century have an intimate relationship with technology. Much of our lives are spent being informed and entertained by screens. Technological advancements in science and medicine have helped and healed in ways we previously couldn’t dream of. But what unanticipated consequences may be lurking behind our rapid expansion into new technological territory? This question is continually being explored in the British sci-fi TV series Black Mirror, which provides a glimpse into the not-so-distant future and warns us to be mindful of how we treat our technology and how it can affect us in return. This piece is part of a series of posts that will discuss ethical issues surrounding neuro-technologies featured in the show and will compare how similar technologies are impacting us in the real world.
*SPOILER ALERT* - The following contains plot spoilers for the Netflix television series Black Mirror.
Medical scientists and researchers have pioneered a plethora of technologies that seek to prolong the lives of patients living with disease or disability. For example, brain-computer interfaces are actively being investigated in multiple therapeutic contexts and could fundamentally alter the lives of patients with various disabilities. But what if you could bring someone back from the dead by replicating their brain? The series Black Mirror raises this critical neuroethical issue in the episode “Be Right Back” by depicting the creation of artificial androids that use data to mirror the minds and behaviors of the recently deceased.
Plot Summary
Image courtesy of Wikimedia Commons.
“Be Right Back” centers on Martha and Ash, a young couple whose lives are fundamentally changed shortly after their move to a new house. Ash is killed in a tragic accident on his way to return a rental vehicle, upending the life of his partner, Martha, who is left largely alone in their remote community. Martha also discovers that she is pregnant, further complicating her newly lonely life. Martha struggles to cope with her lover's death, but eventually finds some consolation in an online service that uses Ash's messaging history to create a chat “bot” with which she can communicate. As time passes, Martha submits more of Ash's data, including videos and photos, so that the bot can mimic Ash's voice and speak with her on the phone. Soon, Martha's emotional solace in the bot evolves into an emotional dependency, exemplified when she damages her phone and suffers a panic attack. The creators of the bot service then inform Martha that she can participate in an experimental phase of their service: the creation of an artificial android that looks just like Ash and uses his data to communicate with her. She consents, agreeing to provide data and help the company create the android Ash. The android then comes to live with Martha, and, at first, life with her newfound companion is very enjoyable. Over time, however, she realizes that the android cannot fully replace Ash, and she grows increasingly horrified when it displays behavioral anomalies that only remind her of what she has lost.
The technology in Black Mirror is therefore incomplete. The android Ash lacks the emotional and empathetic capacities (or, simply put, the human je ne sais quoi) that Ash possessed, or rather acquired, in real life. The show also subtly raises the issue of informed consent: if Ash were alive, would he have consented to this technology? Finally, the technology itself contains a crucial gap, one likely made in the name of convenience. To fully replicate behavior, software must first understand the structure of the brain and how that structure influences and is influenced by its environment. In seeking simply to mirror Ash, the show's technology thus commits a crucial error by attending to Ash's behavior rather than his brain, a flaw that makes it so horrifying to viewers.
The State of Current Technology
Whole Brain Emulation
Image courtesy of Vimeo.
Black Mirror’s version of the future proposes a technology that mirrors behavior, whereas some transhumanists and futurists advocate the development of technologies that either upload or replicate the structure of the brain itself. This “digital immortality” aims to preserve the human mind indefinitely, allowing for a type of communication that could, in theory, surpass the technology of Black Mirror. Other futurists have proposed building software that runs as if it were a human brain, thereby emulating the brain without having to convert its structure into data. In “Whole Brain Emulation: A Roadmap,” Sandberg and Bostrom (2008) contend that the technology required for whole brain emulation (WBE) of a human brain is within our reach, given the advances of computational neuroscience in understanding the inner neuronal workings of animal nervous systems. They contend that the possibilities for WBE range from its research “being the logical endpoint of computational neuroscience’s ability to accurately model [the brain]” to its potential to test various philosophical constructs of identity. Sandberg and Bostrom (2008) also contend that WBE is a necessary step toward achieving mind and, subsequently, personality emulation. Should they turn out to be correct, then perhaps the fault with the technology in Black Mirror was that it was proposed too soon. It is difficult to predict when the technology proposed by Sandberg and Bostrom could come to fruition, but their own estimates, assuming supercomputers are used for simulation and a high level of funding, place complete development within the next century. The technology of Black Mirror may therefore seem more feasible because its simulation is based on behavioral, rather than brain, data.
Animal rights activists hope that as more complex WBE emulations are developed, the software will eventually be able to take the place of research on the brains of live animals. Dudai and Evers (2014) postulate that, in theory, simulation software could further enhance brain-machine interfaces, “for example by replacing missing dopaminergic input in Parkinsonian patients or by replacing visual input in the blind.” To produce a true substitute, however, neuroscientists will likely have to test increasingly complex organisms to fine-tune the software's emulation of complex brain processes, which poses animal protection and welfare concerns (Sandberg 2014). Perhaps the most concerning aspect of WBE is its assumption that we currently possess fully comprehensive knowledge of the brain and the components that influence its functions. Undiscovered neurotransmitters, brain nuclei, and neurohormones could further complicate the ability of WBE software to perform its functions accurately.
Sandberg and Bostrom themselves contend that these technologies could constitute a form of human enhancement. Thus, the critical issue of access poses an ethical concern: how do we choose whose brains get emulated first? It is likely that this technology will be very expensive if made available to consumers; however, who will serve as the initial “guinea pig”? An additional question is the emulation's moral status. Sandberg (2014) highlights that if an emulation were to possess all of the brain's capabilities, including consciousness, then it may be afforded specific rights (as per the Cambridge Declaration on Consciousness). The emulation's conscious abilities, however, depend entirely on both the accuracy with which neuroscientists simulate brain structure and function and the nature of how consciousness itself arises (Dudai & Evers 2014). Should the emulation be conscious of its environment and truly behave like a human, then it may display human-like attributes such as awareness of its captive state. If we were to manipulate the emulation's free will in order to contain it, would that manipulation constitute a crime akin to imprisonment?
Head Transplantation
Image courtesy of Wikipedia.
While WBE remains a technology of the distant future, neurosurgeons today are aiming to preserve the brains of individuals via head transplantation. Dr. Sergio Canavero claims that the first head transplantation procedure will occur this year. The Italian neurosurgeon bases his claims on a previous experiment by Dr. Robert White, who transplanted the head of one rhesus monkey onto the body of another. The monkey survived for eight days “without complications,” although details about its ability to feel pain remain unclear. Canavero has published papers describing proof-of-concept experiments. In a 2016 correspondence to Surgical Neurology International, he details both the severing and reattachment of a canine spinal cord, claiming that polyethylene glycol (PEG), which he plans on using in his head transplant surgeries, can fuse neuronal cell membranes following a sharp cut to the spinal cord. In 2017, Canavero and his colleague Xiao-Ping Ren piloted the creation of bicephalic (a single body bearing two heads) Wistar rats. These rats survived for about 36 hours on average.
Dr. Canavero has repeatedly used news and media outlets, including a TED talk, to promote his work. However, there is not yet a clear consensus on the validity of his proof-of-concept experiments. There are also clear neuroethical issues that have yet to be addressed when it comes to head transplantation. In a previous post for The Neuroethics Blog, neuroscientist Dr. Ryan Purcell highlights three key ethical issues associated with the procedure: risk vs. benefit (the possibility of the head being alive yet attached to a paralyzed body or in a constant state of pain), justice and fairness (who donates their head versus who donates their body?), and personal identity post-transplantation. In the case of personal identity, the juncture between head and body could prove disastrous to conscious perception of reality if the self is indeed static. Our external sensations influence how our internal identities develop; if a head were transplanted onto a different body, surgeons might therefore end up creating a new person entirely. This issue is vitally important for ethicists and legal experts to debate in the context of informed consent. In an op-ed for The Washington Post, Dr. Nita Farahany notes that the procedure could be considered active euthanasia under U.S. law, because head transplantation in theory requires the removal of one, if not two, identities prior to completion.
Conclusions
This episode of Black Mirror presents a reality in which data is used to mirror the behavior of the deceased. This technology has obvious advantages, including that it does not require fully replicating the brain of the individual (as WBE would) and that it can be used after the original person has passed away. At least some current efforts, by contrast, attempt to create technologies that either replicate or preserve the brain while it is still alive; these pose a multitude of ethical issues that neuroscientists, clinicians, and ethicists are still actively debating. Black Mirror's technology seems to circumvent these issues by not focusing on brain structure; however, the show itself notes that the result is an incomplete duplication. Both approaches (recreating a person by reproducing their brain or simply reproducing their behavior) therefore offer compelling visions of extending the lives of our loved ones, and it is clear that a host of issues must be resolved before either technology can become a reality.
References
Canavero, S. (2013). HEAVEN: The head anastomosis venture Project outline for the first human head transplantation with spinal linkage (GEMINI). Surgical Neurology International, 4(Suppl 1), S335–S342. http://doi.org/10.4103/2152-7806.113444
Canavero, S., & Ren, X. (2016). Houston, GEMINI has landed: Spinal cord fusion achieved. Surgical Neurology International, 7. Available from: http://surgicalneurologyint.com/surgicalint-articles/houston-gemini-has-landed-spinal-cord-fusion-achieved/
Dudai, Y., & Evers, K. (2014). To Simulate or Not to Simulate: What Are the Questions? Neuron, 84(2), 254-261. doi:10.1016/j.neuron.2014.09.031
Hopkins, P. D. (2012). Why Uploading Will Not Work, or, the Ghosts Haunting Transhumanism. International Journal of Machine Consciousness, 04(01), 229-243. doi:10.1142/s1793843012400136
Li, P.-W., Zhao, X., Zhao, Y.-L., Wang, B.-J., Song, Y., Shen, Z.-L., . . . Ren, X.-P. (2017). A cross-circulated bicephalic model of head transplantation. CNS Neuroscience & Therapeutics, 23(6), 535-541. doi:10.1111/cns.12700
Sandberg, A. (2014). Ethics of brain emulations. Journal of Experimental & Theoretical Artificial Intelligence, 26(3), 439-457. doi:10.1080/0952813X.2014.895113
Sandberg, A., & Bostrom, N. (2008). Whole Brain Emulation: A Roadmap (Technical Report 2008). Future of Humanity Institute, Oxford University.
Want to cite this post?
Das, S. (2017). The Neuroethics Blog Series on Black Mirror: Be Right Back. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2017/06/the-neuroethics-blog-series-on-black_30.html