Pages

Tuesday, May 28, 2013

Let’s Put Our Heads Together and Think About This One: A Primer on Ethical Issues Surrounding Brain-to-Brain Interfacing

By John Trimper

Graduate Student, Psychology

Emory University

This post was written as part of the Contemporary Issues in Neuroethics course




Remember the precogs in Minority Report? The ones who could sync up their brains via the pale blue goo to see into the future?




The precogs from the movie Minority Report

Recent findings published in Scientific Reports (Pais-Vieira et al., 2013) suggest that the ability to sync up brains is no longer purely sci-fi fodder and has instead moved into the realm of laboratory reality. The relevant set of experiments, conducted primarily at the Nicolelis laboratory at Duke University, demonstrated that neural activity related to performance on a discrimination task could be recorded from one rat (“the encoder”) and transferred into a second rat’s brain (“the decoder”) via electrical stimulation. This brain-to-brain transfer of task-relevant information, provided the encoder rat was performing the task correctly, significantly enhanced the decoder’s ability to perform the task correctly (see Figure 2 for task description). That is, the decoder rat, which received no external clues as to which of two levers would provide a food reward, responded to the brain-to-brain transfer of information as if it cued him to choose the correct, food-rewarding lever. As a further proof of concept, the experimenters demonstrated that it wasn’t necessary for the rats to be hooked up to the same laboratory computer. In fact, it wasn’t even necessary for the rats to be on the same continent. Using the internet, the researchers were able to transfer information in real time from the brain of an encoder rat at Duke University to the brain of a decoder rat located in Brazil. Performance enhancements in this scenario were similar to those noted above (i.e., decoders chose the correct lever more often when brain-to-brain transfer was allowed).
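The core of the transfer is a real-time mapping from the encoder’s recorded activity to a stimulation pattern delivered to the decoder. As a rough, purely illustrative sketch (the function, parameter names, and numbers below are invented for this post, not taken from the published implementation), one can imagine comparing an encoder’s spike count against a stored template and converting the deviation into a pulse count:

```python
# Toy sketch of one brain-to-brain transfer step: an encoder's spike count
# in a trial window is compared against a stored template, and the deviation
# is mapped to a number of microstimulation pulses for the decoder.
# All names and parameters here are hypothetical illustrations.

def pulses_from_spikes(spike_count, template_mean, template_sd,
                       base_pulses=10, gain=5, max_pulses=60):
    """Map an encoder's spike count to a stimulation pulse count."""
    z = (spike_count - template_mean) / template_sd  # deviation from template
    pulses = base_pulses + gain * z
    return int(min(max(pulses, 0), max_pulses))      # clamp to a safe range

# Encoder fires well above its template mean -> more pulses to the decoder
print(pulses_from_spikes(spike_count=34, template_mean=20, template_sd=4))  # → 27
```

The point of the sketch is only that the “message” is a simple, low-bandwidth signal, not a rich thought; in the actual experiments the mapping was calibrated per animal and delivered through implanted microstimulation electrodes.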



The work by Pais-Vieira and colleagues (2013) is an important step forward for the field. As the authors suggest, the present findings mark progress towards “an organic computer capable of solving heuristic problems that would be deemed non-computable by a general Turing machine” (Pais-Vieira et al., 2013, pp. 8-9). Indeed, Nicolelis’s group continues to be at the forefront of brain-interfacing technologies. For example, in previous experiments, Carmena, Nicolelis, and colleagues (Carmena et al., 2003) trained monkeys to reach for and grasp virtual objects with a robotic arm using only their brains and visual feedback (i.e., without moving their own arms). Task-relevant neuronal activity was recorded with implanted microelectrode arrays, and computer algorithms converted the recorded signals into commands for the robot arm. These technologies hold great promise for the future of prosthetics and stroke rehabilitation.








Two of the rats (an "encoder" and a "decoder") from the experiment



But do the present findings mean that a brain-net is right around the corner? That human brains can be synced up to exchange the kind of complex thoughts that one intuitively associates with terms like “telepathy”?



Well, no – not quite. Despite what the titles of many popular press articles seem to suggest (search Google for ‘telepathy rat brain transfer’ for a laugh), progress is at present hampered by several considerable limitations. These include the number of neurons that can be sampled, their neuroanatomical locations, and current neural decoding/encoding capabilities. Brain-to-brain transfer of complex human thoughts will have to remain a sci-fi fantasy for the time being.



But, given that we’re at least at the point where this sort of technology is being discussed in earnest, it’s appropriate for ethical discourse surrounding the topic to receive a proportional degree of attention. Brain-to-brain interfacing (BTBI), in its current embodiment, involves extracting information from one individual’s brain and delivering this information to a second individual via implanted microstimulation electrodes (microstimulation electrodes, which are similar to the recording microelectrode arrays noted above, allow for the delivery of highly spatio-temporally precise stimulation patterns into the brain).  Thus, BTBI is associated with the same sorts of ethical issues that surround mind-reading (e.g., Kuebrich, 2012), deep brain stimulation (e.g., Schermer, 2011), and brain-computer interfacing (BCI) technologies (e.g., Vlek et al., 2012). However, given that BTBI involves a direct transfer of information between two individuals’ brains, the technique also raises a host of ethical issues all its own (e.g., legal and moral responsibility, issues of identity, and privacy).



Consider an illustrative example that is at least somewhat grounded within the framework of the technique’s current capacity. It’s not unreasonable to think that the military, with its appreciably liberal approach to “enhancement,” would be the first to employ BTBI technologies in humans. Imagine that one soldier in ground combat (“the decoder”), fitted with microstimulation electrodes and a helmet-mounted 360-degree camera, was able to neurally receive information directly from a second soldier (“the encoder”) watching the video feed in a separate location. When the encoder detected a threat on the video feed, this information could be immediately transferred to the decoder, who could respond appropriately. This brain-to-brain transfer of threat information has the potential to be far faster than verbal transmission, and could potentially save many lives. Now imagine that the encoder, watching the video feed, mistakenly identifies a fellow soldier as a threat and a friendly-fire incident ensues. The decoder soldier fires the bullet that ends his comrade’s life. Who is responsible for the soldier’s death – decoder or encoder? What if the neural stimulation pattern was misinterpreted by the decoder (or, importantly, by the computer’s transfer algorithm)? What if the transfer was intentional on the part of the encoder?



According to some, issues surrounding liability as it relates to BCI (i.e., a single brain) already have a strong framework for legal consideration. Tamburrini (2009) points out that by using the technology, the decoder accepts some degree of responsibility for the actions of the machine he/she becomes integrated with. Extending this to the current context, one would assume that the encoder would likewise be acknowledging his/her responsibility by taking part. Does this suggest that both would be tried as equally responsible if something went awry?



Of extreme importance for assessing liability would be the transfer algorithm’s ability to accurately decode the extracted neural information. Recording this information would ideally allow a post-hoc dissociation between the information that was transferred and the information as decoded/interpreted, and might also provide some evidence regarding intention. Such a feat, however (identifying neural information content with 100% confidence), may be even farther out of reach than the transfer technology itself. As is currently the case for brain-interfacing technologies, each decoding algorithm must be carefully and rigorously calibrated, and periodically re-calibrated, for the particular individual implanted (e.g., “Nigel” in Vlek et al., 2012). One-hundred-percent accuracy in decoding an intricate neural representation may not be feasible. Thus, a general decoding algorithm that could also extract intention, especially considering the brain regions that would be sampled from, seems unlikely at present.
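To see why calibration is individual and imperfect, consider a deliberately simplified decoder: a single spike-count threshold fit to one subject’s labeled trials. Everything here (the names, numbers, and the threshold model itself) is a hypothetical illustration, far cruder than real BCI decoders:

```python
# Hypothetical per-subject calibration: choose the spike-count threshold
# that best separates "left lever" from "right lever" trials for one
# implanted individual. Real decoders use far richer models; this only
# illustrates why calibration is individual-specific and fallible.

def calibrate_threshold(trials):
    """trials: list of (spike_count, label) with label in {'left', 'right'}.
    Returns (best_threshold, training_accuracy)."""
    best_thr, best_acc = None, -1.0
    for thr in sorted(c for c, _ in trials):
        # Predict 'right' whenever the count reaches the candidate threshold
        correct = sum((c >= thr) == (lab == 'right') for c, lab in trials)
        acc = correct / len(trials)
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr, best_acc

trials = [(12, 'left'), (15, 'left'), (14, 'left'),
          (22, 'right'), (25, 'right'), (19, 'right')]
thr, acc = calibrate_threshold(trials)
print(thr, acc)  # → 19 1.0
```

Even this toy fit is tied to one individual’s data and would degrade as the recorded signal drifts, which is why real decoding algorithms require repeated re-calibration.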



Of course, as is the case with any ethical dilemma, each scenario would need to be considered in terms of the specific conditional variables surrounding the event. The findings of Pais-Vieira and colleagues (2013) suggest an exciting future with BTBI technologies, and an equally spirited future of ethical discourse on the topic.







References




Pais-Vieira, M., Lebedev, M., Kunicki, C., Wang, J., & Nicolelis, M.A.L. (2013). A Brain-to-brain interface for real-time sharing of sensorimotor information. Scientific Reports, 3, 1319.



Wessberg, J., Stambaugh, C.R., Kralik, J.D., Beck, P.D., Laubach, M., Chapin, J.K., Kim, J., Biggs, S.J.,Srinivasan, M.A., & Nicolelis, M.A. (2000). Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature, 408(6810), 361-365.



Carmena, J.M., Lebedev, M.A., Crist, R.E., O’Doherty, J.E., Santucci, D.M., Dimitrov, D.F., & Nicolelis, M.A. (2003). Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biology, 1(2), E42.



Kuebrich, B. (2012). When the government can read your mind. The Neuroethics Blog. Retrieved on April 13, 2013, from http://www.theneuroethicsblog.com/



Vlek, R.J., Steines, D., Szibbo, D., Kübler, A., Schneider, M.-J., Haselager, P., & Nijboer, F. (2012). Ethical issues in brain-computer interface research, development, and dissemination. Journal of Neurologic Physical Therapy, 36(2), 94-99.



Tamburrini, G. (2009). Brain to computer communication: Ethical perspectives on interaction models. Neuroethics, 2, 137-149.





Want to cite this post?

Trimper, J. (2013). Let’s Put Our Heads Together and Think About This One: A Primer on Ethical Issues Surrounding Brain-to-Brain Interfacing. The Neuroethics Blog. Retrieved on , from

Friday, May 24, 2013

Now Available! Neuroethics Journal Club Video Archives on YouTube


The Neuroethics Journal Club videos are now available on YouTube. Watch each discussion to learn about a variety of neuroethics issues, from treatments for pedophilia to neural plasticity in mice. For each video, one presenter introduced the journal topic and opened discussion to the audience. 








Neuroethics Journal Club: The Sexed Brain


The Sexed Brain: Between Science and Ideology


Catherine Vidal, Neuroethics, 2012 













Abstract: Despite tremendous advances in neuroscience, the topic “brain, sex and gender” remains a matter of misleading interpretations, that go well beyond the bounds of science. In the 19th century, the difference in brain sizes was a major argument to explain the hierarchy between men and women, and was supposed to reflect innate differences in mental capacity. Nowadays, our understanding of the human brain has progressed dramatically with the demonstration of cerebral plasticity. The new brain imaging techniques have revealed the role of the environment in continually re-shaping our brain all along our lifetimes as it goes through new experiences and acquires new knowledge. However, the idea that biology is a major determining factor for cognition and behavioral gender differentiation, is still very much alive. The media are far from being the only guilty party. Some scientific circles actively promote the idea of an innate origin of a gender difference in mental capacities. Experimental data from brain imaging, cognitive tests or genetics are often distorted to serve deterministic ideas. Such abuse of “scientific discourses” have to be counteracted by effective communication of clear and unbiased information to the citizens. This paper presents a critical analysis of selected examples which emphasize sex differences in three fields e.g. skills in language and mathematics, testosterone and financial risk-taking behavior, moral cognition. To shed light on the data and the methods used in some papers, we can now—with today’s knowledge on cerebral plasticity—challenge even more strongly, many false interpretations. Our goal here is double: we want to provide evidence against archaic beliefs about the biological determinism of sex differences but also promote a positive image of scientific research.






Neuroethics Journal Club: Pedophilia


Real-time functional magnetic imaging—brain–computer interface and virtual reality: Promising tools for the treatment of pedophilia 


Renaud et al, 2011




 








Abstract: This chapter proposes a prospective view on using a real-time functional magnetic imaging (rt-fMRI) brain-computer interface (BCI) application as a new treatment for pedophilia. Neurofeedback mediated by interactive virtual stimuli is presented as the key process in this new BCI application. Results on the diagnostic discriminant power of virtual characters depicting sexual stimuli relevant to pedophilia are given. Finally, practical and ethical implications are briefly addressed.







Neuroethics Journal Club: Social Pain 


The pain of social disconnection: examining the shared neural underpinnings of physical and social pain 


Eisenberger, 2011 


Nature Reviews Neuroscience













Abstract: Experiences of social rejection, exclusion or loss are generally considered to be some of the most 'painful' experiences that we endure. Indeed, many of us go to great lengths to avoid situations that may engender these experiences (such as public speaking). Why is it that these negative social experiences have such a profound effect on our emotional well-being? Emerging evidence suggests that experiences of social pain--the painful feelings associated with social disconnection--rely on some of the same neurobiological substrates that underlie experiences of physical pain. Understanding the ways in which physical and social pain overlap may provide new insights into the surprising relationship between these two types of experiences.





Neuroethics Journal Club: Plasticity and Learning 


Forebrain Engraftment by Human Glial Progenitor Cells Enhances Synaptic Plasticity and Learning in Adult Mice 


Han et al, 2013 Cell Stem Cell













Abstract: Human astrocytes are larger and more complex than those of infraprimate mammals, suggesting that their role in neural processing has expanded with evolution. To assess the cell-autonomous and species-selective properties of human glia, we engrafted human glial progenitor cells (GPCs) into neonatal immunodeficient mice. Upon maturation, the recipient brains exhibited large numbers and high proportions of both human glial progenitors and astrocytes. The engrafted human glia were gap-junction-coupled to host astroglia, yet retained the size and pleomorphism of hominid astroglia, and propagated Ca2+ signals 3-fold faster than their hosts. Long-term potentiation (LTP) was sharply enhanced in the human glial chimeric mice, as was their learning, as assessed by Barnes maze navigation, object-location memory, and both contextual and tone fear conditioning. Mice allografted with murine GPCs showed no enhancement of either LTP or learning. These findings indicate that human glia differentially enhance both activity-dependent plasticity and learning in mice.




Stay tuned for more Neuroethics Journal Club videos next year. Also, to get daily updates about emerging ideas in neuroethics, follow Emory Neuroethics on Facebook here and Twitter here.


Tuesday, May 21, 2013

The identification of risk for serious mental illnesses: Clinical and ethical challenges



By Elaine Walker, Ph.D., Sandy Goulding, MPH, MA., Arthur Ryan, MA., Carrie Holtzman, MA., Allison MacDonald, MA.



Elaine Walker is Samuel Candler Dobbs Professor of Psychology and Neuroscience in the Department of Psychology at Emory University.  She leads a research laboratory that is funded by the National Institute of Mental Health  to study risk factors for major mental illness.  Her research is focused on child and adolescent development and the brain changes that are associated with adolescence.



The identification of risk factors for illness is receiving increased attention in all fields of medicine, especially cardiology, oncology, neurology and psychiatry.  There are three potential benefits to identifying risk factors. The first is to reduce morbidity by reducing risk exposure. The second is to enhance opportunities for targeting preventive treatment toward those who are most likely to benefit. Finally, the identification of risk factors can shed light on pathological mechanisms.



There are, of course, costs as well as benefits involved in this endeavor.  The benefits, in terms of reducing morbidity and mortality, are noncontroversial.  The costs, however, can be very controversial and they have generated discussion among ethicists. Foremost among the costs is the potential discomfort and distress that results from the identification of an individual as being at statistical risk for future illness.  There are also significant concerns about whether treatment should be initiated prior to the manifestation of symptoms that reach clinical threshold.  These concerns are especially salient in the field of psychiatry. In this post, we discuss current efforts to identify risk factors for serious mental illness and the ethical considerations they raise.





Among researchers and practitioners in the field, the term “serious” mental illness is typically used to refer to psychotic and chronic mood disorders.  Such disorders include schizophrenia, schizoaffective disorder, bipolar disorder, and major depression.  By definition, schizophrenia and schizoaffective disorders always entail psychotic symptoms: hallucinations, delusions, and/or thought disorder.  Bipolar disorder and depression can occur with or without psychotic symptoms. Currently, the general consensus in the field is that these disorders stem from brain dysfunction that is caused by both biological and environmental factors. These illnesses also have substantial adverse social and personal consequences. This is particularly true of schizophrenia and other psychotic disorders, which affect 1-2% of the population, typically begin in late adolescence/early adulthood, and are often chronic.



Consistent with other fields of medicine, research aimed at identifying risk factors for serious mental illnesses has burgeoned in the past few decades.  The first ‘generation’ of these investigations, referred to as “genetic” high-risk research, focused on the biological offspring of parents who suffered from serious mental illnesses such as schizophrenia. While this work yielded some useful findings, it was limited by the fact that the psychosis risk-rate for children of parents with schizophrenia is about 12% to 15%. In other words, the overwhelming majority do not develop the illness.  Further, the majority of schizophrenia patients do not have a biological relative with the illness, meaning that this approach failed to include a substantial portion of individuals with schizophrenia.



As a result of these limitations, researchers shifted their attention to individuals who manifest clinical signs of risk. This line of investigation draws on the results of prospective and retrospective studies of the behavioral signs that precede the onset of clinical psychosis. These signs include, but are not limited to, mood disturbances, unusual ideations and perceptual experiences, and declines in occupational, academic and social functioning.  The period when these signs emerge can vary from months to years, and is referred to as the “prodrome”.



Assessment instruments have been developed to measure prodromal signs, and criteria have been established to designate those at greatest statistical risk. Most of those who meet these criteria are in subjective distress and have previously sought or received mental health services.  In the context of research, the term “clinical high risk” is typically used to refer to these help-seeking individuals.  Further, based on the results of prospective studies, it is estimated that the risk rate for subsequent psychosis ranges from 25% to 40%, far higher than the risk rate for individuals with an affected family member.  The other 60% to 75% manifest a range of outcomes, with some remaining stably symptomatic and others showing a decline in symptoms and no subsequent psychiatric disorder.
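A back-of-the-envelope calculation shows why raising predictive power matters so much. Positive predictive value (PPV) depends on the base rate as well as on the screen’s accuracy; with purely illustrative numbers (not drawn from any of the studies cited here), even a quite accurate screen applied at a low population base rate flags mostly people who will never convert:

```python
# Worked example of positive predictive value (PPV): even an accurate
# screen yields a modest PPV when the base rate is low. All numbers
# below are illustrative, not taken from any cited study.

def ppv(sensitivity, specificity, prevalence):
    """Probability that a positive result is a true positive."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A screen with 90% sensitivity and 95% specificity, applied where the
# base rate of psychosis is ~1.5%, is right only about 1 time in 5:
print(round(ppv(0.90, 0.95, 0.015), 3))  # → 0.215
```

This arithmetic is one reason risk research concentrates on already help-seeking, clinical high-risk groups, where the base rate of conversion (25% to 40%) is far higher than in the general population.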





There are numerous ongoing research projects in many countries that are aimed at enhancing our ability to predict psychosis risk beyond the 25-40% level.  Our research group is pursuing this goal as part of a multi-site project funded by the NIMH.  This project, the North American Prodrome Longitudinal Study (NAPLS), is also concerned with elucidating the neural mechanisms that are associated with the development of psychotic disorders. We are now in the 5th year of this project, and just beginning the stage of data analysis that will test alternative prediction algorithms.



In the interim, clinical researchers are confronted with questions concerning the potential stigma and psychological distress associated with psychosis-risk identification, as well as the provision of treatment for these individuals. The issues of stigma and subjective distress have some unique aspects when it comes to disorders of brain function, as opposed to disorders involving other organ systems. The brain is generally assumed to be the organ that subserves our mind and personal identity, and the behavioral problems that can result from its dysfunction are sometimes viewed as being under volitional control.  The latter view is one source of the stigma associated with mental disorders: patients are assumed to be responsible for their symptoms.  At the same time, there is a tendency for many to view individuals with mental illnesses as being dangerous because they are unable to control their behavior.   Together, these factors reduce sympathy for individuals with mental illness.  They also place additional burdens on those who are already experiencing the mood, perceptual and ideational abnormalities that are associated with the prodrome to psychosis.





With regard to the stigma associated with clinical risk for psychosis, the issues vary among individuals.  Among those who participate in our collaborative NIMH project on clinical risk, some self-identify as being “at-risk,” through statements such as “I am afraid something is wrong with my mind,” or “I feel like I am going crazy.”  In contrast, others report nonspecific concerns with recurring, distressing ideas, and are seeking help to reduce their distress.  Finally, some individuals who are manifesting signs of risk for psychosis do not perceive a problem. This is especially likely to be true of adolescents whose behavior raises concerns in parents and teachers, although the child views him or herself as being “misunderstood.”



At the present time, there is no standard practice for informing individuals about risk signs for psychosis, and the principle of doing no harm guides practitioners and clinical researchers. Furthermore, clinical researchers emphasize that we cannot predict the onset or course of any illness with precision, and that many who manifest risk signs, especially during adolescence, subsequently experience remission and show no signs by the time they enter young adulthood.



There are also challenging issues when it comes to the provision of treatment for those manifesting signs of psychosis risk.  While there is a consensus that psychotherapy/counseling should be available to any individual in significant distress, the issue of medication for individuals at clinical risk for psychosis is controversial.  At the present time there is no cure for psychotic disorders, and the antipsychotic medications that are used to treat psychosis only partially reduce symptoms, and they are ineffective for a substantial proportion of patients.  These medications also have a variety of side effects (e.g., metabolic, neurological) and can be highly sedating.  Finally, there is no consistent evidence that antipsychotic medications can prevent psychosis, although some studies suggest that they reduce the severity of preclinical symptoms.



Based on the currently available data, most researchers in the field of psychosis risk do not advocate the preventive use of antipsychotic medication.  Yet, many individuals who manifest preclinical signs of risk for psychosis do receive antipsychotics from mental health practitioners.  There are several reasons for this, including the practitioner’s desire to reduce the severity of the subclinical symptoms, the broad marketing of antipsychotics (e.g., Abilify) to the general public for the treatment of depression, and the desire of patients and their families to intervene in any way possible to prevent a first episode of psychosis.  The latter concern is especially salient for families who already have an adult child affected by a psychotic illness.



In summary, it is clear that we have a long way to go before we are able to predict serious mental illness at a level of positive predictive power that would justify targeted preventive intervention, especially with agents that can have adverse side effects.  Moreover, we do not yet have consistent evidence of any effective preventive treatment.  But, as indicated in the references listed below, clinical studies are underway around the world. Findings from these studies are yielding important information about prediction, as well as the brain changes that accompany the emergence of psychosis.   As we await more scientific progress in these key areas, we should also continue to carefully consider the ethical issues that surround the identification of individuals who are at risk for serious illnesses.  This is especially the case for illnesses that entail brain dysfunction.









References



Addington, J., Cadenhead, K. S., Cornblatt, B. A., Mathalon, D. H., McGlashan, T. H., Perkins, D. O., Walker, E. F., & Cannon, T. D. (2012). North American Prodrome Longitudinal Study (NAPLS 2): Overview and recruitment. Schizophrenia Research.



Addington, J., Cornblatt, B. A., Cadenhead, K. S., Cannon, T. D., McGlashan, T. H., Perkins, D. O., ... & Heinssen, R. (2011). At clinical high risk for psychosis: outcome for nonconverters. American Journal of Psychiatry, 168(8), 800.



Cornblatt, B. A., Carrión, R. E., Addington, J., Seidman, L., Walker, E. F., Cannon, T. D., ... & Lencz, T. (2012). Risk factors for psychosis: impaired social and role functioning. Schizophrenia Bulletin, 38(6), 1247-1257.



Walker, E. F., Trotman, H. D., Pearce, B. D., Addington, J., Cadenhead, K. S., Cornblatt, B. A., ... & Woods, S. W. (2013). Cortisol Levels and Risk for Psychosis: Initial Findings from the North American Prodrome Longitudinal Study. Biological Psychiatry.



Weiser, M. (2011). Early intervention for schizophrenia: the risk-benefit ratio of antipsychotic treatment in the prodromal phase. American Journal of Psychiatry, 168(8), 761-763.





Want to cite this post?



Walker, E., Goulding, S., Ryan, A., Holtzman, C., MacDonald, A. (2013). The identification of risk for serious mental illnesses: Clinical and ethical challenges. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2013/05/the-identification-of-risk-for-serious.html

Tuesday, May 14, 2013

Dancing with the Devil

Hysteria usually calls to mind thoughts of the Salem Witch Trials and delirious frenzies from history. However, mass hysteria, or mass psychogenic illness, is not simply an improbable, incomprehensible madness of the past. It has occurred throughout history and into our current generation, taking form as dancing plagues, dissociative possession of nuns, and involuntary twitches of high school girls in New York. Is it something they all ate? Or maybe there is something in the water… How is it that anxiety manifests itself into a dance that spreads among populations?



Fear and distress terrorized populations in Medieval Europe and made them more prone to psychogenic illness. Certainly it seems there must be more to the story than merely these common denominators, for fear, anguish, stress, and trauma are commonly faced and dealt with sans mass hysteria. But the other factors needed for the exact formula of mass hysteria are difficult to pinpoint.



Is it the perfect combination of despair, devastation, and distress that manifests itself into a psychosomatic reaction? Does it require a specific threshold of suggestion and susceptibility in our belief and cultural context? The panic and frenzy that overtook groups throughout history is a fascinating and frightening occurrence. Epidemics surged along the Rhine River, taking hundreds as victims to the dancing plague [8].  This affliction of compulsive dancing ran rampant in regions where the population believed dancing to be some sort of sickness or a curse that could be cast upon them. Once they formed the belief that they had caught the dancing disease, or they had been cursed to dance, dance, dance, there was no stopping them. People would dance until muscles were strained; they would even dance to their deaths. In 1374 a plague swept through Germany and France that drove thousands to dance in “agony for days or weeks, screaming of terrible visions and imploring priests and monks to save their souls.” Also, in years to come, people danced for as long as six months, some even dying after breaking “ribs or loins” [7].

In our scientifically-informed hindsight, the mass anxiety and physical hysteria that plagued hundreds in medieval Europe may seem improbable and unbelievable. It is difficult to understand by what means anxiety translates into such mysterious physical manifestations, especially in the form of a dance. Yet psychogenic illness continues to plague the modern world today. Usually occurring within tightly knit communities, a fearful belief manifests and spreads from one susceptible victim to another. Teenage girls in New York recently suffered from involuntary body movements [2]. Eleven people followed suit after one teenage girl fainted during Sunday mass at an Australian church [6], and then chemistry students in a lab suffered from nosebleeds and asthma-like symptoms one by one [5].











These physical manifestations create the impression of being contagious. Most of us know what it is to say that someone’s laugh is contagious, or that we suddenly yawn because someone next to us yawns. The imitation of these phenomena as well as other behavior within our society is an interesting aspect of human psychology. Much like a laugh or a yawn, these psychogenic symptoms are of an infectious nature. However, it seems much easier to imagine why laughter spreads rather than a compulsive dancing affliction. We all imitate behavior and mannerisms to a certain extent; this opens channels of interpersonal connection and is illustrated by our mirror system response. Mirror neurons are thought to play a role in action performance. In a study investigating the dissociation between object directed and non-object directed actions, Agnew, Wise, and Leech discuss the importance of mirror neurons in theories of motor simulation, “which proposes that observed actions are mapped onto existing motor schema, supporting both imitation and understanding” [1].



These imitative mechanisms give rise to a possible causal explanation of mass psychogenic illness. Our mirror response system observes and executes actions, which could lead to problematic consequences if left unchecked by the brain’s executive control. Perhaps mirror neurons were the perpetuators of the dancing plague.



But our brains’ abilities of executive control keep us in check and usually allow us to “opt out” of unwanted behaviors. The prefrontal cortex is the seat of executive control in the brain, and this self-control is an important social and evolutionary tool. Kühn, Haggard, and Brass investigated self-control in the “veto area” of the brain. “Activation of the dorsal fronto-median cortex (dFMC) was associated with vetoing the neural processes translating intentions into actions” [5]. Studies “suggest that dFMC provides an intentional mechanism for stopping an ongoing action in a top-down fashion. Inputs from dFMC to pre-SMA therefore potentially control whether actions occur or not” [5].



Executive control’s ability to interrupt lower-level processes (its veto power) seems to be missing in instances of psychogenic illness. If that initial “opt out” checkpoint is bypassed, then the other checkpoints are easily bypassed as well, and we get lost in the hysteria. This raises the question of whether individuals who endure psychogenic illness have free will. Don’t the involuntary dance movements impinge upon the plagued dancers’ agency? A study of the neurology of volition looked at several types of movement disorders in relation to free will. Kranick and Hallett distinguish psychogenic movement disorders (PMDs) from other tic disorders by noting that the movements often share features of volitional or intentional movement but are experienced as involuntary: “what sets [patients with PMD] clinically apart from most patients with tics, however, is the denial of any sense of volition for the movements; they are not performed on a compulsory basis, but rather the patient states that the movements ‘just happen’ without any warning or opportunity for suppression. The patient with PMD often has a strong sense that their disorder is an organic disease, such as a brain tumor, and not psychological” [3].



Utilizing a model that illustrates the influence of top-down processing and belief systems on volition, they explain how an action might be generated even while the agent denies any volition for the movement.








Fig. 1 from Kranick and Hallett

Kranick and Hallett argue, “the structural and functional neuroimaging studies suggest a network of abnormal inputs from the limbic areas that may trigger movement (or block it in the case of paralysis), not producing a normal feedforward signal. With the resultant mismatch between the actual movement genesis, feed-forward signal and the prior expectation about how movements should be willed, there would be a loss of the sense of both willing and agency. Hypoactivation of the right temporoparietal junction with psychogenic movements is consistent with this idea” [3].



All possibilities considered (fainting, seizures, involuntary tics), why would mass anxiety manifest itself as a dance? Of course, all I can offer on this point is speculation. As a dancer myself, I find it particularly intriguing that human beings suffered from dancing afflictions. Perhaps the sheer force of groups gathering and dancing together has something to do with it. Dancing within a group is a powerful phenomenon, allowing individuals to essentially lose themselves to the group and to the movement. At a certain point, the dance “just happens.”



This could easily account for the altered states of consciousness the dancers would have had to achieve in order to dance for such extended periods of time. It also accounts for the loss of volition and of conscious awareness of the intention to move. While dance itself is not a form of hysteria, it is easy to see how hysteria might take the form of a dance. Dancing is an explicit form of expressive behavior, exaggerated when aggregated within a group. The power and feeling of dance could easily be fearfully misunderstood as the hand of a malevolent supernatural being. Now, however, we know better than to believe dancing to be a plague sent by Saint Vitus, and psychogenic illness no longer takes the form of his dance, appearing instead in other forms. While I cannot address all types of psychogenic illness in one blog post, evaluating the manifestation of anxiety as a dance leads me to conclude that belief context (just as in Kranick and Hallett’s Figure 1) plays a significant role in determining which path to physical manifestation the anxiety will take. Emerging neuroscience data suggest this phenomenon is something beyond simple distress or a character flaw. Perhaps neuroscience can illuminate exactly which demons are at work, exposing something less mysterious than supernatural possession.



References

1. Agnew, Z.K., Wise, R.J.S., & Leech, R. (2012). Dissociating Object Directed and Non-Object Directed Action in the Human Mirror System; Implications for Theories of Motor Simulation. PloS One, 7 (4). doi: 10.1371/journal.pone.0032517. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3323585/



2. Dominus, Susan. (2012). What Happened to the Girls in Le Roy. The New York Times. Retrieved from http://www.nytimes.com/2012/03/11/magazine/teenage-girls-twitching-le-roy.html?pagewanted=all&_r=0



3. Kranick, S. M., & Hallett, M. (2013). Neurology of volition. Retrieved from http://link.springer.com/article/10.1007/s00221-013-3399-2/fulltext.html



4. Kühn, S., Haggard, P., & Brass, M. (2009). Intentional Inhibition: How the “Veto-Area” Exerts Control. Human Brain Mapping, 30, 2834-2843. Retrieved from http://onlinelibrary.wiley.com/store/10.1002/hbm.20711/asset/20711_ftp.pdf;jsessionid=BFC7F45F52CDF163E90524534D335765.d02t01?v=1&t=hg3v2vyh&s=7de6586dff09a4a3240c872e281fa4ccda19792d



5. Moran, Lee. (2013). Dozens of students hospitalized from ‘adverse reactions’ after chemistry class experiment. New York Daily News. Retrieved from http://www.nydailynews.com/news/national/30-sickened-chem-class-villanova-university-article-1.1262919



6. Morri, Mark. (2013). Mystery mass fainting…at mass. The Daily Telegraph. Retrieved from http://www.dailytelegraph.com.au/news/mystery-mass-fainting-at-church/story-e6freuy9-1226580274412



7. Waller, John. (2009). A forgotten plague: making sense of dancing mania. The Lancet, Volume 373 (Issue 9664). Retrieved from http://www.thelancet.com/journals/lancet/article/PIIS0140-6736(09)60386-X/fulltext



8. Waller, John. (2009). Looking Back: Dancing plagues and mass hysteria. The Psychologist. Volume 22, Part 7. Retrieved from http://www.thepsychologist.org.uk/archive/archive_home.cfm?volumeID=22&editionID=177&ArticleID=1541




Want to cite this post?

McCoyd, C. (2013). Dancing with the devil. The Neuroethics Blog.

Wednesday, May 8, 2013

Social and Physical Pain on Common Ground

By Guest Contributor Jacob Billings 

Neuroscience Graduate Student 

Emory University



Societal changes, when they occur, coincide with changing outlooks among the populace. Take, for example, the American Civil Rights movement of the 1960s. The push for economic and political enfranchisement of African Americans and women grew largely out of changing identities within these groups during the mobilization of all of America’s resources in World War II. Notably, African American soldiers experienced naturally cordial interactions with white Europeans during their tours of duty [1]; on returning to the US, they found it impossible to let American racism continue unchallenged. During the same period, women acquired expertise in a great variety of professions from which they had previously been barred [2]. The expectation that women return to a subordinate place in the household was immediately resisted.



In our modern age, the outlooks held by our friends and neighbors are being changed daily by new evidence from neuroscience. Using an arsenal of tools and techniques at colleges and hospitals around the world, including functional magnetic resonance imaging (fMRI) scanners that can peer into our brains as we think and dream [3], the science marshals each facet of lived experience in turn, holding fast to territory mapped onto the physical domains of the central nervous system. The ground acquired during this campaign is that lost by ignorance and outmoded tradition.



How should our societies change as a result of new facets of evidence-based understanding, particularly when that evidence grounds lived experience directly to the material and functioning of our nervous systems?



In the following blog post, we’ll examine this question as it relates to developments in a particular topic: the shared neurological bases of physical and social pain. We’ll weigh this facet of neuroscience’s discoveries against the often tragic decisions that perpetuate inter-group strife, and identify a possible initiative each of us may take to help curb the attitudes that cause suffering.



The Findings



Naomi Eisenberger, Assistant Professor of Social Psychology at UCLA, works to expand our understanding of how social relationships affect our emotional and physical well-being. In a 2012 review entitled “The Pain of Social Disconnection: Examining the shared neural underpinnings of physical and social pain” [4], Eisenberger shows that the brain regions activated during the experience of physical pain are the same regions activated during socially painful experiences. Physical pain is considered to be represented in the brain through two interdependent though dissociable networks. The first, containing elements of the somatosensory cortex and the posterior insula, is involved when an insult is physically sensed and localized to an area of the body. The second, comprising the dorsal anterior cingulate cortex and the anterior insula, generates an insult’s affective components, the distress that drives us to end the noxious stimulus.




The medial view on the left shows the dorsal anterior cingulate cortex (dACC). The lateral view on the right shows the primary somatosensory cortex (S1) on the outer surface of the brain, the secondary somatosensory cortex (S2) on the edge of the cut-out of the brain, and the anterior insula (AI) and posterior insula (PI) in the middle of the cut-out of the brain. Sensory components are shown in green and affective components in red [4]

The review cites social pain as activating the sensory component of the pain network during instances of extreme social rejection, for example, an unexpected break-up. When lovers scorned speak of a throbbing heart, modern science can now locate that pain in a physical space in the brain whose activation produces the sensation of physical pain. The two experiences have the capacity to be painful in the same ways.



While the above activity patterns arise only in extreme circumstances, the affective portion of the pain network is consistently activated during social pain. The network is a component of the attention regulation system that informs our awareness that something is wrong. The system is an integral part of our cognitive machinery, providing the physical space for realization, whether observing that the body is damaged or noticing that one’s mind has wandered from meditative concentration [5].



As we become aware of just what is wrong, our sensing and perceiving brains engage somatic responders to mobilize the endocrine system for fight or flight via the pituitary and adrenal glands [6]. Heart pounding with epinephrine, eyes widened, and forehead sweating, asking a person on a date can feel in some ways as intimidating as staring down a lion. The body’s chemical immune responders are also activated during this stress response. Under habitually stressful conditions, both social and physical, this immune activation can run the body ragged, killing otherwise healthy cells and impeding normal bodily functions [7], leading to anhedonia, weakness, and depression at best, and cardiovascular and autoimmune diseases at worst [8].



Through experiment after experiment, the power of socially distressing events to affect us physically and experientially is becoming apparent.



Implications



Famously, the United States, the country of my birth, assigned itself the emblem of a bird of prey, the bald eagle, when forming a unified state. Some, Ben Franklin particularly, thought the turkey a nobler bird. But here we have it, a nation that honors a predator. To be sure, the fact that different groups have had to weather a violent maelstrom of legal oppression for what are proclaimed inalienable rights, the right that the makings of one’s own hands be lawfully inviolable, the right to govern as a representative of the people, demonstrates the added difficulties of forming a nation based around between-group predation. Over the years, this nation has made some strides to extend basic human rights to groups who aren’t white male landowners. Recently, the electorate promoted the nation’s first president with a Kenyan parent to two terms in office. But predation of various kinds still exists: subtly, in the comments of Emory University President Wagner, who touted the constitutional compromise defining enslaved Africans as three-fifths of a person [9]; and more overtly, in the failed foreign policy decision to conduct a supposed War on Terror that killed some 200,000 Afghan and Iraqi civilians through cluster bombs, drone attacks, and other senseless instruments of death [10]. In the face of laborious political change, lingering racist attitudes, and flagrant abuses of power, what does it mean to know that social disconnection feels physically painful?



While our nation still fosters the cancers of racism and sexism that bend certain outlooks to grant, by law and by custom, certain people a privileged caste, neuroscientific findings trumpet forth a new era of social change, one with eyes opened to the common properties shared by all humanity. Eisenberger’s work contributes to the growing understanding of the vast similarities between us. In particular, her review asserts an evolutionary path by which social pain co-opted the brain space developed for physical pain, piquing within us the sensitivity to social connection necessary for our social species to survive and thrive. With evidence demonstrating that positive social interactions are inherently tied to each person’s well-being, we no longer have a basis for ignoring that callousness, of whatever variety, is harmful.



Final Thoughts



The community of Tibetan refugees proclaims that compassion for one another is an ideal contemplation to adopt during idle moments of introspection. Personally, I agree. The perspectives that allowed the ousting of Native Americans, Africans, and Tibetans from the places of their birth, and that established rules of order caring little for investing women with the reins of power, have made this world, filled with bountiful resources and boundless creativity, a more troubling place to live. Eisenberger’s review helps us see that actions we may have thought innocuous, antisocial and undiplomatic behavior, lead to the kind of painful sensations that rouse people to end noxious social stimuli. The cultivation of compassion, individually and perhaps institutionally, is, I think, a good way to begin sewing up the discord and strife that divide our species.



References 

1. Higham, J., Civil Rights and Social Wrongs: Black and White Relations since World War II. 1997: Balch Institute for Ethnic Studies.

2. Weatherford, D., American Women during World War II: An Encyclopedia. 2009: Taylor and Francis.

3. Horikawa, T., et al., Neural Decoding of Visual Imagery During Sleep. Science, 2013.

4. Eisenberger, N., The Pain of Social Disconnection: Examining the Shared Neural Underpinnings of Physical and Social Pain. Nature Reviews Neuroscience, 2012.

5. Hasenkamp, W., et al., Mind wandering and attention during focused meditation: A fine-grained temporal analysis of fluctuating cognitive states. NeuroImage, 2012. 59(1): p. 750-760.

6. Raison, C.L., L. Capuron, and A.H. Miller, Cytokines sing the blues: inflammation and the pathogenesis of depression. Trends Immunol, 2006. 27(1): p. 24-31.

7. Maes, M., et al., The effects of psychological stress on humans: increased production of pro-inflammatory cytokines and a Th1-like response in stress-induced anxiety. Cytokine, 1998. 10(4): p. 313-318.

8. Cohen, S., D. Janicki-Deverts, and G.E. Miller, Psychological stress and disease. JAMA: the journal of the American Medical Association, 2007. 298(14): p. 1685-1687.

9. Severson, K. and Brown, R., Emory University President Revives Racial Concerns. New York Times, 2013.

10. Watson Institute for International Studies, Brown University. Costs of War. 2011-2013 [cited 4/28/2013]; Available from: costsofwar.com.



Want to cite this post?

Billings, J. (2013). Social and Physical Pain on Common Ground. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2013/05/social-and-physical-pain-on-common_8.html