
Tuesday, January 27, 2015

Neuroscience in the Courtroom: An Attempt for Clarity

*Editor’s note: You can catch a lengthier discussion of this topic at our Jan 29th session of Neuroscience and Neuroethics in the News.



When people think about functional magnetic resonance imaging (fMRI) and the courtroom, they often imagine mind reading or colorful images of psychopathic brains. Portable fMRI machines capable of reading our personal thoughts pop into our heads and arouse a fear that one day a neuroscientist could reasonably discern our deepest secrets through a brain scan. Despite recent scholarship suggesting that a world filled with covert fMRI lie detection devices is far away (if ever attainable), I think further attention should be paid to how people think about neuroscience and interpret scientific information that draws on brain-laden language, particularly in the courtroom (Farah, Hutchinson, Phelps, & Wagner, 2014). This topic is of special interest to me as it is the focus of my undergraduate research thesis. I also think it should be relevant to neuroscientists, ethicists, and journalists, because the way in which people interpret and understand aspects of the brain and human behavior is perhaps a consequence of how such information is portrayed to the public.






Photo from Ali, Lifshitz, & Raz, 2014

The seductive allure of neuroscience information has captivated many researchers as brain imaging and neural explanations begin to seep into the legal realm and fascinate the media (Jones, Wagner, Faigman, & Raichle, 2013). This idea—the seductive allure hypothesis—refers to the notion that people treat neurological justifications of behavior as a marker of a sound explanation for an action or tendency, regardless of the quality of the information (Weisberg, Keil, Goodstein, Rawson, & Gray, 2008; McCabe & Castel, 2008). The question of whether people are captivated by neural information has largely shifted into a debate about the persuasive and informative value of brain images (Farah & Hook, 2013). Some of this research has involved measuring the impact of brain images on verdicts and punishment determinations in legal cases (Schweitzer, Saks, Murphy, Roskies, Sinnott-Armstrong, & Gaudet, 2011). The results of these studies are largely mixed, and many findings fail to corroborate the seductive allure hypothesis, that is, they do not show that neurological explanations are particularly compelling (Roskies, Schweitzer, & Saks, 2013). Should this lead us to believe that the debate surrounding the persuasiveness of brain images is over?



I think not. In an effort to infuse some clarity into the neuro-seduction debate, I will discuss two overarching questions that I find particularly relevant to this ongoing discussion: first, what precisely does it mean for information to be neuroscientific? And second, assuming that neuroscience has some potential to unduly sway people, is it reasonable to assume that neuroscience has equal pull among people with differing beliefs about the mind, the brain, neuroscience, and psychology generally? I will attempt to address both of these issues below.

   

Prefrontal cortex, impulse control, brain images—oh my!

Lots of explanations, pictures, graphs, journal articles, and books could count as neuroscience. Take, for instance, an fMRI brain image of a person with psychopathy: to a general audience, such a picture could convey many different types of information, ideas, or concepts about the mind and the brain. To some, an fMRI image may suggest that a particular pathology is "real" or that someone's deviant or anti-social behavior is "hardwired" in the brain. Now take a lawyer merely describing adolescents as a particularly impulsive bunch because of the delayed development of their prefrontal cortex. This latter form of argumentation may convey very similar ideas to certain people, just as the image does in the former example, even though it does not rely on an image per se.

   

In studies examining the persuasive power of brain images, we need to be careful not to conflate the power of an explanation with the power of an image. In other words, if we are going to argue that neuroscience is unduly persuasive, we need a better conceptualization of what it means to be neuroscientific, and I think neuroscience is much more than just fMRI images. The distinction between explanation and image is of particular relevance; one less discussed yet fairly consistent finding is that neurological information (which I will refer to as neuro-information) tends to affect people's judgments, such as a defendant's guilt, an article's scientific credibility, or a supposed criminal's deserved punishment (Weisberg et al., 2008; Schweitzer, Saks, Murphy, Roskies, Sinnott-Armstrong, & Gaudet, 2011; Michael, Newman, Cumming, & Garry, 2013; Roskies et al., 2013). Given this finding, it is still not clear what part of the explanation (e.g., the neuro-language, the image, or both) sways people to think that neuroscience tells us something above and beyond what the explanation itself provides.

   

This lack of clarity is particularly problematic insofar as a lengthy debate surrounding the admission of brain images as evidence has also unfolded in recent years (Morse, 2014). In my opinion, the role of just plain ol' brain-sounding language has been overshadowed by the debate about the glitziness of brain images. I am not attempting to provide a solution to this definitional and conceptual conundrum; however, given the disparate and sometimes confusing findings within this area of research, I do think it would be erroneous to conclude either that brain images are not biasing at all or that all neuroscience possesses unparalleled persuasive power.






Image from BosLaw



Shouldn’t individual differences matter? 

It is also important to consider whether all people are truly likely to be swayed by neural language. Researchers have yet to fully explore whether certain people are particularly compelled by it. One study has examined differences by education level, but other factors may influence one's likelihood of falling prey to inaccurately interpreting neural information, such as prior beliefs about neuroscience and the motivation to confirm those beliefs (Weisberg et al., 2008; Scurich & Shniderman, 2014). For instance, people often differ in how they conceptualize psychology. I could see how people who think psychology lacks scientific rigor may be inclined to believe that neuroscience offers a greater opportunity to understand behavior. Similarly, for some people, the motivation to confirm or disconfirm the position that a neuroscientific explanation supports may matter. In an interesting variation on these neuro-seduction studies, a group of researchers had people rate the validity of an article describing how neuroscience could or could not support the notion that the death penalty deters people from committing crimes (Scurich & Shniderman, 2014). The authors found that people tended to give more favorable ratings to a neuroscientific article when it supported their initial beliefs about the death penalty. Overall, it seems unlikely that neuroscience or neuro-images have the power to overwhelmingly persuade everyone in all circumstances or to overturn existing beliefs.



What should we do?

The jury is still out on the influence of brain imaging in the courtroom. This area of research continues to grow and change as people devise nuanced ways to test why brain images may change behavioral outcomes and who is most likely to succumb to the seductive power of brain information or brain images. Nonetheless, this research has the potential to impact our legal system. Ultimately, aside from addressing my two aforementioned questions, I think it is important for scientists of all disciplines to continue attempting to explain findings regarding the brain and behavior in the clearest terms possible. As more people learn about what brain imaging and brain information can tell us about behavior, we as researchers must be ever aware of the potential for our findings to be misconstrued by the public or in the courtroom.



References




Farah, M. J., & Hook, C. J. (2013). The seductive allure of “seductive allure”. Perspectives on Psychological Science, 8(1), 88-90.



Farah, M. J., Hutchinson, J. B., Phelps, E. A., & Wagner, A. D. (2014). Functional MRI-based lie detection: scientific and societal challenges. Nature Reviews Neuroscience, 15(2), 123-131.



Jones, O. D., Wagner, A. D., Faigman, D. L., & Raichle, M. E. (2013). Neuroscientists in court. Nature Reviews Neuroscience, 14(10), 730-736.



Morse, S. J. (2014). Brain imaging in the courtroom: the quest for legal relevance. AJOB Neuroscience, 5(2), 24-27.



Roskies, A. L., Schweitzer, N. J., & Saks, M. J. (2013). Neuroimages in court: less biasing than feared. Trends in Cognitive Sciences, 17(3), 99-101.



Saks, M. J., Schweitzer, N. J., Aharoni, E., & Kiehl, K. A. (2014). The impact of neuroimages in the sentencing phase of capital trials. Journal of Empirical Legal Studies, 11(1), 105-131.



Schweitzer, N. J., Saks, M. J., Murphy, E. R., Roskies, A. L., Sinnott-Armstrong, W., & Gaudet, L. M. (2011). Neuroimages as evidence in a mens rea defense: No impact. Psychology, Public Policy, and Law, 17(3), 357.



Scurich, N., & Shniderman, A. (2014). The selective allure of neuroscientific explanations. PLoS ONE, 9(9), e107529.



Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470-477.








Want to cite this post?




Marshall, J. (2015). Neuroscience in the Courtroom: An Attempt for Clarity. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2015/01/neuroscience-in-courtroom-attempt-for.html

Tuesday, January 20, 2015

“Believe the children”? Childhood memory, amnesia, and its implications for law

How reliable are childhood memories? Are small children capable of serving as reliable witnesses in the courtroom? Are memories that adults recall from preschool years accurate? These questions are not only important to basic brain science and to understanding our own autobiographies, but also have important implications for the legal system. At the final Neuroscience, Ethics and the News journal club of the 2014 Fall semester, Emory psychologist Robyn Fivush led a discussion on memory development, childhood amnesia, and the implications of neuroscience and psychology research for how children form and recall memories.




This journal club discussion was inspired by a recent NPR story that explored the phenomenon of childhood amnesia. Why is it that most of us cannot form long-term memories as infants, at least not in the same way that we can as adults? This fundamental question has fascinated many researchers, and psychologists and neuroscientists today are tackling it in innovative ways. Even adult memory of the recent past is not nearly as reliable as most people (and jurors) believe [1], and while 2-year-old children can report long-term memories from several months prior [2], adults typically cannot recall memories from before age 3.5. The emergence of autobiographical memory may arise from the realization of the self (~2 years) and the acquisition of language skills, but it seems to happen gradually. Childhood amnesia may actually be the result of a slow conversion to recalling self-experienced episodes rather than just events themselves [3].






Via medimoon.com



However, the general public has been shown to have a rather poor understanding of memory [1], perhaps due to "common sense" beliefs and cultural traditions. These common sense and cultural notions are deep-seated and may even have more influence in our society than the latest research, especially if those findings are not effectively communicated to the public. In fact, there is significant disagreement between memory experts and judges, jurors, and law enforcement on the reliability of childhood memories recalled by adults [4]. For example, nearly 70% of experts surveyed agreed that "Memories people recover from their own childhood are often false or distorted in some way," but only about 30% of jurors thought that statement was true [3].




Our collective understanding of childhood memory has had profound effects on American culture and society. The 1980 bestseller Michelle Remembers (Smith and Pazder) set off what Dr. Fivush described as a moral panic that swept the country in the 1980s and 1990s. The book was presented as a nonfiction account of Michelle Smith's repressed childhood memories of satanic ritual abuse (SRA), which had been uncovered by her therapist (and eventual husband) Lawrence Pazder. In the years that followed, two major criminal cases were colored by the hysteria: the prosecution of the "West Memphis Three," where victim mutilation was presented as evidence of SRA (when in fact it may have been due to animal predation), and the lengthy and exorbitantly expensive McMartin Preschool Trials, which ended with no convictions. More recently, the first season of HBO's drama True Detective (2014) may in fact be guilty of introducing a whole new generation to this discredited yet endlessly intriguing conspiracy theory.






Michelle Remembers



A key component of this chapter in American history is the media coverage of these events. Some have pointed out that powerful voices in the media were not nearly as skeptical of these allegations as they perhaps should have been. Of course, this is a difficult line to walk in cases of alleged abuse – nobody wants a victim to endure additional suffering – but there is also a risk of innocent lives being ruined by unsubstantiated claims. Dr. Fivush discussed how questioning techniques that involved leading questions, coercion, and suggestion – tactics that are particularly apt to produce inaccurate testimony in children – were used in the McMartin Preschool Trials and have since been replaced with more accurate, evidence-based methods. There is also the issue of whether experts are willing to testify. Dr. Fivush gave an example of a different case in which an attorney told her that if she did not agree to serve as an expert, then their next choice was a school guidance counselor without any recent research experience. To the jury, both might have appeared to be equally reliable experts on the topic of childhood memory.




Early childhood memory is a particularly salient example of a topic that has been raised many times on this blog – the gaps that often exist between current views among experts in the field, what is reported by major news outlets, and what the public believes. Less accurate information seems to make it across each gap, like a game of telephone being played at a Drive-By Truckers concert.




It doesn't necessarily have to be this way. Effective communication of science has received a great deal of attention and will hopefully continue to improve. However, as Dr. Fivush pointed out, it is also important for researchers to be willing to serve as experts in legal proceedings to ensure that their work and that of their field are interpreted faithfully and accurately.












[1] Lacy, J. W., & Stark, C. E. L. (2013). The neuroscience of memory: implications for the courtroom. Nature Reviews Neuroscience, 14(9), 649-658.




[2] Fivush, R., Gray, J. T., & Fromhoff, F. A. (1987). Two-year-olds talk about the past. Cognitive Development, 2, 393-409.




[3] Nelson, K., & Fivush, R. (2004). The emergence of autobiographical memory: a social cultural developmental theory. Psychological Review, 111(2), 486-511.




[4] Howe, M. L. (2013). Memory development: implications for adults recalling childhood experiences in the courtroom. Nature Reviews Neuroscience, 14(12), 869-876.








Want to cite this post?




Purcell, R. (2015). "Believe the children"? Childhood memory, amnesia, and its implications for law. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2015/01/believe-children-childhood-memory.html

Tuesday, January 13, 2015

Ethical Issues in Neurosurgery: A Special Issue of Virtual Mentor

This month the American Medical Association's journal Virtual Mentor published a series of articles about ethical issues pertaining to neurosurgery. Some of the articles include discussions of deep brain stimulation in early-stage Parkinson's disease, simulation and neurosurgery teaching tools, and integrating ethics into science education. The special issue also featured two members of the American Journal of Bioethics Neuroscience editorial team: editor-in-chief Dr. Paul Root Wolpe and editor Dr. John Banja. The issue was guest edited by Jordan Amadio, a neurosurgical resident at Emory University. Click here to view the special issue.

















Emerging Ethical Issues in Neurosurgery: An Interview with Dr. Wolpe by Dr. Jordan Amadio





"The single most important thing to remember is that when we intervene in the brain it is a completely different kind of intervention than when we intervene in any other part of the body. It has the potential of altering those aspects of ourselves that we think of as most human--our personalities, our ability to communicate, and our subjective world. When we begin to think about something like deep brain stimulation, which has been shown to induce personality shifts, or when we talk about adding some sort of information technology to our brain and processes, or when we talk about the potential of chemically altering the brain through psycho-pharmaceuticals, ... it is a different qualitative kind of shift in the patient than if we were for example intervening in the function of their kidney, their heart, or their liver."



Listen to more of the interview here.

_________________________________________________________________________



Disclosure of Experience as a Risk Factor in Informed Consent for Neurosurgery: The Case of Johnson v. Kokemoor

By Dr. John Banja



"A problem that has bedeviled both medical law and medical ethics for decades concerns the scope of risk-related information that a health professional should provide to patients, especially when that information involves the health professional’s experience and success rates with a certain procedure. For example, I take a rather perverse delight in provoking medical students with the following line of questioning: “How do you determine what risks you’re going to disclose to a patient?”"



View the rest of the article here.






Want to cite this post?




Marshall, J. (2015). Ethical Issues in Neurosurgery: A Special Issue of Virtual Mentor. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2015/01/ethical-issues-in-neurosurgery-special.html

Tuesday, January 6, 2015

Neuroscience and Human Rights


Last month, I had the privilege of attending the International Neuroethics Society Meeting in Washington, DC, made possible by a travel award from the Emory Neuroethics Program. This year's meeting featured panelists from diverse backgrounds: government, neuroscience, ethics, law, engineering, public health, and others. Each participant and attendee offered her unique perspectives on topical issues in neuroethics.



As I listened to many thought-provoking presentations and discussions, a question kept arising in my mind: to what extent should scientists engage with issues of social justice if their research findings support changes in public policy? As a "war on science" continues to be waged by members of the U.S. Congress (see Senator Coburn's 2014 "Wastebook," and the recent NPR Science Friday response by targeted scientists) and the American public lags in scientific literacy (an NSF report this year found that 1 in 4 Americans think the sun orbits the earth), this question carries a particular sense of urgency. Isn't science supposed to support human flourishing and maximize our well-being, as the American Association for the Advancement of Science puts it, "for the benefit of all people"? How accountable should scientists be in ensuring that this actually happens, beyond the scope of their laboratories?



My reflections on these questions were ignited by a fascinating example of how neuroscience can inform policy, provided by Katy de Kogel of the Dutch Ministry of Justice. Dr. de Kogel spoke of recent shifts in Dutch criminal law that reflect neuroscientific consensus: the neural substrates that support decision-making are not fully "online" in the developing, adolescent brain. In contrast to United States legal code, which specifies that individuals above the age of 18 be prosecuted as adults, thus barring them from legal protections offered to minors, Dutch courts have incorporated the scientific understanding of neurodevelopment into their criminal code by raising the age at which individuals may be tried as minors from 18 to 22 years of age. Criminological research findings support this change, as minors housed in adult detention centers tend to have higher rates of recidivism than those detained in juvenile centers. In my view, this is a refreshing and somewhat unexpected example of how society can benefit from advancements in neuroscience. We often think of science producing technological or medical innovations that improve our lives, rather than ancillary benefits like this one, which are impossible to foresee at the outset of a project.






Katy de Kogel of the Dutch Ministry of Justice (Courtesy of Dr. Gillian Hue)



The next panel discussion, themed "Neuroscience and Human Rights," provided another example of how neuroscience and society can intersect. One of the participants, Dr. Mariana Chilton of Drexel University, presented her research on food insecure communities here in the United States. Food insecurity is defined by the USDA as "a household-level economic and social condition of limited or uncertain access to adequate food." Dr. Chilton's talk highlighted the striking prevalence of food insecurity among American children: an astonishing 21.6%, according to a 2014 report by the non-profit organization Feeding America. She continued by pointing out the overwhelming epidemiological evidence linking early-life malnutrition and its associated psychosocial stressors to adverse health outcomes in adulthood. These negative outcomes include, but are certainly not limited to, impairments in social skills, language development, emotional self-regulation, and problem-solving abilities, derived from the neurodevelopmental deficits that accompany macro- and micronutrient deficiencies. With billions of dollars spent annually on psychiatric medications and innumerable losses in productivity due to mental health issues such as Attention Deficit Hyperactivity Disorder (ADHD) and depression, I wondered whether the rising prevalence of childhood malnutrition could be a key mechanism by which our national public mental health crisis has arisen. (For context on the American mental health crisis, the CDC estimates that only 17% of US adults are considered to be in a state of optimal mental health.)



Dr. Chilton's presentation continued with a neuroethically minded suggestion that inadequate US policies to support childhood nutrition and healthcare constitute a human rights violation. First, she cites the 1989 United Nations Convention on the Rights of the Child (UNCRC), which asserts that "the child, by reason of his physical and mental immaturity, needs special safeguards and care, including appropriate legal protection, before as well as after birth," and which supports the child's right "to the enjoyment of the highest attainable standard of health," as outlined in Article 24 of the Convention. Second, Chilton points out that contemporary findings from developmental neuroscience and maternal and child health epidemiology unequivocally demonstrate that nutritional deficiencies do not support the "highest attainable standard of health" for children, as advocated by the UNCRC. The implication is that, with such a large proportion of American children undernourished, publicly funded programs such as the Special Supplemental Nutrition Program for Women, Infants and Children (WIC), the Supplemental Nutrition Assistance Program (SNAP), and the National School Lunch Program (NSLP) may be inadequate to fully address their needs. Therefore, the inalienable human rights of the child to flourish are, in my estimation, jeopardized by US policy (or lack thereof). As an aside, I was astonished to learn that of the 194 United Nations member states that signed the treaty, only 3 have failed to ratify it: Somalia, South Sudan, and the United States.








Dr. Mariana Chilton speaking at the "Neuroscience and Human Rights" Panel (Courtesy of Dr. Gillian Hue)



By suggesting that human rights can serve as a rationale for changes in US social policy, Dr. Chilton contributes a novel and persuasive approach to political arguments around issues like access to food, housing, and early education. Arguments against empirically validated government programs that support childhood nutrition and health often revolve around the economic and political views of their proponents, with both sides of the political divide overlooking human rights concerns. One example is the justification provided by Rep. Paul Broun (R-GA) in 2011 for his proposed 10% cut in federal funding for WIC: he suggested that the measure would "save us from spending hundreds of millions of dollars we don't have," and that WIC is "seemingly designed to hold a section of the population in limbo rather than helping them grow out of poverty." I concede Rep. Broun's obvious point that budget cuts would save money; however, there is extensive empirical support for the beneficial effects of the WIC program on short- and long-term health outcomes. While more efficient approaches to supporting childhood nutrition may exist, such as reducing healthcare costs for families and increasing the federal minimum wage, political rhetoric and discourse on public health policies should consider their implications for human rights.



Returning to my original question on the extent to which scientists should engage in conversations with policymakers and the public, I would argue that they do in fact have an ethical responsibility to do so. While it may not be the responsibility of scientists to advocate for specific policies per se, I personally think that they and their academic institutions, as publicly funded entities, have an ethical responsibility to communicate their research findings, and particularly their broader implications, to the public. Unfortunately, one of the biggest obstacles that even the most well-intentioned, socially engaged scientist faces is the lack of time or incentive to get involved. As NIH and NSF pay-lines continue to decline, scientists spend more of their valuable time competing for grant funding, and the institutions within which scientists serve continue to weigh a researcher's "success" by grant and publication record rather than by activities like civic engagement. I challenge colleges and universities to refine their evaluation of science faculty to be more in accord with the goals of the AAAS. For example, how has Dr. X "provided a voice for science on societal issues," or "promoted the responsible use of science in public policy"? I think most scientists would support such a change.



As Dr. Chilton demonstrated at INS 2014, neuroscientific research findings can be incorporated into human rights arguments for updating US social policies and legal statutes. Knowing what they know about human health and disease, some scientists may be particularly well positioned to advocate for social change. It is incumbent upon the public and their elected officials to honestly ask themselves, for example, whether they are taking adequate measures "to combat disease and malnutrition…through the provision of adequate nutritious foods and clean drinking-water," as stated in the UNCRC. Whether these discussions are initiated by neuroscientists who understand that the adolescent prefrontal cortex is not yet fully developed, with consequences for impulse control, or by those who understand that early-life deprivation has profound, persistent effects on the human capacity to flourish, someone must speak and act for members of society who are not empowered to do so for themselves.





Want to cite this post?




Kohn, J. (2015). Neuroscience and Human Rights. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2015/01/neuroscience-and-human-rights.html