By Jonah Queen
Image courtesy of Pixabay.
As neurotechnology advances and our understanding of the brain deepens, there is a growing debate about whether, and how, neuroscience can play a role in the legal system. In particular, some are asking whether these technologies could ever be used to accomplish things that humans have so far been unable to do, such as performing accurate lie detection and predicting future behavior.
For September’s Neuroethics and Neuroscience in the News event, Dr. Eyal Aharoni of Georgia State University spoke about his research on whether biomarkers might improve our ability to predict the risk of recidivism in criminal offenders. The results were published in a 2013 paper titled “Neuroprediction of future rearrest,”1 which was reported in the media with headlines such as “Can we predict recidivism with a brain scan?” The study reports evidence that brain scans could potentially improve offender risk assessment. At the event, Dr. Aharoni led a discussion of the legal and ethical issues that follow from such scientific findings. He asked: “When, if ever, should neural markers be used in offender risk assessment?”
Dr. Aharoni started by explaining that determining the risk an individual poses to society (“risk triage”) is an important part of the criminal justice system and that it is used when making decisions around bail, sentencing, parole, and more. He presented the cases of Jesse Timmendequas and Darrell Havens as opposite extremes of what can happen when risk is miscalculated. Timmendequas is a repeat sex offender who had served less than seven years in prison for his crimes and had not been considered a serious threat before he raped and murdered a seven-year-old girl, a crime which led to the passing of Megan’s Law. Havens, a serial car thief, is serving a 20-year prison sentence for assaulting a police officer, despite being rendered quadriplegic after being shot by police, because parole boards are reluctant to grant him an early release due to his extensive criminal history.
Risk triage is currently done in two ways: through unstructured clinical judgment, in which a clinician offers his or her opinion based on an interview with the subject, and through the more accurate evidence-based risk assessment, which weighs known risk factors such as age, sex, criminal history, drug use, impulsivity, and level of social support. Dr. Aharoni and the other authors of the paper propose that neurological data could potentially be introduced as an additional risk factor to help improve the accuracy of such assessments1.
With the understanding that impulsivity is a major risk factor for recidivism2, the researchers focused their study on the anterior cingulate cortex (ACC), a limbic brain region shown to be heavily involved in impulse control and error monitoring (in fact, behavioral changes in people with damage to the ACC are often extreme enough for those individuals to be classified as having an “acquired psychopathic personality3”).
In Aharoni’s paper1, the volunteers (96 incarcerated adult men) performed a go/no-go (GNG) task (which tests impulse control) while their ACC activity was monitored with functional magnetic resonance imaging (fMRI, which measures changes in blood flow within different regions of the brain; an increase in blood flow is taken to mean that a region has increased neural activity). The researchers found that participants with greater ACC activation during impulse-control errors were about half as likely to be rearrested within four years of their release (when controlling for other factors such as age at release, Hare Psychopathy Checklist scores, drug and alcohol use, and performance on the GNG task). In other words, the study suggests that, when used in conjunction with currently recognized risk factors, the fMRI data improved the accuracy of the risk assessment. The authors conclude that this finding “suggest[s] a possible predictive advantage” of including the neurological data in risk assessment models1.
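To make the modeling idea concrete, here is a minimal sketch of how a neuromarker could be folded into a risk model as one more covariate, and how its added predictive value might be checked. It uses synthetic data, hypothetical variable names (acc_activity, psychopathy, and so on), and ordinary logistic regression purely for illustration; it is not the dataset or the statistical pipeline used in the paper.

```python
# Illustrative sketch only: synthetic data and hypothetical variables,
# not the study's actual dataset or statistical method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical baseline risk factors (age at release, psychopathy score, drug use).
age = rng.normal(35, 10, n)
psychopathy = rng.normal(20, 7, n)
drug_use = rng.integers(0, 2, n)
# Hypothetical neuromarker: ACC activation during impulse-control errors.
acc_activity = rng.normal(0, 1, n)

# Simulate rearrest so that lower ACC activity raises risk on top of the other factors.
logit = (-1.0 - 0.03 * (age - 35) + 0.05 * (psychopathy - 20)
         + 0.5 * drug_use - 0.8 * acc_activity)
rearrest = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_base = np.column_stack([age, psychopathy, drug_use])
X_full = np.column_stack([age, psychopathy, drug_use, acc_activity])

Xb_tr, Xb_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
    X_base, X_full, rearrest, test_size=0.3, random_state=0
)

baseline = LogisticRegression(max_iter=1000).fit(Xb_tr, y_tr)
with_marker = LogisticRegression(max_iter=1000).fit(Xf_tr, y_tr)

print("AUC, baseline factors only:   ",
      roc_auc_score(y_te, baseline.predict_proba(Xb_te)[:, 1]))
print("AUC, baseline plus neuromarker:",
      roc_auc_score(y_te, with_marker.predict_proba(Xf_te)[:, 1]))
```

The comparison is the point: any benefit of a neuromarker has to show up as an improvement over the currently recognized risk factors alone, which is the sense in which the authors describe a possible “predictive advantage.”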
Image courtesy of Flickr user Janne Moren.
After emphasizing the need for additional research, the authors discuss several possible applications for these “neuromarkers.” One of the more controversial ones (and the one the media has mostly focused on) is to add neuromarkers (such as ACC activity during a GNG task) to the other factors currently used for risk triage in the criminal justice system. The authors recognize that this would raise ethical and legal issues, specifically that such scans might not meet the legal standard of proof and that such techniques might threaten offenders’ civil rights in ways that currently used risk assessment methods do not.
In his presentation, Dr. Aharoni expanded on some of these concerns, focusing on the scientific limitations, legal limitations, and ethical implications of this research. The scientific limitations concern the accuracy and replicability of the method and the general question of whether current neuroimaging techniques can provide useful data for criminal risk assessment. The legal limitations include questions of how and when such methods could legally be used. Would they be legally admissible, or would certain uses be found unconstitutional? Would the results of a brain scan be legally classified as physical evidence (which, under the Fourth Amendment, can be obtained with a warrant) or as testimony (which, under the Fifth Amendment, an individual cannot be compelled to give against himself or herself)? Similar questions are being asked about fMRI lie detection.
And then there are the ethical implications. Using such a technique to keep people incarcerated who otherwise would not be (by lengthening sentences or denying parole, for example) worries many and risks violating offenders’ civil rights in an attempt to increase public safety. Dr. Aharoni mentioned that neuromarkers could also be used in an offender’s best interests if, for example, MRI data suggested that the person was less likely to reoffend. An audience member pointed out, though, that this could be unfair to people whose brain data does not help their case.
Another application that the authors mention is that this research could pave the way for possible interventions (including therapies, programs, and medications) for people with poor impulse control linked to low ACC activity. This could still raise concerns about convicted offenders being required to undergo medical treatments (such as medication or even surgery) if their criminal activity is thought to be caused by “defective” brain regions. And even if no practical applications come of this research, the authors point out that their findings still contribute to our understanding of the brain and human behavior.
Image courtesy of Wikimedia Commons.
Media outlets that reported on the study mostly focused on the predictive aspect, often referencing the film Minority Report, in which people are arrested for crimes they have not yet committed. Dr. Aharoni explained that incarcerating people based on their likelihood of reoffending already happens in cases of involuntary civil commitment, where defendants found not guilty by reason of insanity can be confined to psychiatric hospitals until they are deemed safe. If neuromarkers such as brain scans were used to improve the accuracy of these predictions, it might not be as radical a change as it seems.
But still, as explained above, even if brain scans were incorporated into the predictive models currently in use, doing so would raise many ethical issues. And things could become even more worrisome if this technology were (mis)used in ways the researchers did not intend and the science does not support. For example, the criminal justice system could buy into the hype around brain imaging and develop a process that looks only at the scans and not at the other factors. Scans could also be performed, possibly involuntarily, on people who have not committed any crime, to determine whether they need “monitoring” or “treatment” (a scenario much closer to Minority Report). Even without any intervention, there is the issue of stigma, as there is with testing for predisposition to mental illness. If someone is found to have a “criminal brain,” how would others view them? How would they view themselves? An audience member also raised the possibility of this technology being used in the private sector. Companies already offer MRI lie detection services; what if one were to start testing people for a predisposition to criminal behavior?
In the paper, the authors admirably discuss the ethical issues that could arise from their research. And the discussion Dr. Aharoni led at the event showed the importance of examining controversial research such as this critically and in context, in order to avoid sensationalist claims and unfounded fears. Not only is it important to make sure the science behind new neurotechnologies is sound, but we also need to consider their societal effects, whether or not they are used in the ways their creators intended.
References
1) Aharoni, E., Vincent, G. M., Harenski, C. L., Calhoun, V. D., Sinnott-Armstrong, W., Gazzaniga, M. S., & Kiehl, K. A. (2013). Neuroprediction of future rearrest. PNAS, 110(15), 6223-6228. doi:10.1073/pnas.1219302110
2) Monahan, J. D. (2008). Structured risk assessment of violence. In Simon, R., & Tardiff, K. (Eds.), Textbook of Violence Assessment and Management (pp. 17–33). Washington, DC: American Psychiatric Publishing.
3) Devinsky, O., Morrell, M. J., & Vogt, B. A. (1995). Contributions of anterior cingulate cortex to behaviour. Brain, 118(Pt 1), 279–306. doi:10.1093/brain/118.1.279
Want to cite this post?
Queen, J. (2017). Too far or not far enough: The ethics and future of neuroscience and law. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2017/10/too-far-or-not-far-enough-ethics-and.html