Wednesday, November 23, 2011

Lie Detection and the Jury



Much virtual and actual ink has been spilled of late about the dangers of rushing to bring brain-imaging technologies into the courtroom.  Not only neuroskeptics,[1] but also preeminent neuroscientists,[2] have urged caution when it comes to the prospect of fMRI data being admitted as trial evidence.  And brain-based lie detection, as one of the most alluring areas of imaging research, has in particular come in for a great deal of hand-wringing.











These portents of doom are perhaps even more premature than would be the use of fMRI “polygraphy” as evidence.  Worrying now about that prospect is a bit like throwing out the bathwater before the baby has even gotten into the tub.  While it’s true that a few ill-informed judges have made a few ill-conceived decisions along these lines (and those mostly in India, not the United States), the vast weight of judicial precedent, procedure, and practice makes it overwhelmingly likely that courts will move too slowly, rather than too fast, in admitting new techniques of lie detection.  As a rule, courts are exceptionally wary of any kind of evidence that they view as usurping the function of the jury, and conventional wisdom has it that the primary function of the jury is to determine the credibility of witnesses.  Although most jurisdictions exclude polygraph evidence on the ground that it is not sufficiently reliable, in truth it is at least as reliable as much other “scientific” evidence[4] that is routinely admitted under Daubert,[5] the Supreme Court case that governs admissibility of scientific evidence in the federal courts and the majority of state jurisdictions.  But lie detection evidence speaks to witness credibility and thus it is the poster child for usurpation of the jury’s role.  While there may come a day when brain-based lie detection becomes so reliable that courts can no longer rely on Daubert to keep it out, history suggests that they will avoid that day as long as they can.



 In an article on fMRI lie detection and the role of the jury,[6] I entertained a thought experiment that asked what would happen if fMRI lie detection were to become as reliable as DNA evidence, currently considered the “gold standard” of scientific evidence.  If we could easily see whether witnesses were testifying truthfully at trial, would that make the jury obsolete?  What, I wondered, were judges so afraid of in opening the door to polygraph and other techniques of assessing witness credibility?





I think the answer goes something like this: Our current criminal justice system hides what the jury does inside a black box. Rules of jury secrecy, combined with rules of evidence and procedure, ensure that most verdicts are unimpeachable on the facts. In other words, so long as there is some evidence against a criminal defendant, a reviewing court can (and will) assume that a guilty verdict reflects a jury’s assessment of witness credibility and not, for example, racial bias, failure to follow jury instructions on the law, or even simple mistake. The Supreme Court has written, affirming a conviction in a case involving allegations of extensive drug and alcohol use by jurors during the trial, that “[i]t is not at all clear . . . that the jury system could survive such efforts to perfect it.”[7]  Instead, all of the evidence goes into the jury room, where it gets shaken up and – abracadabra! – out comes the verdict. If we knew that particular witnesses were lying or telling the truth, it would be much harder to accept certain verdicts without drastically reimagining what it is that we think juries are supposed to be doing.




The Troy Davis case[8] provides a depressing example of this dynamic. Despite the claims by advocates on either side of the debate, the evidence[9] in the case was ambiguous and murky. The answer to the question of guilt or innocence depends wholly on which witnesses were telling the truth and which were lying, because it’s clear that someone must have been lying. Since we can’t look into their hearts and minds, we have to rely on some fact-finder to answer this question; in our system, and usually for very good reason, that fact-finder is the jury.



The jury may very well have gotten it wrong.  There surely was much reason – in the form of witness recantations and evidence of police pressure – to doubt whether the jury got it right.  But in the end, the “reasonable doubt” that many observers urged came from the (very reasonable) suspicion that the main witness against Troy Davis was probably lying.  The main problem is that as long as a reviewing court could hold that it was not irrational to believe this witness and to disbelieve the testimony of the defendant, there would be no real basis for overturning the conviction.







In the end, human beings (including jurors) are not very good lie detectors. Indeed, we do little better than a coin flip, though we tend to believe we are very skilled at detecting deception through demeanor and other clues. Though brain-imaging-based lie detection is still in its infancy, it may someday be capable of supplementing our inadequate abilities in this area. Though we should be cautious – and courts are very cautious – we should also consider the relative reliability of alternative available techniques, including that lowest-tech of all techniques: the unaided jury. And we should ask ourselves whether it makes sense to enshrine the lie detection role of the jury, in all of its glorious imperfection, at the expense of considering techniques that could help the jury to fulfill this role as well as its other important roles in our system of justice.




--Julie Seaman, PhD

Associate Professor of Law, Emory University





Want to cite this post?


Seaman, J. (2011). Lie Detection and the Jury. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2011/11/lie-detection-and-jury.html





[1] http://neuroskeptic.blogspot.com/2009/09/fmri-gets-slap-in-face-with-dead-fish.html

[2] http://www.nytimes.com/2011/11/01/science/telling-the-story-of-the-brains-cacophony-of-competing-voices.html?_r=2&scp=1&sq=gazzaniga&st=cse

[3] http://singularityhub.com/2010/05/06/another-attempt-to-use-fmri-lie-detector-in-us-court-fails-in-brooklyn-more-on-the-way/

[5] http://www.law.cornell.edu/supct/html/92-102.ZS.html

[6] http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1352648

[7] http://scholar.google.com/scholar_case?case=6212097998214052620&hl=en&as_sdt=2&as_vis=1&oi=scholarr

[8] http://www.huffingtonpost.com/2011/09/21/troy-davis-executed_n_975109.html



Friday, November 18, 2011

Project Guerrilla Science: Neurobiological Origins of Zombies!

The Neuroethics Program made some new friends at the Society for Neuroscience meeting (SfN). Project Guerrilla Science presents, *ahem*, research on necroneurology.
From the office of Bradley Voytek, Ph.D. (Post-doctoral Fellow, University of California, San Francisco) and Timothy Verstynen, Ph.D. (Post-doctoral Research Associate, University of Pittsburgh):

"HUMANS! ... We appreciate your interest in the zombie sciences. Necroneurology is the most exciting new thing to hit neuroscience since mirror neurons! We look forward to future possible collaborative opportunities. We encourage all scientists to take part in Project Guerrilla Science at next year's SfN. Be on the lookout for other fun (maybe even non-zombie!?) research projects in the future. This has been surprisingly fun and successful (and has garnered us way more media attention than our actual research... ::sigh::). Please send any and all brains you may encounter--zombie or otherwise!"

Sunday, November 13, 2011

Neuroethics Blog Post on CNN Blog by Dr. Paul Root Wolpe: No mind-reading allowed!

Director of the Center for Ethics at Emory University, Dr. Paul Root Wolpe puts his foot down on CNN's Belief Blog.






Dr. Paul Root Wolpe, neuroethics expert


"Throughout human history, the inner workings of our minds were impenetrable, known only to us and, perhaps, to God. No one could see what you were thinking, or know what you were feeling, unless you chose to reveal it to them."



Read more about it by following the link below.

My Take: Keep government out of mind-reading business

International Neuroethics Society: Summary of what you (may have) missed!

Greetings from DC!  The Neuroethics Program is busy hobnobbing with some of the world's most cutting-edge, interdisciplinary, and innovative thinkers at the International Neuroethics Society (INS) meeting!







In case you didn't get the chance to attend this year, here is a brief summary of what you missed. The full list of events can be seen here and featured events from Day 1 of this year's meeting can be seen here.



This year INS hosted its annual meeting at the Carnegie Institution for Science. Day 2 was an exciting and inspiring day of outstanding sessions, each highlighting some of the most pressing topics in the field of neuroethics.













The day opened with a panel on Neuroscience, National Security, and Society. The panel featured Jonathan Moreno, University of Pennsylvania; William Casebeer, DARPA; and James Giordano, Potomac Institute for Policy Studies. Dr. Moreno, author of the book Mind Wars, outlined the past, current, and potential uses of neuroscience research in national defense, weaving a narrative from the ingestion of cognitive-enhancing drugs, to external brain imaging, to more invasive brain-machine interfaces and beyond. Moreno noted that some experts had testified that the science is not "ready" for such applications; however, those whose primary goal is national defense might be more compelled by the reality that the technology is currently something a 20-year-old soldier could learn to use in 20 minutes. Dr. Giordano suggested that conversations about neuroscience and defense have moved beyond whether we ought to weaponize neurotechnologies to what we should do when neurotechnologies do become weaponized. He suggested that we should limit transparency to the general public about efforts to weaponize neurotechnologies and move forward with prudent communications in order to avoid inducing unwarranted mass panic. Dr. Casebeer ended on a more optimistic note, stating that while there is peril associated with the use of neurotechnologies for national defense, with careful measures to protect human flourishing and autonomy they hold the promise of helping to create a new generation of effective modes of neuro-defense.





The next session was led by self-described real-life cyborg and author of World Wide Mind, Michael Chorost. Dr. Chorost has cochlear implants that allow him to hear, and in his talk he challenged us to explore how neurotechnology could improve humanity. Chorost noted that there are approximately 2 billion computers being used by 2 billion internet users, but this is nowhere close to the 100 billion neurons and all their synaptic connections in the human brain. Chorost envisions a future where brain implants in otherwise healthy people might be used to make richer and more meaningful connections between people, thereby creating a deeper awareness of those around us. Chorost views the internet as humanity's evolutionary assistant.






Image from http://www.thesocializers.com
The third session was a panel on Neuroethics and Novel Treatment in Neuropsychiatry moderated by Barbara Sahakian, Cambridge University. The panel featured Husseini Manji, Johnson and Johnson Pharmaceutical; Helen Mayberg, Emory University; and Jorge Moll, D'Or Institute for Research and Education. Dr. Manji pointed out that, according to studies by the World Health Organization, 30% of the burden of disease by 2015 will be attributable to neuropsychiatric disorders, an ominous figure that translates into an enormous loss of society's most important capital: its cognitive and mental capital. This outlook highlights the importance of refining our techniques for defining and diagnosing prodromal periods for psychiatric disorders such as schizophrenia, and neurodegenerative diseases such as Alzheimer's disease, as well as developing novel interventions for the prodromal period. Helen Mayberg discussed her work using deep brain stimulation (DBS), an experimental intervention involving the surgical implantation of an electrode into the brain and a battery pack into the pectoral muscle of patients, to treat intractable major depression. She described the ethical problem of how physicians are to help patients who continue to need DBS after the pilot stages of experimentation, and she expressed ethical concerns about the evolving responsibility of the patient as symptoms resolve. Patients may be miraculously improved while on DBS, but will need to continue having their DBS hardware maintained (e.g., battery replacement) to preserve these beneficial effects over time. On the flip side, how will patients disengage from the stimulator should they no longer want the treatment? Grants for this research may cover the costs of implanting devices, but not of removing them. Finally, Jorge Moll discussed the implications of inferring cognitive and psychological states from neuroimaging, such as for diagnosing psychopathy.








Illustration by Jeffrey Decoster
The afternoon closed with a discussion of Real Cases in Law and Neuroscience moderated by Hank Greely, Stanford University. The panel included two attorneys, Steve Greenberg and Houston Gordon, along with Russell Swerdlow, a neurologist at the University of Kansas Medical Center. Greenberg is a criminal defense lawyer who attempted to use brain images to prevent his client, an individual who had repeatedly raped and murdered young girls, from receiving the death penalty. Greenberg argues that brain imaging evidence suggests that psychopaths have brain defects, or birth defects (since they might have been born with these "brain defects"), and asks how we should hold people who engage in criminal behavior accountable for their crimes, especially if they suffer from what is essentially a birth defect. Attorney Houston Gordon has also tried to submit neuroimaging data in criminal cases, but the judge handling his case stated that the science wasn't ready. Gordon believes, however, that the science is ready, stating that neuroimaging data are the products of an unbiased computer algorithm and multiple rigorous peer-reviewed studies. Finally, Dr. Swerdlow shared the story of a patient who had acquired pedophilia due to an enormous tumor growing from the base of his skull. When the tumor was removed, the patient's pedophilic tendencies subsided, and they returned when the tumor grew back. Swerdlow posed a number of questions based on this case: How hard-wired is decision-making, and how free is free will? And given this case as an example, are the current legal standards adequate?



If you missed this year's annual INS conference, don't despair.  INS is already planning its next conference so stay tuned and check back at neuroethicssociety.org for updates.



For those of you who attended the Society for Neuroscience meeting, please let us know if you'd like to share any work that struck a neuroethics chord in you by commenting below (we are especially interested in events we may have missed from Nov 14-16, but we welcome your notes from previous days at SfN).




Want to cite this post?


Rommelfanger, K. (2011). International Neuroethics Society: Summary of what you (may have) missed! The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2011/11/international-neuroethics-society_13.html

Saturday, November 12, 2011

International Neuroethics Society: Careers in Neuroethics Session

Greetings from Washington DC! The Neuroethics Program is on the road attending the International Neuroethics Society meeting and the Society for Neuroscience meeting.



Have you been wondering how to begin your journey toward a career in neuroethics?



The 2011 International Neuroethics Society (INS) Meeting featured a Neuroethics Careers Session.  INS meeting organizers, including Emory Neuroethics Program's Gillian Hue, put together a stellar panel of speakers including Alan Leshner, AAAS (American Association for the Advancement of Science); Paul Root Wolpe, Emory University; Emily Murphy, Stanford; and Hank Greely, Stanford.







"You enter the field almost always obliquely," Paul Root Wolpe of Emory told the audience. "You get into bioethics through a story."



To learn more about his story, see the summary of this panel discussion on the Dana Foundation's blog.

Monday, November 7, 2011

Neuroethics Playlist

We have put together a playlist of songs about neuroethics, the brain, and the mind. Below you will find a Prezi presentation that includes the music and brief descriptions of each of the songs.







Neuroethics Playlist on Prezi (Updated 4/15/2018)





Special thanks to the followers on our Facebook page for their helpful suggestions.


Wednesday, November 2, 2011

TED Talk: Trust, morality -- and oxytocin

"What drives our desire to behave morally? Neuroeconomist Paul Zak shows why he believes oxytocin (he calls it "the moral molecule") is responsible for trust, empathy and other feelings that help build a stable society."



For more read our previous blog post "Liquid Trust and Artificial Love" here.