Pages

Tuesday, November 29, 2011

Welcome Our Inaugural Neuroethics Scholars!

It is with great pleasure that the Emory Neuroethics Program announces its inaugural neuroethics scholars! The Neuroethics Program invited graduate students to create and join collaborative, interdepartmental faculty teams at Emory and in the Atlanta community to pursue Neuroethics scholarship. Graduate students were free to propose projects of interest to them; proposals included innovative ideas in teaching, empirical research, new media, and beyond. The selection process was quite competitive. By the completion of their one-year appointments, each scholar is expected to co-author a paper and present his/her work. Abstracts of their proposed projects can be found below.





Cyd Cipolla and Kristina Gupta (Innovative Neuroethics Teaching)












Cyd Cipolla and Kristina Gupta



We both work in the field of feminist science studies, a field that has challenged the gender biases of scientific knowledge. In her dissertation research, Cyd examines the role of religious, psychiatric, and popular representations in the creation of “violent sex offender” legislation in the United States, and the relationship between this criminal category and sexual identity categories. In her dissertation research, Kristina examines the interplay between scientific and medical approaches to “nonsexuality” and the efforts by some individuals to define “asexuality” as a sexual identity category. Through our research, we both became interested in the role that neuroscientific research plays in defining some types of sexuality as deviant or pathological and in influencing public understandings of certain types of sexuality.



Based on this interest, we applied to the Neuroethics Scholars Program both to increase our own knowledge about the field of Neuroethics and to contribute to this emerging field. As Neuroethics scholars, we will develop and teach a course during the spring of 2012 titled “Feminism, Sexuality, and Neuroethics.” The course is being offered through the Department of Women’s, Gender, and Sexuality Studies and is cross-listed with the Department of Neuroscience and Behavioral Biology. Students in this class will learn the major topics and themes within the field of Neuroethics through critically examining historical and contemporary scientific research on sexuality and the brain. We will cover a variety of topics, including homosexuality, sex/gender differences in sexuality, violent sexual offenses, sex addiction, sexual desire disorders, and monogamy. Students will read a scientific study or studies on the topic alongside reports about the study in news media outlets, and then follow this by reading critiques of the work from both inside and outside the scientific community. No previous experience with neuroscience research or sexuality research is required to take the class. Our goal is to enable students from all disciplines to understand the scientific research on its own terms, to develop the skills required to analyze the ethical implications of this research, and to develop an understanding of how neuroscientific research is conveyed to the public through media.



In addition to teaching the course, we plan to make our syllabus publicly available and to write an article reflecting on our experiences teaching the course. In this way, we will contribute to the resources available for teaching about Neuroethics. We are very excited about this opportunity and we look forward to sharing our experiences with you. We would also appreciate any feedback, suggestions, or advice you have to offer!

 





Jason Shepard (Innovative Empirical Neuroethics Research)








Jason Shepard



I am interested in exploring the links between beliefs in free will and pro- and anti-social behaviors. Some neuroscientists and psychologists claim that data from the brain and behavioral sciences provide evidence against the existence of free will. These claims range from the more modest (but still controversial) position that the data show our free will is much more limited than we suppose, to the much stronger position that the data show free will is an illusion. Such anti-free-will claims are no longer confined to the pages of academic journals; they have also been regularly making their way into the popular media.

In a separate line of research, psychologists have experimentally demonstrated that exposing people to texts claiming that free will is an illusion makes them more likely to cheat (Vohs & Schooler, 2008), less willing to help, and more aggressive (Baumeister et al., 2009). These findings raise important ethical questions, such as: If exposing people to anti-free-will texts can have deleterious effects on their behavior, might there be harmful social consequences when scientists publicly make anti-free-will claims? And if so, are there ethical constraints on those who might be tempted to make such claims publicly?

Though the current evidence suggests that these questions deserve serious attention, it does not yet justify an answer to them. From the current studies, it is not clear what specific mechanisms link reduced beliefs in free will to the behavioral changes, whether the results will generalize beyond the lab, or whether the behavioral effects will persist beyond a single testing session. All of these issues need to be adequately addressed in order to have a clear understanding of what exactly is at stake, whether the stakes warrant any proscriptive advice, and what exactly the content of that advice should be. To help answer these questions, I propose (1) to explore the specific mechanisms that can lead to reduced beliefs in free will at a finer grain than previous studies; (2) to test whether the results generalize to a wider range of ecologically valid measures of pro- and anti-social behaviors; and (3) to explore the time course of the behavioral effects.


Jason Shepard is a first-year psychology PhD student in the Cognition and Development Program at Emory, where he works in Phillip Wolff’s Cognition and Linguistic Systems Lab. He also holds an MA in philosophy with a concentration in neurophilosophy from Georgia State University. In addition to studying the behavioral effects of beliefs in free will, Jason studies intentional action, causal structure, and other related phenomena.







Want to cite this post?


Rommelfanger, K. (2011). Welcome Our Inaugural Neuroethics Scholars! The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2011/11/welcome-to-our-inaugural-neuroethics.html





References



Baumeister, R. F., Masicampo, E. J., & DeWall, C. N. (2009). Prosocial benefits of feeling free: Disbelief in free will increases aggression and reduces helpfulness. Personality and Social Psychology Bulletin, 36, 260-268.



Vohs, K. D., & Schooler, J. W. (2008). The value of believing in free will: Encouraging a belief in determinism increases cheating. Psychological Science, 19, 49-54.

Monday, November 28, 2011

"The Ethics of Designer Brains": Interview with Paul Root Wolpe on Big Think

Director of Emory's Center for Ethics talks about the ethics of designer brains on Big Think.





"Our values as a society will determine which psychopharmaceuticals and (down the road) which genetic enhancement technologies we choose to develop and how we use them.


That's what concerns Dr. Paul Root Wolpe, senior Bioethicist at NASA and a pioneer in the field of neuroethics. Peering into his children's and grandchildren's future, he sees an America that rewards competitiveness and productivity over relationship-building, and suspects that future generations will face intense pressure to enhance their minds and bodies in unhealthy ways.



The politics of technophilia vs technophobia aside, our power to manipulate our brains and genes is increasing dramatically – and it raises serious ethical questions."

Neuroethics Journal Club documented by artist Jon Ciliberto

Jon Ciliberto, artist and all-around jack-of-all-trades, documented our last Neuroethics Journal Club on Neurotechnologies and Lie Detection via painting and drawing. Thanks, Jon!






by Jon Ciliberto



Our next Neuroethics Journal Club will be on December 14, 2011. We will be discussing the AJOB Neuroscience article, "Deflating the Neuroenhancement Bubble," and Emory Neuroscience graduate student David Nicholson will facilitate this session.

Wednesday, November 23, 2011

Lie Detection and the Jury



Much virtual and actual ink has been spilled of late about the dangers of rushing to bring brain-imaging technologies into the courtroom.  Not only neuroskeptics,[1] but also preeminent neuroscientists,[2] have urged caution when it comes to the prospect of fMRI data being admitted as trial evidence.  And brain-based lie detection, as one of the most alluring areas of imaging research, has in particular come in for a great deal of hand-wringing.











These portents of doom are perhaps even more premature than would be the use of fMRI “polygraphy” as evidence.  Worrying now about that prospect is a bit like throwing out the bathwater before the baby has even gotten into the tub.  While it’s true that a few ill-informed judges have made a few ill-conceived decisions along these lines (and those mostly in India, not the United States), the vast weight of judicial precedent, procedure, and practice makes it overwhelmingly likely that courts will move too slowly, rather than too fast, in admitting new techniques of lie detection.  As a rule, courts are exceptionally wary of any kind of evidence that they view as usurping the function of the jury, and conventional wisdom has it that the primary function of the jury is to determine the credibility of witnesses.  Although most jurisdictions exclude polygraph evidence on the ground that it is not sufficiently reliable, in truth it is at least as reliable as much other “scientific” evidence[4] that is routinely admitted under Daubert,[5] the Supreme Court case that governs admissibility of scientific evidence in the federal courts and the majority of state jurisdictions.  But lie detection evidence speaks to witness credibility and thus it is the poster child for usurpation of the jury’s role.  While there may come a day when brain-based lie detection becomes so reliable that courts can no longer rely on Daubert to keep it out, history suggests that they will avoid that day as long as they can.



 In an article on fMRI lie detection and the role of the jury,[6] I entertained a thought experiment that asked what would happen if fMRI lie detection were to become as reliable as DNA evidence, currently considered the “gold standard” of scientific evidence.  If we could easily see whether witnesses were testifying truthfully at trial, would that make the jury obsolete?  What, I wondered, were judges so afraid of in opening the door to polygraph and other techniques of assessing witness credibility?





I think the answer goes something like this:  Our current criminal justice system hides what the jury does inside a black box.  Rules of jury secrecy, combined with rules of evidence and procedure, ensure that most verdicts are unimpeachable on the facts.  In other words, so long as there is some evidence against a criminal defendant a reviewing court can (and will) assume that a guilty verdict reflects a jury’s assessment of witness credibility and not, for example, racial bias, failure to follow jury instructions on the law, or even simple mistake.  The Supreme Court has written, affirming a conviction in a case involving allegations of extensive drug and alcohol use by jurors during the trial, that “[i]t is not at all clear . . . that the jury system could survive such efforts to perfect it.”[7]  Instead, all of the evidence goes into the jury room, it gets shaken up and – abracadabra! – out comes the verdict.  If we knew that particular witnesses were lying or telling the truth, it would be much harder to accept certain verdicts without drastically reimagining what it is that we think juries are supposed to be doing.




The Troy Davis case[8] provides a depressing example of this dynamic. Despite the claims of advocates on either side of the debate, the evidence[9] in the case was ambiguous and murky. The answer to the question of guilt or innocence depends wholly on which witnesses were telling the truth and which were lying, because it is clear that someone must have been lying. Since we can’t look into their hearts and minds, we have to rely on some fact-finder to answer this question; in our system, and usually for very good reason, that fact-finder is the jury.



The jury may very well have gotten it wrong.  There surely was much reason – in the form of witness recantations and evidence of police pressure – to doubt whether the jury got it right.  But in the end, the “reasonable doubt” that many observers urged came from the (very reasonable) suspicion that the main witness against Troy Davis was probably lying.  The main problem is that as long as a reviewing court could hold that it was not irrational to believe this witness and to disbelieve the testimony of the defendant, there would be no real basis for overturning the conviction.







In the end, human beings (including jurors) are not very good lie detectors. Indeed, we do little better than a coin flip, though we tend to believe we are very skilled at detecting deception through demeanor and other clues. Though brain-imaging-based lie detection is still in its infancy, it may someday be capable of supplementing our inadequate abilities in this area. Though we should be cautious – and courts are very cautious – we should also consider the relative reliability of alternative available techniques, including that lowest-tech of all techniques: the unaided jury. And we should ask ourselves whether it makes sense to enshrine the lie-detection role of the jury, in all of its glorious imperfection, at the expense of considering techniques that could help the jury fulfill this role as well as its other important roles in our system of justice.




--Julie Seaman, PhD

Associate Professor of Law, Emory University





Want to cite this post?


Seaman, J. (2011). Lie Detection and the Jury. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2011/11/lie-detection-and-jury.html





[1] http://neuroskeptic.blogspot.com/2009/09/fmri-gets-slap-in-face-with-dead-fish.html

[2] http://www.nytimes.com/2011/11/01/science/telling-the-story-of-the-brains-cacophony-of-competing-voices.html?_r=2&scp=1&sq=gazzaniga&st=cse


[3] http://singularityhub.com/2010/05/06/another-attempt-to-use-fmri-lie-detector-in-us-court-fails-in-brooklyn-more-on-the-way/



[5] http://www.law.cornell.edu/supct/html/92-102.ZS.html


[6] http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1352648


[7] http://scholar.google.com/scholar_case?case=6212097998214052620&hl=en&as_sdt=2&as_vis=1&oi=scholarr


[8] http://www.huffingtonpost.com/2011/09/21/troy-davis-executed_n_975109.html



Friday, November 18, 2011

Project Guerrilla Science: Neurobiological Origins of Zombies!

The Neuroethics Program made some new friends at the Society for Neuroscience meeting (SfN). Project Guerrilla Science presents, *ahem*, research on necroneurology.
From the office of: Bradley Voytek, Ph.D. (Post-doctoral Fellow, University of California, San Francisco) & Timothy Verstynen, Ph.D. (Post-doctoral Research Associate, University of Pittsburgh) ---- "HUMANS! ... We appreciate your interest in the zombie sciences. Necroneurology is the most exciting new thing to hit neuroscience since mirror neurons! We look forward to future possible collaborative opportunities. We encourage all scientists to take part in Project Guerrilla Science at next year's SfN. Be on the lookout for other, fun (maybe even non-zombie!?) research projects in the future. This has been surprisingly fun and successful (and has garnered us way more media attention than our actual research... ::sigh::). Please send any and all brains you may encounter--zombie or otherwise!"

Sunday, November 13, 2011

Neuroethics Blog Post on CNN Blog by Dr. Paul Root Wolpe: No mind-reading allowed!

Director of the Center for Ethics at Emory University, Dr. Paul Root Wolpe puts his foot down on CNN's Belief Blog.






Dr. Paul Root Wolpe, neuroethics expert


"Throughout human history, the inner workings of our minds were impenetrable, known only to us and, perhaps, to God. No one could see what you were thinking, or know what you were feeling, unless you chose to reveal it to them."



Read more about it by following the link below.

My Take: Keep government out of mind-reading business

International Neuroethics Society: Summary of what you (may have) missed!

Greetings from DC!  The Neuroethics Program is busy hobnobbing with some of the world's most cutting-edge, interdisciplinary, and innovative thinkers at the International Neuroethics Society (INS) meeting!







In case you didn't get the chance to attend this year, here is a brief summary of what you missed. The full list of events can be seen here and featured events from Day 1 of this year's meeting can be seen here.



This year INS hosted its annual meeting at the Carnegie Institution for Science. Day 2 of the annual INS meeting was an exciting and inspiring day featuring outstanding sessions. Each session highlighted some of the most pressing topics in the field of neuroethics.













The day opened with a panel on Neuroscience, National Security, and Society. The panel featured Jonathan Moreno, University of Pennsylvania; William Casebeer, DARPA; and James Giordano, Potomac Institute for Policy Studies. Dr. Moreno is the author of the book Mind Wars. Moreno outlined the past, current, and potential uses of neuroscience research in national defense, weaving a narrative from the ingestion of cognitive-enhancing drugs, to external brain imaging, to more invasive brain-machine interfaces and beyond. Moreno noted that some experts have testified that the science is not "ready" for such applications; however, those whose primary goal is national defense might be more compelled by the reality that the technology is currently something a 20-year-old soldier could learn to use in 20 minutes. Dr. Giordano suggested that conversations about neuroscience and defense have moved beyond whether or not we ought to weaponize neurotechnologies to what we should do when neurotechnologies do become weaponized. Giordano suggested that we should limit transparency to the general public about efforts to weaponize neurotechnologies and move forward with prudent communications in order to avoid inducing unwarranted mass public panic. Dr. Casebeer ended on a more optimistic note, stating that while there is peril associated with the use of neurotechnologies for national defense, by taking careful measures to protect human flourishing and autonomy, neurotechnologies hold the promise of helping to create a new generation of effective modes of neuro-defense.





The next session was led by self-described real-life cyborg and author of World Wide Mind, Michael Chorost. Dr. Chorost has cochlear implants that allow him to hear, and in his talk he challenged us to explore how neurotechnology could improve humanity. Chorost noted that there are approximately 2 billion computers being used by 2 billion internet users, but this is nowhere close to the 100 billion neurons and all their synaptic connections in the human brain. Chorost envisions a future in which brain implants in otherwise healthy people might be used to make more meaningful connections between people, thereby creating a deeper awareness of those around us. Chorost views the internet as humanity's evolutionary assistant.






Image from http://www.thesocializers.com
The third session was a panel on Neuroethics and Novel Treatment in Neuropsychiatry moderated by Barbara Sahakian, Cambridge University. The panel featured Husseini Manji, Johnson and Johnson Pharmaceutical; Helen Mayberg, Emory University; and Jorge Moll, D'Or Institute for Research and Education. Dr. Manji pointed out that, according to studies by the World Health Organization, 30% of the burden of disease by 2015 will be attributed to neuropsychiatric disorders, an ominous figure that translates into an enormous loss of society's most important capital: its cognitive and mental capital. This outlook highlights the importance of refining our techniques for defining and diagnosing prodromal periods for psychiatric disorders such as schizophrenia and neurodegenerative diseases such as Alzheimer's disease, as well as developing novel interventions for the prodromal period. Helen Mayberg discussed her work using deep brain stimulation (DBS), an experimental intervention involving the surgical implantation of an electrode into the brain and a battery pack in the pectoral muscle of patients, to treat intractable major depression. She described the ethical problem of how physicians are to help patients who continue to need DBS after the pilot stages of experimentation, and she also expressed ethical concerns about the evolving responsibility of the patient as his or her symptoms resolve. Patients may be miraculously improved while on DBS, but they will need to continue having their DBS hardware maintained (i.e., battery replacement) to preserve these beneficial effects over time. On the flip side, how will patients disengage from the stimulator should they no longer want the treatment? Grants for this research may cover the costs of implanting devices, but not of removing them. Finally, Jorge Moll discussed the implications of inferring cognitive and psychological states from neuroimaging, such as for diagnosing psychopathy.








Illustration by Jeffrey Decoster
The afternoon closed with a discussion of Real Cases in Law and Neuroscience moderated by Hank Greely, Stanford University.  The panel included two attorneys, Steve Greenberg and Houston Gordon, along with Russell Swerdlow, a neurologist at the University of Kansas Medical Center. Greenberg is a criminal defense lawyer who attempted to use brain images to prevent his client, an individual who had repeatedly raped and murdered young girls, from receiving the death penalty. Greenberg argues that brain imaging evidence suggests that psychopaths have brain defects, or birth defects (since they might have been born with these "brain defects"), and asks how we should hold people who engage in criminal behavior accountable for their crimes, especially if they suffer from what is essentially a birth defect. Attorney Houston Gordon has also tried to submit neuroimaging data in criminal cases, but the judge handling his case stated that the science wasn't ready. Gordon believes, however, that the science is ready, arguing that neuroimaging data are the products of unbiased computer algorithms and multiple, rigorous peer-reviewed studies. Finally, Dr. Swerdlow shared the story of a patient who acquired pedophilia due to an enormous tumor growing from the base of his skull. When the tumor was removed, the patient's tendencies toward pedophilia subsided, and they returned when the tumor grew back. Swerdlow posed a number of questions based on this case: How hard-wired is decision-making, and how free is free will? And given this case as an example, are the current legal standards adequate?



If you missed this year's annual INS conference, don't despair.  INS is already planning its next conference so stay tuned and check back at neuroethicssociety.org for updates.



For those of you who attended the Society for Neuroscience meeting, please let us know if you'd like to share any work that struck a neuroethics chord with you by commenting below. (We are especially interested in events we may have missed from Nov 14-16, but we welcome your notes from previous days at SfN.)




Want to cite this post?


Rommelfanger, K. (2011). International Neuroethics Society: Summary of what you (may have) missed! The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2011/11/international-neuroethics-society_13.html

Saturday, November 12, 2011

International Neuroethics Society: Careers in Neuroethics Session

Greetings from Washington DC! The Neuroethics Program is on the road attending the International Neuroethics Society Meeting and Society for Neuroscience.



Have you been wondering how to begin your journey toward a career in neuroethics?



The 2011 International Neuroethics Society (INS) Meeting featured a Neuroethics Careers Session.  INS meeting organizers, including the Emory Neuroethics Program's Gillian Hue, put together a stellar panel of speakers including Alan Leshner, AAAS (American Association for the Advancement of Science); Paul Root Wolpe, Emory University; Emily Murphy, Stanford University; and Hank Greely, Stanford University.







"You enter the field almost always obliquely," Paul Root Wolpe of Emory told the audience. "You get into bioethics through a story."



To learn more about his story, a summary of this panel discussion can be found on the Dana Foundation's Blog.

Monday, November 7, 2011

Neuroethics Playlist

We have put together a playlist of songs about neuroethics, the brain, and the mind. Below you will find a Prezi presentation that includes the music and brief descriptions of each of the songs.







Neuroethics Playlist on Prezi (Updated 4/15/2018)





Special thanks to the followers on our Facebook page for their helpful suggestions.


Wednesday, November 2, 2011

TED Talk: Trust, morality -- and oxytocin

"What drives our desire to behave morally? Neuroeconomist Paul Zak shows why he believes oxytocin (he calls it "the moral molecule") is responsible for trust, empathy and other feelings that help build a stable society."



For more read our previous blog post "Liquid Trust and Artificial Love" here.