When scientific research crashes into ethical quandaries, one Duke professor is there to parse the difficult questions.
A professor of law and philosophy, Nita Farahany studies the ethical and legal ramifications of new technologies and research methods, mainly in the fields of biology and neuroscience. Farahany served as a member of the Presidential Commission for the Study of Bioethical Issues from 2010 to 2017.
She has presented her work to a wide range of audiences, including a speech to the World Economic Forum and a November 2018 TED Talk titled "When technology can read minds, how will we protect our privacy?"
Farahany received her bachelor's and master's degrees in biology from Dartmouth College and Harvard University, respectively. Having obtained her J.D. and Ph.D. in philosophy from Duke, she is the director of the Duke Initiative for Science and Society and the Science, Law and Policy Lab.
After law school, Farahany clerked for Judge Judith Rogers of the U.S. Court of Appeals for the D.C. Circuit. She then went to Vanderbilt University to finish her dissertation, intending to leave after a year and practice law, but she ended up staying at Vanderbilt and receiving tenure.
“I loved the opportunity to dive into scholarship. I loved teaching," Farahany said. "I loved students and the opportunity to be able to interact with them, and I realized that probably my real calling was getting to think about and write about the ethical, legal and social implications of emerging technologies."
Farahany explained her work and the ethical gray areas she's thinking about.
She said that society is in the early days of another industrial revolution, which will be both extraordinarily disruptive and beneficial. She described the future as “different but inevitable.”
“Analyzing the impact of scientific breakthroughs on law and philosophy is a way to be able to contribute to society by saying that these are really exciting advances, but we need to be able to get ahead of the potential downside and safeguard against it so that we can realize their true potential without harm or at least by minimizing harm,” she said.
Farahany discussed the rising hype around "precision medicine," a field focused on using personal genetic information to tailor appropriate treatments to a patient. Genetic research creates tension, however, between researchers' need for people to share their genetic information and the need to protect that information against "misuse."
Several issues surround brain surrogates, or models used to study the human brain, which have become more widespread in neuroscience research. Many of these questions have no easy answers.
“The closer the proxies get to the real brain, the more challenging it becomes for society because we have to come to decide, 'What is a brain in a dish?'” Farahany said. “Is that an entity that should be accorded some welfare interest? Is it conscious? What does consciousness even mean? What does it mean to be human? What does it mean to be alive?”
Farahany stressed the importance of bringing scholars like her to the table to address ethical quandaries in research.
“I think the best way to maximize potential and minimize harm is by actively engaging in the process of deliberative democracy," she noted. "One of the great things about the development of artificial intelligence is that the researchers are eager to welcome the engagement of ethicists in their work.”
Currently, Farahany is working on a book called "Cognitive Liberty," which discusses the ethics of enhanced brain technologies.
“It will address all the ways in which we can access and change our brains, enhance them, diminish them, peer into them and monitor them,” she explained. “The future will offer wonderful opportunities for how we can access our brain, but it will also require us to think carefully about our rights, what freedom of thought means to us, and what we want our society to look like.”