WASHINGTON, D.C. — The mentor of discredited Duke cancer researcher Dr. Anil Potti provided the first public account of his thoughts and actions as the cancer research he developed with Potti and other scientists came under fire.
In an hour-and-a-half-long presentation to an Institute of Medicine committee, Joseph Nevins, the Barbara Levine Professor of Breast Cancer Genomics, defended his and Duke’s handling of the criticisms of the research that mounted over several years. But he acknowledged that he failed to identify problems with the underlying data Potti used to conduct cancer research at Duke’s Institute for Genome Sciences and Policy, which was regarded as groundbreaking when it was published.
“I didn’t recognize that a critical flaw in the research effort was one of data corruption, an apparent manipulation of validation data,” Nevins told the committee.
The IOM committee is working to establish standards for genomics research and other medical science based on large quantities of data. The committee was established in response to the Potti affair.
In a curt exchange with committee member Thomas Fleming, a professor of biostatistics at the University of Washington, Nevins refused to say how he believed the data problems occurred.
Fleming pointed out that because the data problems improved the experimental results, they appeared to have been introduced intentionally.
“I can’t address it,” Nevins responded. “I just can’t get into a position of speculating on how it happened.”
He noted that Duke is conducting a research misconduct inquiry to investigate how the data errors occurred. And although Potti has resigned and accepted responsibility for problems with the data, Nevins never mentioned his former colleague’s name during the presentation.
‘Too good to be true’
Nevins did attempt to explain why it took years for Duke to address the criticisms of the research.
He said he initially thought that the disagreements were about the statistical methods used to evaluate the data, not the validity of the studies. He believed that Duke researchers in his group at the IGSP had addressed those concerns by adjusting their methodology.
Nevins said he was also heartened when other studies appeared to confirm the initial findings. But he eventually learned that all the studies—focused on determining whether individuals would respond to cancer treatments—also had problems in their underlying data.
Duke’s reaction to criticism about the research was the focus of some harsh words from the chair of the IOM committee.
Dr. Gilbert Omenn, director of the University of Michigan Medical School Center for Computational Medicine and Biology, criticized the “dismissive nature” of Duke’s response to the complaints.
“The lack of investigations when things were brought up raises the question of how do you deal with something that looks too good to be true and might be,” he said. “It went on for three to four years until the problem was acknowledged. That’s something that’s got to be dealt with internally, I think.”
Nevins disagreed with Omenn’s characterization of Duke’s response.
“I frankly would suggest that the institution did a very good job of addressing this, of paying attention to the issues that were raised,” he said. “I fully appreciate the extent to which this has had a very negative impact on investigators not only outside of Duke, but also within Duke.”
More data issues
Still, Duke scientists were conducting clinical trials based on the flawed research until a July 2010 report in the Cancer Letter, an independent newsletter, revealed that Potti had falsified portions of his resume, including falsely claiming to be a Rhodes Scholar.
Following that revelation, enrollment in the clinical trials was halted and investigations into Potti’s work, some of which were already underway, gained new energy. Duke reviewers, including Nevins, soon found more data issues.
“Further analyses revealed corruption of multiple datasets compiled by Dr. Potti,” a Duke background report provided to the IOM committee states.
In at least one case, the incorrect data provided support for Potti’s research, but the correct data did not.
Potti resigned Nov. 19 and accepted responsibility for the problems in his research. A research misconduct investigation, which may answer questions about how the errors in the data occurred, is ongoing. The clinical trials have since been stopped, and four papers based on data handled by Potti were retracted.
Lessons from the Potti affair
Nevins offered several lessons from the Potti affair as the IOM committee considers its implications for future research.
He said it is important to ensure the accuracy of all data used in clinical trials, possibly by using software that guards against data manipulation. He noted, however, that it can be difficult to prevent malicious individuals from modifying data.
Nevins also recommended that researchers make public all the information they use to draw their conclusions.
“Ensuring that all data, methods and software are made available in publications is a must,” he said. “To the extent that we didn’t do so, it was a mistake.”
However, many panel members said that making all experimental data publicly available is unrealistic because of the digital storage capacity it would require.
“The issue is that we are generating more data than we can actually store,” said Veronique Kiermer, executive editor and head of researcher services for Nature Publishing Group. “What data do we store, and what can we afford to lose?”
The massive quantity of data that scientists are generating poses other challenges as well.
Scott Zeger, vice provost for research at Johns Hopkins University, said the amount of data that scientists are generating is growing much faster than researchers’ ability to interpret or store it. Ultimately, he said, more data does not necessarily help researchers or doctors gain a better understanding of how to treat patients.
Zeger noted that Duke has one of the best approaches to genomics studies, combining researchers from many disciplines in the IGSP and encouraging them to collaborate with each other. Often, institutions do not provide enough financial support for genomics research because it does not fit well into any established academic department.
“In some ways, Duke was a model,” Zeger said. “The question to you is what went wrong.... Was it a misunderstanding about the science or a bad actor that got in the way?”