Recently released documents show that federal researchers had more concerns than previously reported about the now-discredited cancer genomics research conducted by Dr. Anil Potti.
According to the documents, the National Cancer Institute continued to raise questions about the research and its use as justification for clinical trials at Duke even after a Duke review concluded in late December 2009 that the trials could continue. The information in the NCI documents is another indication of the growing doubts about Potti’s research in the months leading up to his suspension and resignation.
NCI scientists reported in early June that they could not reproduce the results of a key research paper co-authored by Potti that was being used to justify clinical trials, the documents show. The paper, published in the Journal of Clinical Oncology in 2007, purported to use a genomic test to predict whether a chemotherapy drug known as cisplatin would be useful in treating an individual’s cancer.
“We have been unable to reproduce any of the results reported for the cisplatin chemosensitivity predictor as they were presented in the [JCO] paper, which has been repeatedly cited as providing validation for the predictor,” NCI statistician Lisa McShane wrote in the June report, which was sent to the Duke researchers and Vice Dean for Research Sally Kornbluth. “The data, computer code, and instructions provided to us by Dr. Potti did not enable us to reproduce the results in the paper, and we do not know why or when the methods were changed.”
The NCI recently provided the documents to the Institute of Medicine as part of the IOM’s review of Potti’s research at Duke’s Institute for Genome Science and Policy and similar studies. The IOM then released them to The Cancer Letter.
Several weeks after the NCI found it could not reproduce Potti’s research, Duke researchers traveled to Maryland to meet with the NCI to discuss the organization’s concerns. After the June 29 meeting, attended by Kornbluth, IGSP Director Huntington Willard, Potti and other researchers, concerns about Potti’s data became more apparent.
“These interactions highlighted further lapses in data handling and analysis, and raised additional questions about the provenance of the data,” Kornbluth and Dr. Michael Cuffe, vice president for medical affairs, wrote in a summary of events for the IOM.
After the meeting, the NCI continued to harbor doubts about the Duke research, according to prepared remarks McShane delivered at the first meeting of the IOM review committee in December.
“The meeting concluded with NCI remaining unconvinced of the validity of the Duke predictors,” McShane said, according to the prepared remarks.
The clinical trials—including one funded in part by the NCI—continued, but the NCI requested that Duke provide it with the original data and computer code that might validate the researchers’ conclusions.
In an interview, McShane said that after the June 29 meeting the NCI gave the Duke researchers a final chance to establish the source and accuracy of their data and prove that their predictor worked. But that inquiry was soon overshadowed by the events that led to Potti’s suspension and eventual resignation.
“The scientific discourse with the NCI and Dr. Nevins and Dr. Potti clearly was not reaching a resolution, and so when the [resume] allegations came out only two weeks later, it was pretty clear we needed to pause these trials and figure out whether we were on sound footing,” Cuffe said.
The paper that provided the scientific justification for the predictor was retracted in November because Duke researchers could not reproduce its findings.
McShane was not the only one pointing out problems in cancer genomics research conducted by Potti.
Two biostatisticians from the University of Texas MD Anderson Cancer Center, Keith Baggerly and Kevin Coombes, also examined research conducted by Potti and others at Duke and encountered problems similar to those identified by the NCI. Their concerns that those problems could lead to patient harm, echoed by the NCI, helped prompt the 2009 Duke review of clinical trials based on the research.
But Duke officials did not provide biostatistics reviewers with a report from Baggerly and Coombes containing evidence of additional research flaws, and the reviewers ultimately approved re-opening the clinical trials. Baggerly said he is confident that had the reviewers been given the report, the data problems would have been spotted.
The administrators responsible for the review, including Institutional Review Board Chair Dr. John Harrelson, decided not to forward the report to the biostatisticians in order to preserve their objectivity, Harrelson said. He noted that the biostatisticians were provided with the articles Baggerly and Coombes had published criticizing the research.
“Our concern was to have an unbiased review of the data and to not have the accuser try the defendant,” Harrelson said.
But Baggerly said the November 2009 report contained information not available elsewhere. He pointed out that the analysis was based on new information the Duke researchers posted online in early November.
He noted that even as Duke was investigating the research of Nevins and Potti, the researchers continued to post incorrect data online.
“What we have is documented proof that the data is wrong as the investigation is underway,” Baggerly said. “Indeed, it’s wrong for two drugs that they’re using in clinical trials for two years.”
Harrelson said that given what he and other administrators knew at the time, the decision not to give the report to the reviewers was the right one. He said the reason the review failed to detect problems was that the researchers did not provide the biostatisticians with all of their raw data.
“Our request to the investigators at Duke was that the reviewers be provided with all of the source data,” Harrelson said. “It turns out that they were not, and that’s what led to the subsequent concern and the retraction of the papers by Dr. Nevins.”