That's one of the lessons of the Ketek investigation, and one that Theravance experienced painfully late last month. On March 3, Theravance explained more about the last-minute cancellation of the planned February 27 FDA advisory committee review of its injectable antibiotic, telavancin.
The delay stems from FDA's unwillingness to review the application after the removal of data from one study site. Theravance said that the agency asked the company to remove the data from the site on January 30, twenty-seven days before the scheduled Anti-Infective Drugs Advisory Committee meeting and nineteen days after the agency first announced its intention to hold the advisory committee review of the drug.
Theravance maintains that the "removal of these data had no material impact on the overall efficacy and safety results or conclusions of the study previously reported, and we believed the site issue had been adequately addressed by the data removal."
In that case, the company was too focused on its own application and not paying attention to the overall environment and the pressure on FDA from Grassley's intense oversight -- focused in particular on the agency's willingness to overlook some questionable data to keep an application on track for approval.
Grassley sent FDA a letter on December 19 in his continuing review of FDA's handling of the Sanofi-Aventis Ketek (telithromycin) application that created a de facto standard: a sponsor cannot resolve a data integrity issue by just removing the results from the questioned center or investigator.
In his mid-December letter, Grassley made it very clear that he expects FDA to halt the approval process on applications if the agency finds any problems with data integrity. (See here for our previous coverage of Grassley's Ketek rule.)
Grassley's letter, coming right before the Christmas holiday break, was probably ignored by much of the financial community. It was also overshadowed by Theravance's December 27 announcement that it was getting ready for an imminent advisory committee review.
FDA has tried to work around questionable data before: it previously asked an advisory committee to review the Ketek application with some suspect data from a clinical investigator removed. Given the heat it has taken from Grassley for that review, FDA was unlikely to repeat that kind of quick fix in the case of another drug, especially another antibiotic.
Theravance says it is "committed to working with the FDA to resolve outstanding issues related" to the telavancin clinical trials for complicated skin and skin structure infections, the "ATLAS program." The company reports that FDA wants to "further evaluate study site monitoring and study conduct to ensure data integrity in the ATLAS Phase 3 program." The agency wants to look into monitoring at other sites and not just take the word of the contract research organization.
Other NDA sponsors would be well advised to pay more attention to all pending data integrity issues for products nearing the review stage. The agency's Division of Scientific Investigations is clearly emboldened and carries more weight within FDA as a result of Grassley's attention (see here).
Theravance's follow-up statement, however, at least begins to peel back the cover from the mystery of the recent wave of meeting cancellations. For a while in February, it looked like FDA did not want to bring any products to advisory committees. The agency canceled three meetings last month (see here).
This is another reason why I'd love a more open clinical trials process. I'm very interested in the reason the data was pulled from the site in question. Glycopeptides must be administered very carefully or inflammation can occur at the injection site; this happened with oritavancin in 2003 (it was later shown to be caused by too rapid an administration) and can also occur with vancomycin.
I have to say that I kind of pity the FDA (*gasp*), because if there is any kind of large-scale MRSA outbreak they will really get it from the public for not approving newer antibiotics more quickly; on the other hand, examples like Ketek are putting pressure on them to take more time in reviewing all the data.
You know if all the data was more readily available to the public, then professionals such as yourselves could weigh in on concerning issues instead of all the responsibility lying with the FDA....just a thought.
Drewaight, as you probably know, study investigators at the clinical sites must adhere to the study protocol in order for the data to have integrity and validity. Many times data is pulled simply because they did not do this. Not doing so may change the interpretation of the data. This happens and needs to be brought to the company's attention. And the company's study monitors need to be trained to spot this when they do regular site reviews (IF they do them at all). It is poor science -- and business -- when you need the FDA to highlight these errors. As well as suspicious. The companies are in charge of attesting to the validity under which the study was performed. They even sign off as to the validity of the data when they submit it to the FDA.
Kudos to Sen. Grassley. I find the whole matter beyond suspicious.
Drewaight makes a good point about the need for more openness in the process.
FDA is heading there and probably will speed up in that direction in 2009.
The public disclosure of clinical trials and data submitted to FDA called for by the new FDA Amendments Act will create new opportunities for checking into all types of situations -- including allegations of data integrity problems. Which site had the problems? Who was the primary CRO for the trial?
More public attention to the who and how of the study may also begin to assuage the skeptical (suspicious) attitude of "anonymous".
Thanks for the comments.
(posting anonymously since I have a trial under way myself)
It's hard for me to understand how to manage a trial under these circumstances (as stated -- there may be more context I don't realise).
In any complex trial there will always be protocol deviations. Nobody wants them to pollute the dataset.
It seems to me that if your remaining n is high enough that you still have a sufficient p value for all the arms, you shouldn't be penalised for this[*]. You should still do a meta-analysis on the deviations (perhaps, as drewaight said, it points to an unanticipated difficulty of administration), but if nothing is apparent, then go ahead.
Seems like an analysis like this should be no different from the analysis of SAEs. Having one site out of 20 generate all of the study's deviations seems little worse than a patient being struck by a board falling off a construction site.
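For concreteness, here's a toy sketch (in Python, with entirely made-up site names and counts) of the kind of check I mean: pool the cure rates with and without the flagged site and see whether the comparison still holds up. A real trial would of course use its prespecified analysis -- likely a non-inferiority comparison on the protocol-defined populations -- rather than a bare Fisher test on pooled counts.

    # Toy sensitivity check: does the pooled efficacy comparison survive
    # dropping one site?  All counts below are made up for illustration.
    from scipy.stats import fisher_exact

    # (cured, failed) counts per arm at each site -- hypothetical numbers
    sites = {
        "site_01": {"test": (42, 8), "control": (40, 10)},
        "site_02": {"test": (38, 12), "control": (35, 15)},
        "site_03": {"test": (45, 5), "control": (30, 20)},   # the flagged site
    }

    def pooled_table(site_data, exclude=None):
        """Pool a 2x2 table [[test cured, test failed], [control cured,
        control failed]], optionally leaving one site out."""
        table = [[0, 0], [0, 0]]
        for name, arms in site_data.items():
            if name == exclude:
                continue
            for row, arm in enumerate(("test", "control")):
                cured, failed = arms[arm]
                table[row][0] += cured
                table[row][1] += failed
        return table

    for label, dropped in (("all sites", None), ("without site_03", "site_03")):
        table = pooled_table(sites, exclude=dropped)
        _, p_value = fisher_exact(table)
        test_rate = table[0][0] / sum(table[0])
        ctrl_rate = table[1][0] / sum(table[1])
        print(f"{label}: cure rates {test_rate:.1%} vs {ctrl_rate:.1%}, "
              f"Fisher p = {p_value:.3f}")

If the two runs tell the same story, dropping the site is a cleanup, not a cover-up; if they don't, that is exactly the situation FDA is worried about.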
It might be possible to get around this by writing some contorted CYA language into the protocol. Although that would help drug development, it would be a disaster. If protocols become legally-driven rather than scientifically-driven ... ugh, I don't like to think where that would lead.
[*] Yes, I know that's a hairy problem since the trial is still blinded, but hardly intractable.