Education: Reshuffling the deck

From a statistician’s viewpoint, changing, re-ordering, removing, ignoring or expanding a trial’s pre-specified objectives is a serious form of scientific misconduct.

Results: We identified 20 clinical trials for which internal documents were available from Pfizer and Parke-Davis; of these trials, 12 were reported in publications. For 8 of the 12 reported trials, the primary outcome defined in the published report differed from that described in the protocol. Sources of disagreement included the introduction of a new primary outcome (in the case of 6 trials), failure to distinguish between primary and secondary outcomes (2 trials), relegation of primary outcomes to secondary outcomes (2 trials), and failure to report one or more protocol-defined primary outcomes (5 trials). Trials that presented findings that were not significant (P≥0.05) for the protocol-defined primary outcome in the internal documents either were not reported in full or were reported with a changed primary outcome. The primary outcome was changed in the case of 5 of 8 published trials for which statistically significant differences favoring gabapentin were reported. Of the 21 primary outcomes described in the protocols of the published trials, 6 were not reported at all and 4 were reported as secondary outcomes. Of 28 primary outcomes described in the published reports, 12 were newly introduced. (NEJM)

Re-shuffling the deck is often a clear sign of an analysis gone astray. The inferential properties of results from pre-specified versus “re-arranged” objectives are very different.

The difference between Search and Research:

Protocol-driven reporting = Research – where the study objectives are publicly pre-specified and remain unaltered throughout the investigation.

Data-driven reporting = Search – where the properties of the study results determine what the study objectives are or what order they take (this is also known as “random research” – it is an act of surrender by the investigators and a subtle form of ready, FIRE!, aim research). Data-driven research is sometimes known as a fishing trip.

Fishing for hypotheses is like throwing a dart at a wall – and then drawing a target around it.
(Andrée Monette)

Data-driven research masquerading as protocol-driven research = Scientific misconduct.

The scientific method rests on protocol-driven research; midstream methodological changes drastically devalue any result. Preserving the integrity of the research effort requires publishing the (ranked) study objectives and sticking to them regardless of the results. This protects against false-positive (type I) error and ensures trustworthy estimates of the treatment effect.
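To see why the inferential properties differ so much, here is a minimal simulation sketch (not tied to any real trial): each simulated trial measures several outcomes with no true treatment effect, all assumed independent and normally distributed. Sticking to the pre-specified primary outcome keeps the false-positive rate near the nominal 5%; picking whichever outcome looks best after unblinding inflates it several-fold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials = 20_000   # simulated trials
n_per_arm = 100     # patients per arm
k_outcomes = 8      # outcomes measured per trial, all with zero true effect

pre_specified_fp = 0
reshuffled_fp = 0

for _ in range(n_trials):
    # Drug and placebo arms drawn from the same distribution (null is true)
    drug = rng.normal(size=(k_outcomes, n_per_arm))
    placebo = rng.normal(size=(k_outcomes, n_per_arm))
    p_values = stats.ttest_ind(drug, placebo, axis=1).pvalue

    # Protocol-driven: the primary outcome was fixed in advance (say, outcome 0)
    pre_specified_fp += p_values[0] < 0.05
    # Re-shuffled: the "primary" outcome is whichever looks best after unblinding
    reshuffled_fp += p_values.min() < 0.05

print(f"False-positive rate, pre-specified primary:  {pre_specified_fp / n_trials:.3f}")
print(f"False-positive rate, best-looking 'primary': {reshuffled_fp / n_trials:.3f}")
# Roughly 0.05 versus roughly 1 - 0.95**8, i.e. about 0.34
```

With eight independent null outcomes, the chance that at least one reaches P<0.05 is about one in three – the same result, re-labelled as “primary,” looks like a discovery.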

Re-shuffling the deck is tempting and extremely easy to do, but it is also easy to detect. Any journal reviewer can check the following 5 items:
1 Are the trial’s ranked objectives published on clinicaltrials.gov?
2 Are the stated objectives in the article text consistent with clinicaltrials.gov?
3 Is the top line result the primary objective?
4 Are all stated objectives reported?
5 Have no extra, unspecified objectives been reported?

A journal article should pass all five checks above; this would take minutes to accomplish and should be among the first tasks undertaken during peer review – it needn’t require a specialist statistician reviewer.
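The checks are mechanical enough to sketch in a few lines of Python. The outcome lists below are hypothetical stand-ins; in practice they would be transcribed from the clinicaltrials.gov registry entry and from the published report.

```python
# Hypothetical registry entry and published report for one trial
registered = {
    "primary": ["change in pain score at 8 weeks"],
    "secondary": ["sleep interference", "quality of life"],
}
published = {
    "primary": ["responder rate at 8 weeks"],          # newly introduced outcome
    "secondary": ["change in pain score at 8 weeks"],  # demoted primary
    "top_line_result": "responder rate at 8 weeks",
}

reg_all = set(registered["primary"]) | set(registered["secondary"])
pub_all = set(published["primary"]) | set(published["secondary"])

checks = {
    "1. Ranked objectives registered": bool(registered["primary"]),
    "2. Article primary consistent with registry": set(published["primary"]) == set(registered["primary"]),
    "3. Top-line result is the primary objective": published["top_line_result"] in registered["primary"],
    "4. All registered objectives reported": reg_all <= pub_all,
    "5. No extra, unspecified objectives": pub_all <= reg_all,
}

for label, passed in checks.items():
    print(f"{label}: {'PASS' if passed else 'FAIL'}")
```

Run on this example, the script flags exactly the behaviour described in the NEJM findings above: a newly introduced primary outcome, a demoted protocol-defined primary, and registered outcomes that were never reported.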

These five checks form a kind of backbone that ensures the integrity of the research effort.

Note: If an interesting, unanticipated result that the investigators deem important is found in a trial, it should still be reported – but not as the top-line result. It should be reported clearly as an exploratory finding and considered, at best, a new hypothesis and, at worst, a spurious finding.

Suggested reading:
Outcome Reporting in Industry-Sponsored Trials of…
Seroquel and Velligan
Can you trust your research literature?