Professor White & Colleagues “Regret” Ignoring Null Results in GETSET Trial Follow-Up

By David Tuller, DrPH

Last week, I wrote about the correction made to the “Highlights” section of the paper reporting the long-term follow-up results for the GETSET trial. (The trial was conducted by Professor Peter White, one of the three lead PACE investigators, and colleagues.)  I noted that this correction was not indicated or identified—a fact I attributed to the ambiguous status of “Highlights” sections, which could be viewed as more like news headlines than part of a paper’s core text.

But I was wrong—the official notice of a correction, called a corrigendum, has now appeared. For whatever reason, the correction itself was made before the corrigendum was posted. And unlike corrigenda that just state what was corrected, this one provided something of an explanation. Here is the full text:

The authors regret that they did not submit a revised highlights statement, when resubmitting their paper, in the light of revisions made to the final article. The revised highlights include a new statement, which reads: “There were no differences between interventions in primary outcomes at long-term follow up”. This is a substitute for: “Guided graded exercise self-help (GES) may lead to sustained improvement in fatigue.” The authors also made a minor correction to another highlight, which now reads: “The study showed that GES probably was cost-effective”.

The revised highlights are as follows:

• There were no differences between interventions in primary outcomes at long-term follow up.

• There was no evidence of greater harm after GES compared to specialist medical care at long-term follow-up.

• The study showed that GES probably was cost-effective.

• Most patients remained unwell at follow up; more effective treatments are required.

The authors would like to apologise for any inconvenience caused.

**********

First, I want to thank the Journal of Psychosomatic Research and its editorial team for pursuing this important issue. Professor White, who is on the journal’s advisory board, could not have been pleased at having to make such a correction—especially based on a complaint coming from me, given my active role in discrediting the PACE trial. Even so, the journal obviously decided that the concerns I raised were valid and well-founded. For Professor White, that must have stung.  

No one wants to correct a paper, just as no one likes having to correct a news article. In many or most cases, however, corrections do not undermine the entire message the authors hoped to convey. In this case, that is precisely the effect. The first sentence of the new “Highlights” section now clearly states that the study had null results; the previous first sentence ignored the bad news and falsely presented the results as positive.

One telling aspect of the corrigendum was the suggestion that the deceptive framing in the published “Highlights” section had been written for an earlier draft of the paper. Professor White and his colleagues thus presented their mistake as a failure to revise that section at the same time they revised the paper itself. The problem with this explanation is that an initial draft portraying the results along the lines of the now-deleted “Highlights” sentence would itself have seriously misrepresented the trial’s findings. Given the null results, it is troubling that anyone involved believed it was responsible or ethical to submit a draft, or a “Highlights” section, that downplayed them to such a degree.

The corrigendum also acknowledged a minor edit in the “Highlights” section sentence about cost-effectiveness. (I hadn’t noticed this change previously, and I’m not completely sure what the change is, since the corrigendum unfortunately didn’t include this detail.) In any event, the current version of the statement appears nonsensical, given the null results. In what way can a treatment be said to be “cost-effective” if it does not produce beneficial impacts? Cost-effective at what, exactly?

Professor White and his colleagues need to come up with some better logic on that one. As it is, the assertion that the treatment “probably was cost-effective” is essentially meaningless and makes the investigators look ridiculous.

As for the Journal of Psychosomatic Research, it has not disclosed whether or not the “Highlights” section was peer-reviewed. It goes without saying that the initial version of the “Highlights” section should never have been allowed to pass through the editorial process unchallenged by editors and peer-reviewers—especially if the paper itself had been revised to address related inadequacies, as the corrigendum indicates.

The journal should make clear whether it normally subjects “Highlights” sections to peer review. If so, what went wrong in this instance, and what will the journal do to ensure this sort of lapse does not recur? And if not, will the journal now insist that “Highlights” sections be subjected to peer review like other parts of the manuscript? I assume so—but full transparency would be welcome.

Comments


  1. CT

    We need more people like David to help put science to rights. Perhaps there should be university departments and courses that are devoted to it?

  2. Andrea

    The Journal of Psychosomatic Research is just vanity publishing for the misogynist boys network in a field of study that shouldn’t exist in 2021.

  3. JK

    If these clowns believe a demonstrably useless intervention is cost effective, I have a bridge to sell them. Thanks David for your good work in achieving this important and gratifying win 🙂

  4. Steven Lubet

    I doubt that the Highlights sections are peer reviewed. Those are the sorts of things that are added after acceptance. There is no point in writing Highlights unless the revised piece is being prepared for publication. Of course, an editor should still review the Highlights to make sure they are consistent with the body of the article.

  5. Rigmor

    «Cost-effective» seems to have been mixed up with «wasting good money»: they’ve been spending money on useless research. Every penny has an alternative use; it might have been spent on good, solid studies instead of this crap built on more crap. In Norway we call this behavior «throwing money out of the window». Hardly known as cost-effective.

  6. Kathrin

    I hope this dark chapter in medical history ends soon.
    I believe we should start speaking about basic human rights in the field of Myalgic encephalomyelitis.
    Enough is enough, and this nightmare has to end.

    Thank you, David, for your fabulous work, getting skeletons out of the closet! You are fab!

  7. Jen

    One day we can close the chapter on these disgraceful people. Their ‘regret’ is far from authentic and extremely late.

  8. Linda Sanday

    ‘Cost effective’ would surely be in there to be picked up by algorithms for certain health organisations.

  9. SusanC

    It does sometimes happen when you’re working to a tight deadline that you write most of the paper before you’ve got the experimental results, and then either (a) the results are not quite what you were expecting, or (b) the deadline is upon you and some of the experimental data that your draft talks about still doesn’t exist. A rapid rewrite of the paper is in order, and sometimes things get missed by the authors, who are in a hurry (i.e., they forget to fix some text that talks about data that either doesn’t exist or doesn’t show what they claim it shows).

    The referees absolutely should reject papers that have that kind of inconsistency.

    I think it’s a scandalous failure of reviewing that the original highlights got into print. (And if no one is checking what the highlights section says, that’s an even bigger failure of the system.)

  10. LB

    Are you able to push for a correction on the cost-effectiveness claim too, because this simply isn’t true and is misleading?

    That would suggest they’d welcome funding to treat people with null results.

    This seems to be a regular occurrence in things White and Chalder are involved in; for the NHS it’s a waste of time and money. For patients who have little energy and significant payback, the treatment will cause harm in the form of PEM.

    When will this madness end?

    Thank you for your work so far, it must have stung… a lot.

  11. Dawn

    To paraphrase Arthur Dent:

    “Ah, this is obviously some strange usage of the words “cost effective” that I wasn’t previously aware of.”

    Thank you, David, for your persistence in pursuing this misrepresentation of null results.

  12. Sarah

    Wild thing, I think I love you. Thank you for your bringing your tenacity, perspicuity, and erudition to this matter.

  13. Mónica

    David Tuller is my hero, a staunch defender of science with clear criteria, without bias and with real results, not tailored to fit previously conceived conclusions.

  14. Jen

    Great work, David!

    Can they actually say it’s cost effective without any data or figures to support this claim? How did they work this out? What are the costs and perceived benefits of an utterly useless therapy? It would be more cost effective not to subject patients to this nonsense therapy. Or are they saying it’s cheaper to just ignore a patient’s actual medical needs? That is not true either. Only effective treatments with measurable outcomes, such as being well enough to return to work in the long term, will actually be cost effective. Or are they somehow exempt because they added “probably”? Hardly scientific. Seems a bit dodgy!

  15. Alicia Butcher Ehrhardt, PhD

    The papers that cite this paper – all those who have been inconvenienced by the LIE – need to be corrected as well.

    Who is going to do this job?

    Wasn’t that the main problem, that other researchers listed this paper, with its incorrect information, as support for something they wrote – which is also wrong?

    Nice job for a grad student learning how lies propagate.

  16. Sue

    Alicia, you’re so right. This is what people with ME tried unsuccessfully to stop: Contamination of the research literature pool.

    It’s as if the scientists were swimming, as they usually do, in a pool full of knowledge. Then a group of hooligans came along and peed, then shat, in the pool. Legitimate scientists who could get out did so immediately. That’s what happened to ME/CFS research back then: competent researchers left the field to the BPS hooligans, whose “contributions” made the literature pool worse and worse. Today, David and others are calling on them to clean it up, but the BPS crowd has been lying so long they don’t know what truth is.