The PACE trial may, in the end, be about more than an injustice done to the chronic fatigue syndrome (ME/CFS) community. It may become, if the latest devastating criticism of it takes hold, exhibit number one in the medical community of how not to do a clinical trial.
It also presents a challenge to the orthodoxy of the medical profession. To think that a bunch of sick patients and a journalist working without pay could bring down an $8 million study and embarrass one of the most respected medical journals in the world. It shouldn't happen, but with the latest critique emanating from a statistician, one wonders how much time the PACE trial and Lancet have left.
Key Figures
The PACE trial has taken many shots, but the first shots - taken by ME/CFS patients - may have been the most important. Tom Kindlon, Graham McPhee and others in the chronic fatigue syndrome (ME/CFS) community laid the foundation for the trial's troubles. It wasn't simply that they objected to the trial's conclusions; it was how methodically they did so. The rigorous manner in which they presented their case gave their critique the legs it needed to get out of the community.
That happened in a big way when David Tuller blew the whole thing open with his multi-part series "Trial by Error" on Virology Blog last October. Tuller couldn't get a major media outlet to bite, so he published his huge piece on a small virology blog. You wouldn't expect it to get much traction there, but Tuller's exhaustively researched series began a firestorm that has not quit. Tuller is the second essential piece in the PACE saga. Without Tuller, PACE stands. After Tuller, who knows what will happen.

David Tuller may not get a Pulitzer Prize for investigating the PACE trial on a blog; but his service to—and we do not exaggerate—millions of sufferers around the world makes it hard for us to think of another work of journalism so deserving of commendation.
Jon Cohen followed Tuller's first blow with a short review in Science. In it he quoted one of the PACE authors, Michael Sharpe, as saying he doesn't think there's "a growing army of people upset about this". (Boy, was he ever wrong.)
In November six researchers demanded that Lancet review the 2011 study, James Coyne got into the act with a call for the release of the PACE data, and patients filed a Freedom of Information Act request for access to the same.
Julie Rehmeyer, a journalist with ME/CFS who recently won an award for excellence in statistical reporting, got the PACE trial into the mainstream media with a November piece in Slate. The Wall Street Journal picked up the ball in March with an Amy Dockser Marcus article, after 43 scientists asked Lancet to reanalyze the data and an 11,000-person MEAction petition was submitted.
Statistics Organization Speaks Out
All that was prelude, however, to the latest and perhaps most devastating blow to the PACE trial yet - an open critique from Rebecca Goldin, the director of Stats.org and Professor of Mathematical Sciences at George Mason University. Goldin's entry into the debate indicates that the PACE trial controversy is now bigger than ME/CFS; it's being held up and examined in the medical community as a case study of a major research effort gone wrong.

The question of how all this happened and how the criticism is being handled has sent shockwaves through medicine. The results from PACE (including these) have been published in prestigious journals and influenced public health recommendations around the world; and yet, unraveling this design and the characterization of the outcomes of the trial has left many people, including me, unsure this study has any scientific merit. How did the study go unchallenged for five years?
If this keeps up, the 600-plus person, $8 million PACE trial may end up making the textbooks in a way its authors could not have imagined: as a case study in how not to produce a clinical trial.
Goldin, who has no connection at all to ME/CFS but is very committed to statistical rigor in research, was scathing in her critique. While stating that "problems with the study (existed) on almost all levels", Goldin focused on just one area: study design. She doesn't just review the problems in the PACE trial - she dissects them, going through them step by step.
Selection Bias
Problems with "selection bias" led the authors of the trial to select the patients who would be most likely to benefit. Goldin reported that the patients in the trial were "more likely to suffer from depression, less likely to meet the clinical criteria for ME, more likely to be able to function a little bit, and less likely to have found an effective way to manage their illness."
The authors got the population they wanted. The strangest thing about the trial in the end, though, may not be all the shenanigans the authors went through to up the recovery rates of the participants, but how poorly the trial did even after doing so.
Shifting Recovery Criteria
"Dramatic" alterations made to the recovery criteria while the trial was underway made it easier for the authors to label patients "recovered". Goldin cited the now notorious alteration which made it possible for patients to meet the criteria for ME/CFS and be classified as recovered from it at the same time.
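To make the overlap concrete, here is a minimal sketch of the two thresholds in play. The revised "recovery" cutoff of 60 on the SF-36 physical function scale is stated in the article; the entry cutoff of 65 is inferred from the example patient described later, who enrolled with a score of 65. The function names are invented for illustration:

```python
# Hypothetical illustration of the overlapping thresholds Goldin describes.
# Recovery cutoff (60) is from the article; the entry cutoff (65) is an
# assumption based on the article's example patient, who enrolled at 65.

ENTRY_MAX_PF = 65      # a physical-function score at or below this counted as ill enough to enroll (assumed)
RECOVERY_MIN_PF = 60   # a score at or above this counted toward "recovery" (per the article)

def sick_enough_to_enroll(pf_score: int) -> bool:
    """Meets the physical-function entry criterion."""
    return pf_score <= ENTRY_MAX_PF

def counts_as_recovered(pf_score: int) -> bool:
    """Meets the revised physical-function 'recovery' threshold."""
    return pf_score >= RECOVERY_MIN_PF

# Every score from 60 through 65 satisfies both definitions at once:
overlap = [s for s in range(0, 101) if sick_enough_to_enroll(s) and counts_as_recovered(s)]
print(overlap)  # [60, 61, 62, 63, 64, 65]
```

A patient scoring anywhere in that band would simultaneously qualify as sick enough to enter the trial and "recovered" on this measure - which is exactly the paradox Goldin flagged.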
Goldin seemed nothing if not shocked by the manipulations of the physical functioning criteria. The physical functioning score required for someone in the trial to be considered recovered from ME/CFS was 60 - just five points lower than the normative score for 75 year olds. Those normative scores, by the way, are not from healthy 75 year olds - they're the average for all 75 year olds, healthy and sick.
With 39 being the average age of the PACE trial participants, the authors were essentially saying that if they could get a 39 year old to function at the level of a 75 year old, that person would be considered "recovered". (The normative value for a 39 year old is 93.) The PACE authors justified their criteria, but in doing so exposed a basic mathematical error: they confused median values with mean values.
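Why does confusing the two matter? Physical-function scores in the general population are skewed: most people score near the top, while a minority of sick respondents score very low. On skewed data the mean and median diverge, so a threshold justified with one as if it were the other lands in the wrong place. The sample below is invented purely to show the effect; it is not the trial's normative dataset:

```python
# Invented, left-skewed sample of physical-function-style scores:
# most respondents score high, a few sick respondents score very low.
from statistics import mean, median

scores = [100, 100, 95, 95, 90, 90, 85, 30, 20, 10]

print(mean(scores))    # 71.5 -> dragged down by the low-scoring tail
print(median(scores))  # 90.0 -> the typical respondent
```

With a skew like this, a mean-based figure sits well below the median, so treating a mean as if it were a median makes the "normal" benchmark look lower than it really is.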
Goldin goes through two more changes to the recovery criteria that occurred over time, including one so shocking that she asked the authors to confirm she had it right. (She did.)
A PACE "Recovery Story"
She then presented an example of a patient the PACE trial might have helped to "recover" from ME/CFS:

For the sake of illustration, we can imagine someone who came into the study with extreme and debilitating fatigue, scoring a 6 on the bimodal fatigue scale, and 65 on the physical function scale. The person meets the clinical definition of CFS according to the Oxford criteria, but is generally an upbeat person.

While in the trial, she has not improved in her overall fatigue or her physical function, but she is quite happy to be getting expert medical care, and she is also sleeping a little better thanks to sleep medications. She rates herself as “much improved”, but not “very much improved” on the CGI, but she records no differences in her assessments of physical function or fatigue. She still cannot walk a mile, remember her words, or hold a job. Yet she is a case of someone who has “recovered” thanks to the CGI.
Doomed Study
In this long piece Goldin goes on to do more analyses that poke holes in the study and the authors' reasoning. In the end, the best she can say about the trial is that it provides an example of how to throw $8 million down the tubes:

It seems that the best we can glean from PACE is that study design is essential to good science, and the flaws in this design were enough to doom its results from the start.
Lost Trust
The burning question now is what Lancet, the journal in which the original study was published, will do. Lancet is one of the most prestigious medical journals in the world, and studies published in it carry enormous weight. I recently read a book about a large effort to assess the world's health problems. Tens of millions of dollars were spent on it over several years. The authors' first goal was to get their publications into Lancet. Every analysis and every word was pored over with an eye to meeting Lancet's exacting standards - because, if it appears in Lancet, it's trusted.
The PACE trial results (which have been published in several journals) were trusted. In an editorial, "On PACE: An Editorial", published alongside Goldin's critique, Trevor Butterworth noted that the Independent's headline was “Got ME? Just get out and exercise say scientists”, and Medical News Today reported that “'Fear of exercise' is biggest barrier to chronic fatigue syndrome recovery”. Other media pieces Goldin cited were:

- “Psychotherapy Eases Chronic Fatigue Syndrome, Study Finds”—New York Times
- “Pushing limits can help chronic fatigue patients”—Reuters
- “Brain and body training treats ME, UK study says”—BBC
- “Therapy, Exercise Help Chronic Fatigue Syndrome”—WebMD
- “Helping chronic fatigue patients over fears eases symptoms”—Fox News
- “Chronic fatigue syndrome patients’ fear of exercise can hinder treatment – study”—The Guardian
- “Study supports use of 2 controversial treatments for chronic fatigue”—CNN
- “Chronic Fatigue Treatments Lead To Recovery In Trial”—Medical News
That's of course very troubling to the ME/CFS patients who have been stymied again and again by their inability to be active, let alone "exercise". That a study of this size, cost and potential impact could be so shoddily run, and then show up in one of the best medical journals in the world, basically blew science reporter Julie Rehmeyer's mind:

But when you dig down into the details, you find that the data doesn’t support the researchers’ claims. The most amazing problem (among many) is that “recovery” was defined so loosely that patients could get sicker over the course of the study and still be said to have recovered! But this study is considered top notch, gold standard work. The media has fallen down here too—no article in the mainstream media has ever seriously analyzed this study, even though patients are being injured by it regularly.

The whole thing has been really shocking for me. It’s had a huge impact on my perspective on science and the world as a whole, and there are a lot of controversial issues where my emotional stance has changed in a really big way.
That's a tough thing to lay on a journal that has built up an impeccable reputation over many years. With this critique from an established statistician, the stakes for Lancet and the other journals that published the PACE studies just got higher. Will they lance the growing PACE boil and let outside experts re-assess the studies, or will they continue on their way, an embarrassment to themselves and, if Goldin is right, to the medical profession itself?