
Facts are scared… er, sacred

12/07/2011
by Rob Findlay

Yesterday the Guardian Datablog (slogan: “Facts are sacred”) published a piece on NHS waiting times, which I am afraid I described on Twitter as being “truly awful”. @tobyhillman gently asked if I could “deconstruct in a blog, would be good to have comparison/ alternative presented alongside”. So that is what this post is all about.

The Guardian’s blog covered A&E, diagnostic and cancer waiting times too, but I’ll stick to my specialist subject of referral-to-treatment (RTT) waits. The Guardian’s piece promises:

New data journalism shows exactly how much worse NHS waiting times have got

But my main complaint about their piece is that it lacks any insight into the underlying dynamics of waiting times, and is therefore ill-equipped to deliver this promise. For instance:

For complex statistics, it’s often possible to extract the data in different ways to support different positions. With issues like NHS waiting times, the issues are rarely simple: as is happening at present, it is possible for the waiting time of a “typical” patient to fall, while thousands of extra patients face waiting times longer than those mandated in the NHS constitution.

This statement is true, but fails to expose the crucial point that reducing the waiting time of a “typical” patient can directly cause thousands of extra patients to wait longer, if lots of patients start jumping the queue without good clinical reason. Badly set targets can cause this in real life: if you start forcing down the “typical” (i.e. median) waiting time with targets, as the NHS is now doing, then you are encouraging queue-jumping by routine patients and therefore putting upward pressure on long waits.
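This queue-jumping dynamic is easy to demonstrate with a toy simulation. All the numbers below (weekly referrals, treatment capacity, the two policies) are illustrative assumptions of my own, not actual NHS figures or scheduling rules; the point is only to show the mechanism.

```python
def simulate(policy, weeks=104, arrivals=100, capacity=95):
    """Toy waiting-list model: each week 100 patients are referred but
    only 95 can be treated, so a backlog builds up. The 'policy' decides
    who gets treated first."""
    waiting = []        # weeks waited by each patient still on the list
    treated_waits = []  # weeks waited by each patient at treatment
    for _ in range(weeks):
        waiting.extend([0] * arrivals)       # new referrals join the list
        if policy == "longest_first":
            waiting.sort(reverse=True)       # clear the backlog oldest-first
        else:                                # "median_chasing"
            waiting.sort()                   # routine new patients jump the queue
        treated_waits.extend(waiting[:capacity])
        waiting = [w + 1 for w in waiting[capacity:]]  # everyone else waits a week
    treated_waits.sort()
    median_treated = treated_waits[len(treated_waits) // 2]
    still_waiting_over_18 = sum(w > 18 for w in waiting)
    return median_treated, still_waiting_over_18

print(simulate("longest_first"))   # modest median, nobody left over 18 weeks
print(simulate("median_chasing"))  # median of zero, hundreds stuck over 18 weeks
```

Under the same capacity shortfall, treating the newest routine patients first drives the median wait of treated patients to zero while the untreated backlog ages past 18 weeks; treating the longest waiters first gives a higher median but leaves no long waiters at all.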

The Guardian’s “Treatment within 18 weeks” panel is where they pull the numbers out:

Measure: Number of patient waiting over 18 weeks for hospital treatment
Change year-on-year: +11% (by patient numbers), or +24% (by percentage of patients affected) (April 2011 versus April 2010)
Patient numbers: 2,387 more people had waited more than 18 weeks for their treatment in April 2011 versus April 2010, despite the number of procedures carried out dropping by over 29,000
Description: The key waiting time measure introduced under New Labour, patients are guaranteed under the NHS constitution that they will receive their hospital treatment within 18 weeks of GP referral. The coalition government reaffirmed its commitment to this target as part of the NHS listening exercise, where it was one of David Cameron’s five pledges on the NHS.
Source: 18-week waits: “Referral to Treatment Waiting Times Statistics, Adjusted Admitted Pathways (Provider data)”, Department of Health

The first problem is that the NHS constitution guarantee does not cover the “Number of patient (sic) waiting over 18 weeks for hospital treatment”. It covers the waiting times of those patients lucky enough to be treated, not those patients still waiting. The Guardian’s instincts were right, though: the target should be based on those still waiting. But the Guardian has missed the big story again, which is that the Government is measuring the wrong thing: they are measuring the rate at which the backlog is being cleared, not the backlog itself. Even worse, they have set targets that limit how fast hospitals can clear the backlog. So the whole thing is upside down.

Secondly, the Guardian are being somewhat selective over their timeframes. April 2010 is close to the low point on several waiting time measures, so all those measures will look worse by comparison. The complete time trend tells a more interesting story of deterioration over the winter, followed by improvement.

England waiting time trends - all specialties

Finally, I don’t think they even have their numbers right. The increase in over-18-week adjusted admitted pathways over that timeframe is 2,353, not 2,387. It’s only a small difference, but a difference nonetheless. They quote the source as being provider data, which might explain the discrepancy; if you want to quote England-wide statistics you should use the commissioner data, not the provider data.
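It is also worth unpicking how the Guardian’s panel can quote two different year-on-year percentages for the same change. The figures below are made up for illustration (the Guardian does not publish the underlying totals in its panel), but they show how a rise in the over-18-week count reads as a bigger rise in the *percentage* of patients affected when the number of patients treated is simultaneously falling.

```python
# Hypothetical illustrative figures -- NOT the actual published counts.
# Chosen only to show how "+11% by patient numbers" and roughly "+24% by
# percentage of patients affected" can describe the same change when the
# number of treated patients is also falling.
treated_2010, over18_2010 = 300_000, 21_400
treated_2011, over18_2011 = 271_000, 23_753  # ~29,000 fewer treated; 2,353 more over 18 weeks

# Change by patient numbers: growth in the over-18-week count itself
by_numbers = over18_2011 / over18_2010 - 1

# Change by percentage affected: growth in the over-18-week *rate*,
# inflated further because the denominator (patients treated) shrank
rate_2010 = over18_2010 / treated_2010
rate_2011 = over18_2011 / treated_2011
by_rate = rate_2011 / rate_2010 - 1

print(f"by patient numbers: +{by_numbers:.0%}")   # about +11% with these figures
print(f"by percentage affected: +{by_rate:.0%}")  # about +23% with these figures
```

The same absolute increase always produces a larger percentage change on the rate measure whenever activity is falling, which is why the two headline figures diverge.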

Let’s end on a positive. This morning saw a rare example of mainstream journalists tracking the right measure (patients who are still waiting) instead of just following the numbers the government prefers. More of that, please, and well done the BBC.