Civitas

Putting the record straight

Nick Cowen, 17 September 2007

When questioned yesterday on Sunday AM about the report by Sir Derek Wanless, released last week by the King’s Fund, on how effectively the NHS had spent its money, the Health Secretary, Alan Johnson, referred to a recent study by the Commonwealth Fund:
“There was a recent study by the Commonwealth Fund which is the independent organisation in America that compared six health services in the developed world – Canada, Australia, US, Germany and New Zealand and ourselves. We came out top. We came out top on efficiency. We came out top on quality. We came out top on fairness, on equity.”


This is true. But the study is of highly questionable merit.
For one thing, what Mr Johnson didn’t mention is that Britain comes fourth on health outcomes and fourth on access (largely because waiting times are still much longer than in all the other countries: 41% of British patients reported waiting four months or longer for elective operations in the last year, compared with just 6% in Germany).
And is the NHS really the most efficient, providing the highest quality care? This seems scarcely credible, given the damning statistics provided by Sir Derek.
It is worth reminding ourselves of a few rather startling ones relating to efficiency and productivity: between 1998/99 and 2005/06, spending on elective and emergency services rose by 56%, yet activity increased by only 12%. Wanless calculated that had unit costs increased in line with what he thought reasonable, the NHS could have seen an extra 1 million patients in 2005/06. Real unit costs for outpatient and mental health services also increased significantly. This is not the hallmark of an efficient service.
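To see why these figures point to falling productivity, a back-of-the-envelope calculation (a sketch using only the two percentages quoted above; the implied unit-cost figure is our own arithmetic, not Wanless’s) gives the rise in real unit costs they imply:

```python
# Rough consistency check of the figures quoted above (assumed inputs).
spending_growth = 0.56   # growth in elective/emergency spending, 1998/99-2005/06
activity_growth = 0.12   # growth in activity over the same period

# If spending = unit cost x activity, the implied growth in real unit costs is:
unit_cost_growth = (1 + spending_growth) / (1 + activity_growth) - 1
print(f"Implied real unit-cost growth: {unit_cost_growth:.1%}")  # roughly 39%
```

On these figures, each unit of activity cost roughly 39% more in real terms at the end of the period than at the start.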
It is also interesting that the Commonwealth Fund survey ranks the US bottom for efficiency. Yet a study by Feachem et al., reported in the BMJ in 2002, found that if the NHS had the same average of acute bed days (even adjusted for the higher proportion of the population aged over 65) as Kaiser Permanente, an American HMO serving California, then it could save up to 40 million hospital days, or £10bn, per year. And this was before the massive unit-cost increases.
Quality is admittedly a trickier matter: what constitutes quality is somewhat subjective and, given the lack of decent data in this field, comparisons both over time and internationally are extremely difficult (if not impossible). In fact, what Wanless takes as ‘quality’ of care is ‘primarily improvements in health outcomes’; if so, then ‘health outcomes’ in the Commonwealth study becomes ‘quality’, and the UK is fourth, not top, on this measure.
The Commonwealth Fund instead argue that quality is not about health outcomes but about process: ‘right, safe, coordinated and patient-centred care’. Thus they include measures such as whether doctors are prompted by a computer to give patients a test result, the likelihood of getting an infection in hospital, blood pressure checks, the use of non-physician clinicians in primary care, and arrangements for follow-up visits on discharge from hospital. It is on the combination of such measures, not on quality as health outcomes, that the UK ranks top.
But even this is questionable. I’m not sure I would weight whether a doctor receives prompts from a computer (on which the UK ranks top) the same as getting an infection in hospital (on which the UK ranks bottom), as the Commonwealth Fund do. The measures of quality reported here are also heavily biased towards primary care, an area where the UK has historically performed better. And in the ‘coordinated care’ subsection of quality the UK comes top, yet in the Feachem et al. study one of the reasons highlighted for the NHS’s comparatively weak performance against Kaiser Permanente was that ‘Kaiser [unlike the NHS] has achieved real integration through partnerships between physicians and administration and can exercise control and accountability across all components of the healthcare system’.
The point is that while the results on each individual criterion are interesting, the sum of the parts does not make up ‘quality’.
And the same goes for efficiency. As the Economist wrote: ‘the efficiency rating is based on an eclectic range of indicators, such as health-care spending as a share of GDP and whether patients visit hospitals’ emergency departments for conditions that could have been treated by family doctors. The individual comparisons may be revealing, but the sum of the parts does not measure efficiency, which is how effectively a system turns inputs into outputs.’ Taking the percentage of national health expenditure spent on administration and insurance (a measure on which the UK comes second), for example, is not a simple measure of efficiency: it is perfectly possible that by spending more on insurance in a competitive environment you actually get disproportionately higher standards of care and better health outcomes.
Interestingly, too, the UK does well on ‘visited A&E for a condition that could have been treated by a regular doctor had he/she been available’ – yet this is something to which Wanless drew particular attention: emergency admissions showed a net increase of around 1.6 million (35%) between 1998 and 2005.
The Commonwealth Fund study should not be used as an escape clause. The Wanless study is a painful tonic for the government, and they know it.

James Gubb
