
The war over Jarman’s data



Prof Brian Jarman

1. Introduction

The fiasco over the HSMR is now of totemic proportions.

In many ways, the battle over Wednesday's Channel 4 headline story, claiming "for the first time" that hospital mortality rates in the UK were considerably higher than in other jurisdictions, has become more totemic than the notorious "War over Jennifer's Ear". To give him credit, Prof Brian Jarman (@Jarmann) is always very helpful on Twitter. There can be a temptation to play the man and not the ball, but Jarman has a very distinguished professional record as a member of the medical profession at the St. Mary's Hospital/Imperial College centres in London. By his own admission, his first degree was in physics, and his PhD was on Fourier analysis of seismic wave propagation; he changed to medicine at the age of 31. There is absolutely no doubt that Prof Brian Jarman is an incredibly bright man. On the basis of the presumption of innocence under English law, it is critical to assume that Prof Jarman did not intend to mislead the general public through the reporting of his claims over hospital mortality statistics. This is indeed consistent with the position of excellent patient safety campaigning groups, who overall feel that a dispute over the statistics is a genuine affront to the misery of the relatives and families who have suffered. There is perhaps an issue of whether Jarman was reckless in how such data might be presented to a fearful public; and how Jarman's data are presented in the media in future, following the Telegraph and similar reports and this week's Channel 4 presentation, might be the best surrogate guide to his true intentions.

The Channel 4 website report, “NHS hospital death rates among worst, new study finds“, is here.

“NHS chief Sir Bruce Keogh says he is taking very seriously figures revealed by Channel 4 News which show that health service patients are 45 per cent more likely to die in hospital than in the US.”

According to this Channel 4 website report:

"The figures prompted Sir Bruce Keogh, medical director of the NHS, to say he will hold top-level discussions in a bid to tackle the problems. "I want our NHS to be based on evidence. I don't want to disregard stuff that might be inconvenient or embarrassing…I want to use this kind of data to help inform how we can improve our NHS," he told Channel 4 News. "I will be the first to bring this data to the attention of clinical leaders in this country to see how we can tackle this problem.""

2. The methodology of the “hospital standardised mortality ratios” (HSMRs)

The methodology of the ‘hospital standardised mortality ratios‘ is well known to Sir Bruce Keogh. The wider criticisms of the HSMR have been extensively discussed in the medical press; see for example “Using hospital mortality rates to judge hospital performance: a bad idea that just won’t go away” (in the BMJ, April 2010).

In the widely publicised “Review into the quality of care and treatment provided by 14 hospital trusts in England: overview report”, published earlier this year, it is stated:

Keogh HSMR

Jarman’s ‘discovery’, in a flourish worthy of ‘Gone with the Wind’ perhaps, is described thus:

“What he found so shocked him, he did not release the results. Instead, he searched – in vain – for a flaw in his methodology and he asked other academics to see if they could find where he might have gone wrong. They, too, could not find fault.”

The question remains: what happened in the intervening nine years which prevented any activity culminating in professional peer review?

3. These data were indeed presented as far back as 2004 

The Channel 4 reporting was sensational as illustrated in this excerpt from the Channel 4 report by Victoria Macdonald:

“So now he is releasing the findings. And they are shocking. The 2004 figures show that NHS had the worst figures of all seven countries. Once the death rate was adjusted, England was 22 per cent higher than the average of all seven countries and it was 58 per cent higher than the best country.”

The data are even known to the Department of Health as evidenced by Jarman himself here:

A tweet shared by Brian Jarman provides a link to the presentation of the data in 2004.

The starting point of the report was to go to the Mayo Clinic:

“Because of confidentiality issues we are not allowed to name the other countries. But America stands out in the data for its lower mortality rates. So we went to find out why. At the Mayo Clinic Hospital in Phoenix, Arizona, they are in the best two per cent in the country. It is an impressive hospital, with piano music playing in the lobby and sunshine streaming into the rooms.”

4. Concerns about the data

Any (reasonable) reviewer of these data, particularly given the bold and significant nature of the claims therein, will be concerned about the ‘quality’ of the data, in much the same way Jarman is concerned about the ‘quality’ of healthcare. This is particularly so if one adopts the approach of ‘treating the data’ rather than ‘treating the clinical patient’, which many clinicians would not advocate anyway in isolation.

This ‘confidentiality issue’ about the manner in which these data were provided is confirmed in a recent tweet by Jarman:

In the absence of clarity about what these confidentiality issues precisely are, it is hard to deduce fully what exactly is so prohibitive about Jarman publishing his data. Many senior academics indeed converge on the notion that publication of these data would be a useful contribution to the field, provided the publication were properly refereed from two perspectives: first, that the statistical techniques used are sound; and second, from a clinician’s perspective, that the correct public health policy issues have been identified and analysed correctly using available global evidence, with citation of relevant background assumptions and confounding factors. Whilst it is hard to conceive of Governments withholding publication of data on public policy grounds (and arguably there is no more important a policy issue than a country’s mortality), it is possible that individual private companies may not wish to disclose fully confidential data. Even in the private sector, such non-disclosure of confidential data has led to accusations of fraud and discussion of mitigation, because of the sensitivity of such data to the markets under dividend signalling theory. Academics have suggested to Jarman on Twitter that it might be possible to publish these data using the techniques of anonymisation or pseudonymisation, as is prevalent in contemporary scientific research, but again no answer has been readily provided.
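For what it is worth, pseudonymisation of the kind suggested is technically straightforward: identifiers are replaced by keyed hashes so that records remain linkable across a dataset without exposing identity. The sketch below is a minimal illustration; the key, field names, and NHS-number value are illustrative assumptions, not anything Jarman or Dr Foster actually use.

```python
# Pseudonymisation sketch: replace a patient identifier with a keyed
# (HMAC-SHA256) hash. The same input always maps to the same pseudonym,
# preserving record linkage, but the identity cannot be read back
# without the secret key.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-held-key"  # hypothetical key

def pseudonymise(identifier: str) -> str:
    """Deterministic keyed hash: same input -> same 16-hex-char pseudonym."""
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# Illustrative record with a made-up NHS number
record = {"nhs_number": "9434765919", "outcome": "discharged"}
record["nhs_number"] = pseudonymise(record["nhs_number"])
```

A keyed hash (rather than a plain one) matters here: without the secret key, an attacker cannot simply hash every possible NHS number and reverse the mapping.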

5. The utility (and futility) of cross-jurisdictional comparisons

The financial situation of the Mayo Clinic itself is known to be strong, with a colossal amount of income per patient at the Mayo Clinic compared to a patient in the English National Health Service:

“But a copy of the clinic’s consolidated financial report obtained by the Pioneer Press shows total revenue of $8.48 billion in 2011. That was an increase of $533 million, or nearly 7 percent over revenue of $7.94 billion during the previous year. Income from operations increased by 18 percent, growing from $515.3 million in 2010 to $610.2 million last year, according to the financial report. “This is very strong,” said Steve Parente, a professor of finance at the University of Minnesota’s Carlson School of Management. “In terms of pure operations, they’re doing quite well… Their grants and contracts are going up, too,” said Parente, who reviewed the financial report.”

A pertinent issue is whether the hospital episode statistics are themselves reliable. Prof Jarman, quoting other reports, feels that they are (see tweet). However, this is in contradistinction to other reports sourced at the Royal College of Physicians of London (as described in a previous Guardian article):

“Currently the public can use the NHS Choices website to help them choose a hospital for treatment. NHS Choices, and the information used by Dr Foster, is based on “Hospital Episode Statistics” (HES) data, which the NHS says is “authoritative and essential”. However, NHS insiders say the information, usually collected by administrative staff from patient records, is unreliable. Professor John Williams, director of the health information unit at the Royal College of Physicians, carried out a study into HES data and found a significant number of operations were recorded inaccurately. He has called for a change in the way data is collected, saying flaws in the HES database were exposed as long ago as 1982.”

Hospital death rates, particularly if followed over time, can give useful warning of problems, as Sir Bruce Keogh has stated. Dr Jacky Davis in a subsequent Channel 4 interview (see below) was asked about the ‘smoke detector’ use of HSMRs as being pivotal in warning about problems.

It is, for example, argued that the issues in Mid Staffs would not have been exposed but for the HSMRs. This argument is relatively convincing, but it is also argued that any reader of the local papers in Mid Staffs would have been aware of the problem long before the official HSMR figures emerged. Dr Jacky Davis, in her C4 interview on Thursday, admitted in reply that this ‘smoke detector argument’ might be true, but explained further: ‘if there’s smoke going up you have to make sure – is there a fire there?‘ This is intuitively true, as is the notion that the mortality rate in any given hospital will depend on the number of people who are actually dying ‘on site’. In general, of course, variation of data across jurisdictions is likely to relate to regional variation in the care of palliative care patients, for example, and also to where the funding for medical care is coming from (the location of death may be an artefact of the conditions of private insurance funding, for instance).

However, reliable comparison of such data internationally is arguably fraught with difficulties, as Prof Walter Holland and colleagues found when they published the European Atlas of Avoidable Deaths in 1988. Prof Holland is an eminent expert in public health and a visiting Chair at the London School of Economics. Since total mortality rates in the UK are similar to, or lower than, those in the United States, and a large proportion of the population in both countries die in a hospital, such a difference seems unlikely, according to Holland. To compare hospital mortality rates between hospitals, whether in one country or more, it is arguably necessary to take into account such factors as the availability of discharge facilities (e.g. hospices), hospital admission criteria, length of stay, and many other procedural and cultural factors.

Comparison of outcomes for individual conditions is even more difficult because of differences in diagnostic and coding procedures, which have been illustrated many times. In a famous paper, “Evidence of methodological bias in hospital standardised mortality ratios: retrospective database study of English hospitals”, published in the BMJ in 2009, this coding problem was identified thus:

“Our findings suggest that the current Dr Foster Unit method is prone to bias and that any claims that variations in standardised mortality ratios for hospitals reflect differences in quality of care are less than credible. Indeed, our study may provide a partial explanation for understanding why the relation between case mix adjusted outcomes and quality of care has been questioned. Nevertheless, despite such evidence, assertions that variations in standardised mortality ratios reflect quality of care are widespread, resulting, unsurprisingly, in institutional stigma by creating enormous pressure on hospitals with high standardised mortality ratios and provoking regulators such as the Healthcare Commission to react.”

In our jurisdiction, for example, the proportion of surgery done as elective day cases is reported to be quite high. Unfortunately, the data presented by Jarman in his original 2004 presentation and in the Wednesday Channel 4 package are inadequate on their own for anyone to determine the validity of the conclusions or the methods used. This is a striking media story, but unfortunately one that has not been subjected to proper scrutiny. Such scrutiny will be essential if English health policy, particularly on the critical issue of patient safety, is to progress; and progress it must not on the basis of emotion or soundbites.

6. A wider issue 

Prof Jarman has consistently stated that he is in favour of a NHS funded by taxation, which is different from the attitude of libertarians and the Mayo Clinic, as shown here:

Critics of Jarman have felt that his criticisms of some foci of the delivery of care undermine the daily work of clinicians across the country, and may even undermine the doctor-patient relationship so pivotal to the work of clinicians in real life (see for example here).

Jarman and supporters feel, however, that criticisms of poor quality of care in the NHS have for ages been suppressed, and that there is a professional duty to speak up about such poor care in order to improve the NHS. However, critics of Jarman feel that such an approach ‘in the wrong hands’ merely provides ammunition for further marketisation and privatisation of the NHS, and that Jarman is doing the ‘handiwork’ of politicians seeking to bring this ideological goal to fruition.

This debate is intense, with accusations and counter-accusations, but the volume of criticism of the NHS article headlining Channel 4 News on Wednesday is a testament to how the public will not accept unqualified comparisons of the NHS with other jurisdictions either. The ensuing analysis has further put the spotlight on Jarman’s data, but this can be no bad thing. The medical and nursing professions, and their regulatory bodies, will wish for all involved to maintain the reputation of, trust in, and confidence in their work, but this is only possible if the work is ‘fit for purpose’. Politicians can slag off the NHS to an extent for which clinicians would likely be disciplined, and it is this contradiction which raises eyebrows.

7. Conclusion

Notwithstanding all this, it is clear that the Channel 4 report was perceived by some as irresponsible journalism: its assumptions were not accurately stated; it suffered from bias by omission through lack of explanation of the surrounding issues (not least presenting Keogh as somebody who tacitly endorsed uncritical use of the HSMR, which is false); and it basically left a very ugly taste in many people’s mouths. It did no favours for the reputation of, trust in, and confidence in Channel 4’s news reporting that night (Wednesday), but in fairness to Channel 4 News they hosted an interview with @DrJackyDavis, Co-Chair of the NHS Consultants Association, the following evening (Thursday).

However, it is impossible to deny the value of the discussion which has ensued from this dreadful report. The policy situation is precarious, aggravated by certain Ministers of the Crown in the current Government lying blatantly over a period of months throughout the whole saga.

  • http://ideb8.wordpress.com ideb8

    As “Jarman and supporters feel..that for ages criticisms of poor quality of care in the NHS have been suppressed, and that there is a professional duty to speak up about such poor care to improve the NHS” why must the public also then be deceived by headline lies?

    Why are headline lies allowed to fester without challenge? If the Miliband brothers are justified in feeling impelled to challenge headline lies, why doesn’t the country feel so impelled in relation to yet another straw being placed on the back of the NHS?

    The public are becoming more and more aware of how politicians and newspapers, among others, deceive them. And more and more angry about it.

    Tony Blair deceiving us to join a war. Clegg deceiving students he’d not support fees. MPs deceiving us about expenses and links with journalists and the police. Newspapers deceiving us about phone-tapping, bribery and links with politicians and the police. The Police deceiving us about Hillsborough and links with politicians and journalists. Banks deceiving us about their products and charges. The Government deceiving us about privatising the NHS.

    It makes the blood of most people boil.

    Then Prof Brian Jarman claimed and promoted figures of thousands of NHS deaths and welcomed the publicity this generated, as it helped him focus media attention on parts of the NHS he regarded as failing.

    Although admitting the numbers didn’t represent needless or avoidable or unnecessary deaths, he only informed the journalists of this fact by prior email or phone call, as ‘caveats’.

    Again and again, misleading and distorted figures have been deliberately disseminated, even welcomed, as if their corrosive effect on NHS staff morale will be beneficial:

    Not once did he challenge the headline writers after publication or demand they retract his name in support of them or at least distance him from their distortion of his stats.

    But then, why should he? The publicity was useful.

    He should have because deceiving the public is not only likely, given recent history, to increase the risk that superheating might increase our “boiling” – explosively! – it also demonstrates clearly to us that a respected academic, not just a politician like Blair, feels free to stoop without scruple to achieve their goal, regardless of the means employed to do so.

    He also should have because, although not affiliated to the Royal Statistical Society, the health data entrusted to him via DFI from the NHS gives him a privileged position, an influential position which ought to entail at least compliance with the RSS Code of Conduct.

    Their codes 5 & 6 include:

    5. “..shall not purport to exercise independent judgement..on any..service in which they knowingly have any interest, financial or otherwise”

    6. “..should not ALLOW any misleading summary of data to be issued in their name..”

    [my caps]

    Shouldn’t we as members of the public, as voters, as taxpayers, as NHS beneficiaries, as the subjects of incendiary statistics relating especially to our health and that of our children, demand more than just the minimum professional conduct expected from a respected scholar in his exclusive position?

    “..it is impossible for outside observers to verify the analysis and interpretation, especially when the stories are trumpeted by media with an apparent vested interest in running down the NHS. This inevitably breeds suspicion and scepticism”
    http://www.bmj.com/highwire/filestream/663536/field_highwire_article_pdf/0/bmj.f5775

    What has the RSS to say on the relentless headlines fostered by DFI & welcomed as helpful by their source?

    Why does it take the ASA to do so?

    Apart from a few, where have most RSS members & fellows, both current and honorary, spoken up in unison against this regular misuse of statistics in headlines designed to deceive the public – headlines which, by so distorting the debate on our current health system, skew any reasoned debate on its future shape?

    When are we going to stand, like the wronged Miliband brothers in arms, to challenge past and future slurs, smears and machete lies against our struggling and nearly broken-backed NHS?

    “All that is necessary for misquotes to triumph is for good scholars to do nothing”
    – Burke’s “Peerage”?

  • http://askmy.mp iDeb8

    Re the quote in comment above:

    “..it is impossible for outside observers to verify the analysis and interpretation, especially when the stories are trumpeted by media with an apparent vested interest in running down the NHS. This inevitably breeds suspicion and scepticism..”

    Apologies for broken BMJ link, this one should hopefully be better:

    http://www.bmj.com/content/347/bmj.f5775?ijkey=OQh3Y0VIh2pugxJ&keytype=ref

    [by Prof Spiegelhalter FRS, Winton Professor of the Public Understanding of Risk & Fellow of Churchill College, Cambridge, ISI highly cited researcher, 34th most-cited mathematical scientist in world over last 10 years]
