Journal of Medical Law and Ethics (JMLE) 2016/1 (March)

Mid Staffs: Disaster by Numbers (or ‘How to create a Drama out of a Statistic’)?

John Rumbold - Postdoctoral Research Fellow, Kingston University London, Sarah Seaton - Medical Statistician, University of Leicester*


The campaign about poor care at Mid Staffordshire NHS Foundation Trust culminated in a statutory public inquiry. There were both qualitative data about poor care and quantitative data about mortality rates. The campaign focussed initial press coverage on the excess mortality figures with reports talking about hundreds of ‘unnecessary deaths’. This paper looks at the basis for those figures and their role in judging the quality of healthcare, the admissibility of expert evidence on HSMR figures, and whether raised HSMR or SHMI adjusted mortality rates have any probative value in clinical negligence claims.

Introduction

In February 2013, the long-awaited report of the Mid Staffordshire NHS Foundation Trust (MSFT) Public Inquiry (hereafter referred to as ‘the Francis Report’,R. Francis, Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry (London: The Stationery Office, 2013). bearing in mind this was in fact his second report on the Trust) was published. It consisted of three volumes and a separate volume for the ‘Executive Summary’, itself 115 pages long. Francis made a total of 290 recommendations. With the breadth and depth of material covered, inevitably attention focussed on the material that appeared to be most easily digested and understood by the public. There were stories of neglect which shocked the public, but the headline stories were that over a thousand patients had died as a result of poor care, and that patients were so thirsty that they were forced to drink from flower vases (see below). Both these allegations are at best unproven, and most importantly were not conclusions of the Francis Report. This article will examine the difficulties with the interpretation of the figures for excess mortality produced by the Hospital Standardised Mortality Ratio (HSMR) method.

Distortions and Misunderstandings in the Media Reporting

The figures for excess mortality were a central element of the story for the media,A graphic illustration of this is that the junior reporter for the Express & Star who first reported the problems at the Trust could cite from memory the page in the original Francis inquiry report where the relevant figures were. However, he was unable to state the correct meaning of the figures. even though Francis had clearly stated:

 

‘Taking account of the range of opinion offered to the Inquiry, including a report from two independent experts, it has been concluded that it would be unsafe to infer from the figures that there was any particular number or range of numbers of avoidable or unnecessary deaths at the Trust.’R. Francis, ‘Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry: Executive Summary’, Report No. 947 (London: The Stationery Office, 2013), para 74.

 

Explanations of how these figures should be interpreted and the caveats about their use were apparently ignored by the media. Lawyers too have misunderstood the meaning of the excess mortality figures. Brazier commented that:

 

‘many of those responsible at Mid Staffordshire, responsible for leaving patients screaming in pain for hours and contributing to between 400 and 1200 deaths, will not be prosecuted.’Margot Brazier, ‘Medical neglect law needs shot in the arm’, www.law.manchester.ac.uk/about/news/display/?id=9893, accessed 29 September 2014.

 

A Michelmores lawyer commented on Twitter that:

 

‘NHS #HSMR mortality rates ‘should be ignored’ Prof Black said in Feb but haven’t they flagged up dangerous hospitals?’Laurence Vick @LaurenceVick, https://twitter.com/LaurenceVick/status/565216483614392321, accessed 10 February 2015.

 

One newspaper even ‘monstered’ a respected public health doctor who had been explaining the problem with this misuse of the HSMR statistics on Twitter, to the annoyance of campaigners (the article was removed subsequent to a complaint to the Press Complaints Commission).Press Complaints Commission, ‘Complaint against Daily Mail resolved’, www.pcc.org.uk/news/index.html?article=OTAyNg==, accessed 12 October 2014. Nurses were abused on social media as ‘killers’, and demands were made that criminal prosecutions be brought and that the Chief Executive of the NHS, Sir David Nicholson, resign.

Taylor analyses the difficulties with the designation of ‘excess’ or ‘avoidable deaths’, commenting:

‘Subtracting the expected number of deaths from the actual number of deaths in a hospital over a period gives a measure of the ‘excess deaths’. These are sometimes characterized as ‘avoidable’ deaths. Of course, if all hospitals were equally effective, random variation would mean that ‘avoidable’ deaths would be detected in half of them.’P. Taylor, ‘Standardized mortality ratios’, International Journal of Epidemiology 42 (2013), 1882-1890.

 

Equally, the concept of ‘excess deaths’ implies that in the better hospitals there are ‘excess lives’. Taylor points out another flaw in basing the comparison on the average:

 

The fact that 11 trusts are ‘outliers’ in terms of having an unexpectedly low mortality rate has received rather little attention, but it suggests that perhaps there is weakness in an approach which focuses on comparing bad hospitals with average ones, since clearly even average hospitals could be improved.

 

The initial investigation of the Trust by the Healthcare Commission was triggered by a high HSMR at MSFT. There were numerous accounts of poor care in the period 2007-2008, and it has been recognised that financial concerns had eclipsed clinical concerns at one stage. One particular anecdote which was repeated many times in reports was that ‘patients were so thirsty that they drank from vases’.Daily Mail Reporter, ‘Patients at scandal-hit hospital “forced to drink from vases after being left on ward without water”’, Daily Mail (London: 23 November 2010), www.dailymail.co.uk/news/article-1332070/Stafford-Hospital-inquiry-Patients-left-water-forced-drink-vases.html, accessed 12 October 2014; D. Cameron, Prime Minister’s speech to the House of Commons (6 February 2013), https://www.gov.uk/government/speeches/francis-report-pm-statement-on-mid-staffs-public-inquiry, accessed 13 February 2014. However, this was disputed, as flower vases had been banned from the wards of MSFT for some years on hygiene grounds.This was confirmed in writing by current hospital management. Francis concluded that these episodes were unproven,R. Francis, ‘Independent inquiry into care provided by Mid Staffordshire NHS Foundation Trust January 2005-March 2009’, Report No. 375 (London: The Stationery Office, 2010), 48. and there was no mention of them in the Healthcare Commission’s reportHealthcare Commission, Investigation into Mid Staffordshire NHS Foundation Trust (Commission for Healthcare Audit and Inspection, 2009). (which led to retractions by some newspapers).For example, a retraction was printed with this story ‘Mid Staffs whistleblower Julie Bailey: “I don’t go out here on my own any more”’ stating ‘We have been asked to make it clear that the inquiry into failings at the hospital, conducted by Robert Francis QC, did not hear any direct evidence about any incident of patients forced to drink water from flower vases’, www.theguardian.com/society/2013/oct/27/julie-bailey-mid-staffordshire-nhs-whistleblower, accessed 13 February 2015. The two episodes that have been identified involve an elderly confused patient and a patient who was on fluid restriction for medical reasons – in other words, his lack of water was reflective of good nursing care rather than neglect. His widow mentions this restriction in the interview she gave to the BBC.BBC News, ‘My husband drank from a vase while a patient at Stafford’, www.bbc.co.uk/news/health-21936591, accessed 18 December 2014. Other stories of neglect have been refuted. For example, the receptionists were not triaging accident & emergency patients;Op. cit. n. 10. rather, this was a simple clerical error in which the receptionists were filling in the documentation incorrectly.R. Ramesh, ‘Stafford’s A&E set for closure as anger grows at “crucifixion of a good hospital”’, The Guardian (London: 29 July 2013).

The Pitfalls in Interpreting the HSMR Figures

The HSMR is a method for examining the mortality of hospital patients. The method was introduced by Jarman in the mid-1990s to assess the quality of hospital care.B. Jarman, S. Gault, B. Alves, A. Hider, S. Dolan, A. Cook, et al., ‘Explaining differences in English hospital death rates using routinely collected data’, BMJ 318 (1999), 1515-20. It is the ratio of the observed number of deaths in a population to the number expected, calculated using the rates in a reference population. The way the expected number is calculated is described on the Dr Foster Unit website.Dr Foster, ‘HSMR mortality indicators’ (2010), www1.imperial.ac.uk/resources/3321CA24-A5BC-4A91-9CC9-12C74AA72FDC/hsmrmethodology26nov2010.pdf, accessed 1 September 2014. If the value of the HSMR is greater than 100, there are said to be ‘excess’ deaths in that population. If that value falls outside the 95% confidence limits, then a result at least that extreme would arise by chance less than 5% of the time if the hospital’s true mortality matched the reference population. However, the inference that there is a 95% probability that the finding of increased mortality is genuine is incorrect. The junior journalist who broke the story made this mistake,S. Walker, ‘Mid Staffs Death-Rate Debate with Journalist Shaun Lintern’ (2013), https://skwalker1964.wordpress.com/2013/02/28/mid-staffs-death-rate-debate-with-hsjs-shaun-lintern-2/, accessed 4 December 2014. as did other commentators.Personal communication. Similar misunderstandings occur in the Parliamentary Briefing Paper on mortality rates at Mid Staffs.G. Thompson, Mortality Rates at Mid-Staffordshire NHS Foundation Trust Briefing Paper (House of Commons: House of Commons Library, 2009). The HSMR has also been adopted by hospitals in the USA,Consortium of Chief Quality Officers, ‘Using Hospital Standardized Mortality Ratios for Public Reporting: A Comment by the Consortium of Chief Quality Officers’, American Journal of Medical Quality 24 (2009), 164-65. Canada,C. Brown, ‘Value of hospital standardized mortality ratio unclear, administrators say’, CMAJ 183 (2011), E23-24. and the Netherlands,B. Jarman, D. Pieter, A.A. van der Veen, R.B. Kool, P. Aylin, A. Bottle, G.P. Westert, S. Jones, ‘The hospital standardised mortality ratio: a powerful tool for Dutch hospitals to assess their quality of care?’, Qual Saf Health Care 19 (2010), 9-13. as well as being the methodology used by Dr Foster.
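
To make the arithmetic concrete, the sketch below uses invented figures rather than the Dr Foster case-mix model itself (which sums risk-adjusted predicted deaths over many diagnosis groups): the HSMR is simply 100 times observed deaths divided by expected deaths, and a ‘significant’ excess is one that would rarely arise by chance if the hospital’s underlying mortality matched the reference population.

```python
from scipy.stats import poisson

def hsmr(observed: int, expected: float) -> float:
    """Hospital Standardised Mortality Ratio: 100 * observed / expected."""
    return 100 * observed / expected

# Hypothetical figures: 1,320 observed deaths against 1,200 expected
# once case mix has been adjusted for (numbers invented for illustration).
observed, expected = 1320, 1200.0
print(f"HSMR = {hsmr(observed, expected):.0f}")   # 110

# How often would at least this many deaths occur by chance alone, modelling
# the count as Poisson with mean equal to the expected number of deaths?
p = poisson.sf(observed - 1, expected)
print(f"P(>= {observed} deaths | hospital is average) = {p:.4f}")
```

The p-value here answers ‘how surprising are these deaths if the hospital is average?’; it is not the probability that the hospital is genuinely worse than average, which is the transposition error discussed below.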

Another issue is that of multiple comparisons, a problem that is widely misunderstood. An example of this is the ‘birthday paradox’: if you take two football teams plus the referee (23 people), the chance that two of them share the same birthday is, counterintuitively, slightly more than 50%. A threshold probability of 1 in 1,000 sounds stringent, but the sheer number of comparisons being made means that the number of such chance ‘events’ within the UK NHS is considerable. The CQC threshold for mortality statistics apparently generates about 30 to 40 alerts per month.P. Taylor, ‘Standardized mortality ratios’, International Journal of Epidemiology 42 (2013), 1882-1890. The specificity at this threshold is not known, but Mohammed et al. found that the standard HSMR thresholds had only a one-in-eleven true positive rate.M.A. Mohammed, R. Lilford, G. Rudge, R. Holder, A. Stevens, ‘The findings of the Mid-Staffordshire inquiry do not uphold the use of hospital standardized mortality ratios as a screening test for “bad” hospitals’, Quarterly Journal of Medicine 106 (2013), 849-54.
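
Both points can be checked with a few lines of arithmetic. In the sketch below the birthday calculation is exact; the number of comparisons is an invented illustration, not the CQC’s actual surveillance design.

```python
from math import prod

# Birthday paradox: probability that at least two of 23 people share a birthday.
n = 23
p_all_distinct = prod((365 - i) / 365 for i in range(n))
print(f"P(shared birthday among {n} people) = {1 - p_all_distinct:.3f}")  # about 0.507

# Multiple comparisons: a 1-in-1,000 alert threshold applied to many
# hospital/diagnosis-group combinations still produces chance alerts,
# even if every hospital is performing normally.
threshold = 1 / 1000
comparisons = 150 * 140          # e.g. ~150 trusts x ~140 diagnosis groups (illustrative)
expected_chance_alerts = comparisons * threshold
print(f"Expected chance alerts per round: {expected_chance_alerts:.0f}")  # about 21
```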

The misinterpretation of HSMR statistics often stems from an error in the interpretation of conditional probability (also known as the fallacy of the transposed conditional). An example encountered in criminal trials is the prosecutor’s fallacy.C. Salmon, D. Ormerod, ‘DNA evidence – the prosecutor’s fallacy – the role of the expert – suggested direction to the jury on the random occurrence ratio’, Criminal Law Review (1997), 669-673. This is where the slim chance of a random DNA match is erroneously used to argue for guilt: if the defendant was in fact selected on the basis of a DNA match, this argument from rarity is fallacious. Put another way, the chance of any given ticket winning the lottery is 14 million to one, yet most weeks there is a winner, and we do not automatically accuse the winner of cheating because the odds are so slim.
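
The shape of this error can be made concrete with Bayes’ theorem. The figures below are illustrative assumptions, not estimates taken from the HSMR literature, but they show how an alert threshold that flags only 5% of genuinely average hospitals can still produce mostly false alarms when genuinely poor hospitals are rare, in the region of the one-in-eleven true positive rate reported by Mohammed et al.

```python
# Illustrative assumptions only (chosen to show the shape of the argument).
prevalence = 0.01            # assume 1% of hospitals genuinely have excess mortality
sensitivity = 0.50           # assume the alert catches half of those hospitals
false_positive_rate = 0.05   # 5% of genuinely average hospitals flagged by chance

p_alert = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_poor_given_alert = sensitivity * prevalence / p_alert

print(f"P(alert) = {p_alert:.3f}")
print(f"P(genuinely poor | alert) = {p_poor_given_alert:.2f}")  # about 0.09, i.e. ~1 in 11
# The 5% figure answers "how likely is this alert if the hospital is average?",
# not "how likely is the hospital to be poor, given the alert?".
```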

In all analyses, the reason for the values falling outside the limits should be considered. Poor care is only one explanation.

Other Mortality Statistics

There are other methods of assessing hospitals which are similar to the HSMR, including the Summary Hospital-level Mortality Indicator (SHMI).Health and Social Care Information Centre, ‘Summary Hospital-Level Mortality Indicator’, www.hscic.gov.uk/SHMI, accessed 10 February 2015. Each method relies on different data and calculates subtly different things,D.M. Shahian, R.E. Wolf, L.I. Iezzoni, L. Kirle, S.T. Normand, ‘Variability in the measurement of hospital-wide mortality rates’, NEJM 363 (2010), 2530-2539. so the HSMR and SHMI can give different results for the same hospital. For example, Blackpool Teaching Hospitals Foundation Trust stated that ‘it was at a disadvantage under the SHMI as it does not take into account levels of deprivation unlike other indicators’.S. Calkin, ‘Trusts blame high SHMIs on poor coding’ (2011), www.hsj.co.uk/news/acute-care/trusts-blame-high-shmis-on-poor-coding/5037205.article?sm=5037205#.VAMOz_k7um4, accessed 30 September 2014. These methods are all based on statistical models, whose accuracy can only be assured in the populations in which they have been studied. Concerns have been raised about the HSMR and the constant risk fallacy (the assumption that the risk factors used in the adjustment relate to the risk in the same way in different study populations).Jon Nicholl, ‘Case-Mix Adjustment in Non-Randomised Observational Evaluations: The Constant Risk Fallacy’, Journal of Epidemiology and Community Health 61 (2007), 1010.

Issues with HSMR

The HSMR, like all statistical analyses, is affected by the quality of the data, and all hospital admissions data are affected by the accuracy of coding. MSFT had issues with coding staff, both in terms of numbers and experience. Consequently, a substantial amount of re-coding was required before an accurate HSMR could be calculated, and this took place once a new coding manager was appointed. For example, according to the coded data, in MSFT in 2009 an elderly person who broke a hip was five times less likely to die than in other places. The Trust stated: ‘We have not always had such a low SMR [standardised mortality ratio] for fractured neck of femur. Our Clinical Coding department advise that the change is due to substantially improved coding procedures.’N. Hawkes, ‘Patient coding and the ratings game’, BMJ 340 (2010), c2153. The overall HSMR reduced from 127 to 93 in two years,I.A. Scott, C.A. Brand, G.E. Phelps, A.L. Barker, P.A. Cameron, ‘Using hospital standardised mortality ratios to assess quality of care — proceed with extreme caution’, Medical Journal of Australia 194 (2011), 645. but audits of the coding practice at Mid Staffs showed no ‘statistical gaming’. Further, the main coder for the time in question was Ms Kirkbright-Hayes – suspended by another NHS Trust for whistle-blowing on inappropriate coding to improve HSMR statistics,‘Hospital told to stop threatening whistleblower as Hunt steps in: Health secretary demands investigation into claims would be sacked’, www.dailymail.co.uk/news/article-2572556/Hospital-told-stop-threatening-whistleblower-Hunt-steps-Health-Secretary-demands-investigation-claims-sacked.html, accessed 12 October 2014. so there seems little reason to doubt her integrity.
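
The arithmetic behind that fall from 127 to 93 is worth spelling out, because the observed deaths need not change at all: the HSMR’s denominator is the model’s expected deaths, so fuller coding of comorbidities raises the expected count and lowers the ratio even if not a single additional life is saved. A sketch with invented figures:

```python
def hsmr(observed: int, expected: float) -> float:
    return 100 * observed / expected

observed_deaths = 1000    # unchanged in both scenarios (illustrative figure)

# Shallow coding: few comorbidities recorded, so the model predicts fewer deaths.
expected_with_shallow_coding = 787
# Fuller coding of the same patients: sicker case mix on paper, higher expectation.
expected_with_full_coding = 1075

print(f"{hsmr(observed_deaths, expected_with_shallow_coding):.0f}")  # about 127
print(f"{hsmr(observed_deaths, expected_with_full_coding):.0f}")     # about 93
# The number of deaths has not changed; only the denominator has.
```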

The most important effect on the HSMR is coding for palliative care. Differences in hospital practice in admitting patients for palliative care cause the HSMR to be unreliable.N. Black, ‘Assessing the quality of hospitals’, BMJ 340 (2010), c2066; C.A.K.Y. Chong, G.C. Nguyen, M.E. Wilcox, ‘Trends in Canadian hospital standardised mortality ratios and palliative care coding 2004-2010: a retrospective database analysis’, BMJ Open (2012), e001729. The frequency of this coding varies enormously between Trusts. There was a recognised shortage of hospice beds in the Stafford area, which explains the relatively high use of the palliative care code. The change in coding practice reflected the correction of previously poor coding, again due to insufficient numbers of fully trained coders.Personal communication. The palliative care code was initially being used rarely, so it was inevitable that correct coding would result in a large increase in its use. Despite inferences to the contrary, Francis found there was no evidence that the HSMR was deliberately rigged, stating:

 

It is unlikely that those working in the Trust on the issue of coding entered into a sophisticated plan to manipulate data dishonestly. It is much more likely that they were motivated by the known deficiencies in coding.Op. cit. n. 1, para. 5.232.

 

The palliative care code usage, after an initial surge, was in line with national levels.P. Taylor, ‘Rigging the death rate’, London Review of Books (11 April 2013).

Other coding errors that have a marked effect are insufficient depth of coding,M.A. Mohammed, A.J. Stevens, ‘A simple insightful approach to investigating a hospital standardised mortality ratio: An illustrative case-study’, PLoS ONE 8 (2013), e57845. and coding by admission diagnosis. Depth of coding refers to the number of co-morbidities and the amount of other information that is included. Again, it has been shown that the depth of coding at Mid Staffs was initially lower than the national average.N. Hawkes, ‘How the message from mortality figures was missed at Mid Staffs’, BMJ 346 (2013), f562. The HSMR and other methodologies rely on this information to make the necessary adjustments for the different hospital patient populations. Coding by admission diagnosis is a problem because a patient may, for example, be admitted with ‘syncope’.The medical term for a ‘transient loss of consciousness due to generalized cerebral ischaemia secondary to a global reduction in cerebral blood flow’ according to Churchill’s Medical Dictionary. There is a long list of causes of syncope, ranging from a simple faint to a life-threatening emergency such as a heart attack. Any resulting morbidity or mortality will be related to the underlying cause, rather than to the presenting symptom of syncope.

HSMRs cannot be used to calculate unnecessary deaths. Jarman himself emphasises this fact, but claims that ‘excess’ mortality is an indication to look more closely at a hospital for an explanation; even this role is disputed.Op. cit. n. 38; Fullfact.org, ‘How many people died “unnecessarily” at Mid Staffs?’ (2013), https://fullfact.org/factchecks/francis_many_deaths_unnecessarily_at_mid_staffs-28805, accessed 24 December 2014. Francis stated that:

 

‘it would be unsafe to infer from the figures that there was any particular number or range of numbers of avoidable or unnecessary deaths at the Trust.’Op. cit. n. 3.

 

This implies that he considers that there may have been no genuine increase in mortality at Mid Staffs. Unfortunately, many commentators appeared to have looked at the page containing the HSMR figures and not at any of the accompanying caveats and notes on interpretation. Thus an unquantifiable number of deaths due to poor care became a definite number of ‘unnecessary deaths’, ‘killings’, ‘state-assisted manslaughters’Media doctor Dr Phil Hammond commented about 1200 cases of state-assisted manslaughter on Twitter; he claimed the comment was satirical. or even ‘murders’.Comments on Twitter from various sources.

Ms Leslie MP, a member of the Health Select Committee, quoted the HSMR figures as 1,200 deaths, although other Parliamentarians understood that the figures should not be used in this way.Hansard, ‘Backbench debates March 5th 2013’ (2013), www.publications.parliament.uk/pa/cm201213/cmselect/cmbackben/130305/130305.htm, accessed 24 December 2014, Q15. In the period concerned, HSMR figures were not given much credence, and so would not necessarily have been seen as a trigger for investigation, as Mr Bradshaw MP stated to the Health Select Committee in 2009:

 

These are questions that you may also want to put to the Care Quality Commission if you have not already, but in the example of Mid-Staffordshire what alerted the Healthcare Commission was not just the high HSMRs because I think everybody accepts that HSMRs in isolation are not enough to tell you that there is a problem. That is one of the reasons that they have not been used in a way that we have now decided to use them and publicise them because they can be skewed for particular reasons. However, in combination with other alerts the system is becoming ever more sophisticated. It was the combination of the level of patient complaints, the level of patient complaints upheld and the staff survey and more that finally caused the Healthcare Commission to begin asking searching questions.Examination of witnesses (questions 1040-1059), www.publications.parliament.uk/pa/cm200809/cmselect/cmhealth/151/9060307.htm, accessed 24 December 2015, Q1042.

 

The Laker case note review found that there was at worst one unnecessary death.There were individual cases of negligence noted at inquests, notably the deaths of Gillian Astbury and John Moore-Robinson. Otherwise, there are no confirmed deaths due to poor care. Even with the failings of a case note review, this is considered more accurate than any statistical method in determining unnecessary deaths (although it only covered about 200 cases). Keogh acknowledged this, and commissioned a review by Professors Black and Darzi into the relationship between ‘excess mortality rates’ and actual ‘avoidable deaths’. This involved conducting retrospective case note reviews on a substantial random sample of in-hospital deaths from Trusts with lower than expected, as expected and higher than expected mortality rates.B. Keogh, Review into the quality of care and treatment provided by 14 hospital trusts in England: Overview report (London: NHS England, 2013), 7. That review found only a small, statistically non-significant association between HSMR and the proportion of avoidable deaths. The same was true for the SHMI.H. Hogan, R. Zipfel, J. Neuberger, A. Hutchings, A. Darzi, N. Black, ‘Avoidability of hospital deaths and association with hospital-wide mortality ratios: retrospective case record review and regression analysis’, BMJ 351 (2015), h3239.

The recent announcement of an annual case note review of 2,000 deaths in the NHS also signals a move away from statistical methods for detecting avoidable deaths.L. Sabin, ‘Health Secretary Jeremy Hunt orders annual review of “avoidable deaths” in NHS hospitals’, The Independent on Sunday (London: 8 February 2015). The HSMR has been used to construct ‘league tables’ of hospitals in the past, and the results were made available to the public.D. Campbell, A. Asthana, ‘Exposed: The hospitals whose high death rates are failing the NHS’, Observer (London: 27 November 2010). Dr Foster still advertises that ‘Your mortality rate is your pulse (keep your finger on it)’.Dr Foster Ltd, Dr Foster hospital guide 2013 (London: Dr Foster Intelligence, 2013). However, at that time the Strategic Health Authority (SHA) did not consider them valuable intelligence about healthcare quality.M.A. Mohammed, J.J. Deeks, A. Girling, G. Rudge, M. Carmalt, A.J. Stevens, et al., ‘Evidence of methodological bias in hospital standardised mortality ratios’, BMJ 338 (2009), b780. There have been a number of articles in the press and academic literature explaining the limitations of the HSMR and SHMI methodologies for assessing the performance of hospitals. The Guardian led the way after blogger Steve Walker raised the issue.S. Walker, ‘The Real Mid Staffs Story: One “Excess” Death, if That’ (2013), http://skwalker1964.wordpress.com/2013/02/26/the-real-mid-staffs-story-one-excess-death-if-that/, accessed 1 September 2014. Dr Foster Intelligence has a huge commercial interest in proving the value of HSMR. There is big business in improving mortality statistics; the ethics of particular companies’ methods are debated, usually by competing concerns.B. Jarman, ‘Witness statement’ (2014), www.midstaffspublicinquiry.com/sites/default/files/evidence/Brian_Jarman_-_witness_statement.pdf, accessed 12 January 2015, paras 82 & 83.

Academic Criticism of the HSMR

The Keogh report into 14 NHS hospitals selected on the basis of either a raised HSMR or raised SHMI revealed that all had problems with patient care (attributed to poor staffing levels), but Keogh stated that:

 

However tempting it may be, it is clinically meaningless and academically reckless to use such statistical measures to quantify actual numbers of avoidable deaths.Op. cit. n. 48.

 

Further, he stated that:

 

This review has shown the continuing challenge hospitals are facing around the use and interpretation of aggregate mortality statistics. The significant impact that coding practice can have on these statistical measures, where excess death rates can rise or fall without any change in the number of lives saved, is sometimes distracting boards from the very practical steps that can be taken to reduce genuinely avoidable deaths in our hospitals.Ibid.

 

Spiegelhalter in the BMJ described the figure of 13,000 ‘unnecessary deaths’ reported in advance by the TelegraphL. Donnelly, P. Sawer, ‘13,000 died needlessly at 14 worst NHS trusts’, Daily Telegraph (London: 13 July 2013). and the ‘1200’ at MSFT as potential ‘zombie statistics’ that ‘will not die in spite of repeated demolition’.D. Spiegelhalter, ‘Have there been 13 000 needless deaths at 14 NHS trusts?’, BMJ 347 (2013), f4893.

The Academy of Medical Royal Colleges Report on International HSMRs concluded that no firm conclusion could be drawn that hospital care in the UK was significantly inferior to that in the USA:

 

we have no measure of the uncertainty attached to the estimate of 45%. On a simplistic level, it is quite accurate because it is based on large numbers, but uncertainty in almost all the key assumptions used in its derivation mean that we cannot have much credence that this estimate is even close to the actual value.

 

A difference of 45% was not considered proof of a difference in hospital care,Academy Working Group, International HSMRs: Academy Working Group Report to Professor Sir Bruce Keogh (London: Academy of Medical Royal Colleges, 2015). such were the difficulties in assessing the effect on HSMR of factors such as diagnostic and coding differences between the two countries.

The Role of HSMR and other Mortality Statistics in the Evaluation of Medical Care

The Department of Health stated that ‘a high HSMR is a trigger to ask hard questions’ rather than evidence of poor performance per se.B. Jarman, B. Edwards, ‘Slaying the myths: A layman’s guide to mortality rates’, www.nhsmanagers.net/guest-editorials/slaying-the-myths-a-laymans-guide-to-mortality-rates/, accessed 14 October 2014. Even this role as a trigger is disputed.Op. cit. n. 38. The problems that have been found in many hospitals with a high HSMR do not provide evidence of the specificity or sensitivity of HSMR-generated alerts without comparable inspections of hospitals with an average or low HSMR.Op. cit. n. 38. Hospital-wide mortality statistics, such as the SHMI and the HSMR, have been advocated as techniques for reporting clinical outcomes, but such an approach has long been recognised as problematic. So if the HSMR or SHMI figures by themselves are not sufficient evidence of a system failure, what other indicators might have told us that MSFT was failing? We have the numerous complaints with specific details of poor care and neglect. We have a number of reports that looked at the hospital, including the independent case note review where specific cases of concern were reviewed. Although poor care is likely to contribute to increased mortality, if nursing staff were prioritising tasks that were medically important (as opposed to important for dignity and comfort), then mortality may not have been significantly affected.

Mortality rates for specific diagnoses or procedures will be more specific, although, because of the smaller numbers involved, they will be more prone to chance variation. Systems for looking at an entire hospital’s mortality rates from all causes have not yet achieved the same level of reliability. Any calculated excess mortality cannot be assumed to reflect a genuine increase in mortality. Additionally, it has been argued that mortality is a poor indicator of care, as many illnesses and procedures result in very low mortality. This means that the HSMR and other methods have a low signal-to-noise ratio, and so mortality statistics, however corrected, are unlikely to be a good measure of the quality of care.Op. cit. n. 32; R. Lilford, P. Pronovost, ‘Using hospital mortality rates to judge hospital performance: A bad idea that just won’t go away’, BMJ 340 (2010), c2016; D.W. Pitches, M.A. Mohammed, R.J. Lilford, ‘What is the empirical evidence that hospitals with higher-risk adjusted mortality rates provide poorer quality care? A systematic review of the literature’, BMC Health Services Research 7 (2007), 91. Also, most deaths do not reflect poor-quality care. The HSMR has low criterion validity; that is, there is a weak association between the HSMR and other quality of care indicators. Other issues have also been highlighted, including risk adjustment, small sample sizes leading to imprecision, and changes over time.Op. cit. n. 32; R.B. Penfold, S. Dean, W. Flemons, M. Moffatt, ‘Do hospital standardized mortality ratios measure patient safety? HSMRs in the Winnipeg regional health authority’, Healthcare Papers 8 (2008), 8-25.
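
A rough calculation illustrates why the signal is so weak. The Hogan et al. review cited above found that only a few per cent of hospital deaths were judged avoidable on case note review; the exact figures below are otherwise invented for illustration.

```python
# Illustrative figures only; the avoidable fraction is of the order reported
# by case note review studies, the remaining numbers are invented.
total_deaths = 1000
avoidable_fraction = 0.04                               # ~4% of deaths judged avoidable
typical_avoidable = total_deaths * avoidable_fraction   # 40 deaths

# Suppose one hospital somehow has 50% more avoidable deaths than is typical.
extra_deaths = 0.5 * typical_avoidable                  # 20 extra deaths

# Against roughly 1,000 expected deaths, that shifts a hospital-wide mortality
# ratio by only about 2 points, far less than the 34-point swing that coding
# changes alone produced at Mid Staffs (127 to 93).
shift = 100 * extra_deaths / total_deaths
print(f"Shift in hospital-wide mortality ratio: about {shift:.0f} points")
```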

The Probative Value of HSMR

Thus, in a clinical negligence case where the surgical unit concerned is a genuine outlier on adjusted mortality statistics, this may be useful evidence; in the case of a hospital’s HSMR statistics, the figures cannot be said to have sufficient probative value. The calculated excess mortality rates at MSFT were only 11% higher than the average. This is without considering the difficulties in ascertaining the underlying cause of the increased mortality, even if it is a genuine phenomenon. These figures suggest that whatever method of calculating hospital mortality is used, it will not be helpful in proving clinical negligence in individual cases. The results may be statistically significant, but this does not necessarily make them legally significant, given the failure of ‘loss of a chance’ cases in the English courts.Gregg v. Scott [2005] UKHL 2, [2005] 2 AC 176. When Boseley stated in The Guardian that

 

They [relatives] and their lawyers will want to know the reasons for the estimated 544 excess deaths between April 2010 and April 2012,S. Boseley, ‘Keogh’s review of mortality rates in the NHS is a blueprint, not a red alert’, The Guardian (London: 15 July 2013).

 

she demonstrated a fundamental misunderstanding. This is exacerbated by the ill-chosen phrase ‘unexpected deaths’ used in the same article. These phrases imply a level of causation that can only be proven by examination of individual cases.

Would the use of HSMR figures in clinical negligence litigation satisfy the Bonython criteria?R v. Bonython (1984) 38 SASR 45. Hospital mortality statistics require specialist knowledge, and the field is a recognised area of scientific knowledge with suitably qualified experts to opine on it. So this satisfies the relatively liberal admissibility criteria of English law. Whether the HSMR per se satisfies the Bonython criteria for the purpose of comparing either mortality rates or quality of care is arguable. There is no general acceptance of the HSMR as a reliable measure for inter-hospital comparisons. Professor Black appeared on BBC Radio 4’s File on 4 programme before his review of hospital-wide mortality ratios had been conducted, where he concluded that, based on what he already knew, HSMRs ‘should be ignored’. He said they could not entirely take into account factors such as burden of illness and were skewed by other factors such as the availability of hospice care in the area – where there is less hospice care, patients are more likely to be in hospital when they die.

 

I don’t think there’s any value in the publication of HSMR and I’d go further, I think it’s actually a distraction because it gives ... a misleading idea of the quality of care of a hospital.

 

When asked what the public should make of media coverage of death rates, he added: ‘Personally, I would suggest that the public ignore them.’N. Triggle, ‘NHS death rates “should be ignored”’ (25 February 2014), www.bbc.co.uk/news/health-26329750, accessed 16 April 2014. He even dismissed the suggestion that HSMRs can act as a ‘smoke alarm’, as most studies have found there is no correlation between HSMR and avoidable deaths. The retrospective review confirmed this evaluation. On this basis, it can be strongly argued that the probative value of the HSMR and other hospital-wide mortality ratios does not outweigh their prejudicial effect.

Conclusions

We would argue that the popular image of the ‘Mid Staffs disaster’ (as campaigners have described it) is a fiction created by the application of an unreliable statistic. Avoidable deaths can only be established by a review of individual cases, and excess deaths cannot be established reliably by any current statistical methodology. Even the development of validated excess mortality measures would not by itself provide a useful method of evaluating the quality of healthcare, and bona fide findings of a raised mortality rate would be unlikely to provide meaningful evidence in individual cases. Even the more cautious claim that adjusted mortality rates can act as a ‘smoke alarm’ seems unfounded.

In terms of the mortality rates, Mid Staffs was a ‘disaster by numbers’ produced by reckless reporting of a misinterpreted statistic. The report of Black and Darzi demonstrates that there is no significant association between HSMR and avoidable deaths,Op. cit. n. 49. and therefore the current methods cannot be used to support medical negligence claims. The focus on mortality should not override other measures of care, particularly those that address issues such as dignity and comfort.


* DOI 10.7590/221354016X14589134993974
