Eur J Vasc Endovasc Surg (2015) 49, 277–282. http://dx.doi.org/10.1016/j.ejvs.2014.12.015

Understanding Administrative Abdominal Aortic Aneurysm Mortality Data

K. Hussey a,*, T. Siddiqui a, P. Burton b, G.H. Welch a, W.P. Stuart a

a Department of Vascular Surgery, Western Infirmary, Glasgow, UK
b Health Information Services Department, Glasgow, UK

* Corresponding author. K. Hussey, Department of Vascular Surgery, Western Infirmary, Glasgow G11 6NT, UK. E-mail address: [email protected]

WHAT THIS PAPER ADDS
This manuscript explores the quality of a national administrative data set within a single health board in Scotland. Limiting the process to a single region has allowed interrogation of all patient health records.

Objective: Administrative data in the form of Hospital Episode Statistics (HES) and the Scottish Morbidity Record (SMR) have been used to describe surgical activity. These data have also been used to compare outcomes from different hospitals and regions, and to corroborate data submitted to national audits and registries. The aim of this observational study was to examine the completeness and accuracy of administrative data relating to abdominal aortic aneurysm (AAA) repair.
Methods: Administrative data (SMR-01 returns) from a single health board relating to AAA repair were requested (September 2007 to August 2012). A complete list of validated procedures, termed the reference data set, was compiled from all available sources (clinical and administrative). For each patient episode, electronic health records were scrutinised to confirm urgency of admission, diagnosis, and operative repair. The 30-day mortality was recorded. The reference data set was used to systematically validate the SMR-01 returns.
Results: The reference data set contained 608 verified procedures. SMR-01 returns identified 2433 episodes of care (1724 patients) in which a discharge diagnosis included AAA. This included 574 operative repairs. There were 34 cases (5.6%) missing from the SMR-01 returns; nine of these patients died within 30 days of the index procedure. Omission of these cases made a statistically significant improvement to perceived 30-day mortality (p < .05, chi-square test). If inconsistent SMR-01 data (in terms of ICD-10 and OPCS-4 codes) were excluded, only 81.9% of operative repairs were correctly identified and only 30.9% of deaths were captured.
Discussion: The SMR-01 returns contain multiple errors. There also appears to be a systematic bias that reduces apparent 30-day mortality. Using these data alone to describe or compare activity or outcomes must be done with caution.
© 2014 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Article history: Received 31 July 2014, Accepted 10 December 2014, Available online 23 January 2015
Keywords: Administrative data, Aneurysm, Clinical outcome

INTRODUCTION
Administrative data are gathered for the primary purposes of capacity planning, commissioning services, and, ultimately, remuneration. Data collection is also part of clinical work and can be used to populate specialty and procedural registries. Both clinical and administrative data have been used to identify variation in process and outcome for defined conditions or interventions. Several authors have explored the completeness of clinical and administrative data sets1,2 and have indicated a high level of agreement, concluding that data collected by the administrative team could provide a means of validating data gathered by clinicians. Furthermore, it has been suggested that

administrative data could be used to directly measure clinical performance at hospital and clinician level. However, others have described weaknesses of administrative data sets.3–8 Powerful voices have asserted that clinician engagement in the process of data gathering and governance is imperative.9 However, the precise role and, perhaps more importantly, the extent of clinician involvement and influence in the process have yet to be fully defined. Key issues relate to resource and the implications of using medical time to collect or verify data, but also to the lack of independence of data gathered by an individual or group with a large stake in the reported outcomes. Any failure to identify a group of cases with adverse outcomes, through systemic flaw or bias, would have real effects on reported performance. If the generation of robust data is to be the responsibility of the clinician, it is important to understand methods of data collection and intrinsic weaknesses of the process within both clinical and administrative systems. An


advantage of the NHS is that it remains a largely closed system of care. As such, it would seem appropriate to make full use of this to standardise and optimise data collection and analysis.

The National Vascular Registry (NVR) is part of a quality improvement framework introduced with the aim of reducing mortality following elective surgery for abdominal aortic aneurysm (AAA). Vascular surgeons are encouraged to enter clinical data on patients undergoing operative repair through a secure web-based data collection system. These clinical data are then linked to national administrative data to define health board contribution to the registry.10 However, concerns remain about data quality and administrative coding, a process that is not subject to external audit.11

National administrative data in Scotland are derived from the Scottish Morbidity Record (SMR), with episodes generated for each hospital admission (similar to Hospital Episode Statistics [HES] data in England and Wales). Acute services activity is recorded and reported as SMR-01 data. As with HES, SMR-01 returns are populated using diagnostic (ICD-10) and treatment (OPCS-4) codes derived from discharge summaries. Each episode contains fields for up to six diagnostic and four procedural codes, with relevant dates for the episode of care and the intervention. Hospital episode data are collated at health board level and submitted electronically to the Information and Statistical Division (ISD) of NHS Scotland.

Diagnostic (ICD-10) and operative (OPCS-4) coding of aortic conditions is complex; diagnostic codes discriminate between repair of AAA for "rupture" and repair "without mention of rupture," but the operative coding is subtly different, separating cases into emergency and non-emergency categories. This allows identification of an elective repair of an intact AAA, an emergency repair of a ruptured AAA, and an emergency procedure for a non-ruptured aneurysm (symptomatic AAA), but it also creates the potential for conflict of codes, with a "ruptured" aneurysm being repaired as an elective procedure (possible, but rare); a simple illustration of such a consistency check is sketched below. Also recorded within the administrative data is the hospital-assigned urgency code (scheduled or emergency) for admission.

The primary aim of the present work was to ascertain the completeness and accuracy of national administrative data relating to AAA repair within a single health board. The purpose was to assess the reliability of any measure of outcome derived from these administrative data in an unmodified form.
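As an aside on how such a consistency check might look in practice, the following minimal sketch classifies a single episode from its codes and flags the conflicts described above. It is not taken from the paper: the ICD-10 values are the standard AAA diagnosis codes, but the OPCS-4 prefixes and all field names are illustrative assumptions.

```python
# Illustrative sketch only: classify one SMR-01 episode from its codes and flag
# the kinds of inconsistency described in the text. The ICD-10 codes are the
# standard AAA diagnoses; the OPCS-4 code prefixes and field names are assumed
# for illustration and would need checking against the real coding scheme.

RUPTURED_DX = {"I71.3"}            # AAA, ruptured
INTACT_DX = {"I71.4"}              # AAA, without mention of rupture
EMERGENCY_REPAIR_PREFIX = "L18"    # assumed: emergency repair of aneurysmal aorta
ELECTIVE_REPAIR_PREFIX = "L19"     # assumed: non-emergency repair of aneurysmal aorta


def classify_episode(dx_codes, op_codes, admission_urgency):
    """Return (category, conflict) for one episode.

    dx_codes          : up to six ICD-10 codes
    op_codes          : up to four OPCS-4 codes
    admission_urgency : hospital-assigned field, "scheduled" or "emergency"
    """
    ruptured = any(c in RUPTURED_DX for c in dx_codes)
    emergency_op = any(c.startswith(EMERGENCY_REPAIR_PREFIX) for c in op_codes)
    elective_op = any(c.startswith(ELECTIVE_REPAIR_PREFIX) for c in op_codes)

    if ruptured and emergency_op:
        category = "emergency repair of ruptured AAA"
    elif emergency_op:
        category = "emergency repair of non-ruptured (symptomatic) AAA"
    elif elective_op and not ruptured:
        category = "elective repair of intact AAA"
    else:
        category = "conflicting combination (e.g. ruptured AAA with elective repair code)"

    # One possible reading of "inconsistent" coding: diagnosis conflicts with the
    # operative code, or either conflicts with the admission urgency field.
    conflict = (
        (ruptured and elective_op)
        or (admission_urgency == "scheduled" and (ruptured or emergency_op))
        or (admission_urgency == "emergency" and elective_op)
    )
    return category, conflict


# Example: an episode coded as a ruptured AAA but admitted as a scheduled case.
print(classify_episode(["I71.3"], ["L18.5"], "scheduled"))
```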

METHOD
Permission to collate, store, and examine patient identifiable data was obtained from the Caldicott Guardian (September 2007 to August 2012 inclusive). The Community Health Index (CHI) number (a unique patient identifier used throughout Scotland, derived from the patient's date of birth) was used to access electronic patient health records. Indications, dates of intervention, and precise procedures were validated from case records. Thirty-day mortality was calculated.

Greater Glasgow and Clyde Health Board administrative data, in the form of SMR-01 returns (a centrally held national data set, the HES equivalent), were extracted for all hospital episodes featuring diagnostic or procedural codes relating to infra-renal AAA in any of the data positions. Mortality for this group was sourced from National Records of Scotland (a population registry), which can be linked to SMR-01 returns using the CHI number. Each entry in the SMR-01 data set was examined individually and each AAA repair procedure verified.

To examine the completeness and accuracy of the SMR-01 returns, a Reference data set was derived from all available sources. Clinical data were collected from theatre registers, secretarial diaries, and personal logbooks. It was not possible to gain access to theatre activity data held electronically for the period of interest, as the system had been upgraded and older files were no longer accessible; this search was therefore performed manually. The final Reference data set was a composite of SMR-01 returns and clinical data. Each case was examined and assigned to a category for indication (asymptomatic, symptomatic, or ruptured) and another for repair procedure (open or endovascular).

The first search was for cases that were recorded in both the SMR-01 returns and the Reference data set. The number of procedure-related episodes miscoded, but still present in the SMR-01 returns in some form, was established, as was the number of procedures entirely omitted. The data were then interrogated to establish how the SMR-01 returns appeared at face value, using the recorded ICD-10 and OPCS-4 codes. This analysis was repeated, excluding cases in which the recorded urgency category of the index episode, assigned by the coding process (elective or emergency), conflicted with either the ICD-10 or OPCS-4 codes. Thirty-day mortality was recorded for each search. These data were compared with the validated outcomes within the Reference data set. This process is summarised in Fig. 1.
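A rough sketch of this reconciliation logic is given below. The record structure and field names are assumptions rather than the authors' implementation, but it shows the three groups of interest (matched, miscoded but present, and omitted) and how 30-day mortality would be summarised for each.

```python
# Sketch of the reconciliation between the Reference data set and the SMR-01
# returns. Records are assumed to be keyed by (CHI number, procedure date);
# field names ("category", "died_30d") are illustrative.

def reconcile(reference, smr01):
    """reference, smr01 : dict mapping (chi, procedure_date) -> record."""
    matched = [k for k in reference if k in smr01]
    omitted = [k for k in reference if k not in smr01]      # absent from SMR-01 entirely
    miscoded = [k for k in matched
                if smr01[k]["category"] != reference[k]["category"]]  # present but wrongly coded

    def thirty_day_mortality(keys):
        deaths = sum(reference[k]["died_30d"] for k in keys)
        return deaths, len(keys), (100.0 * deaths / len(keys)) if keys else 0.0

    return {
        "matched (deaths, n, %)": thirty_day_mortality(matched),
        "omitted (deaths, n, %)": thirty_day_mortality(omitted),
        "miscoded but present (n)": len(miscoded),
    }
```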

RESULTS
Reference data set: validated primary AAA intervention (patients, procedures, indications, and outcomes)
There were 608 procedures for AAA identified from all available sources. Of these, 497 (81.5%) were performed in male patients and 261 (42.7%) were endovascular interventions (Table 1). There were two patients, recorded in a theatre register by name only, who appeared to have had AAA repair but for whom no further record of progress or outcome could be identified. These were excluded from further analysis.

Clinical data set: data derived from all known clinical sources
There were 499 procedures identified (Table 1). There were 54 deaths within 30 days of the procedure in this cohort. (The two inadequately identified cases mentioned above appeared in this subgroup and were excluded from subsequent analysis.)


Figure 1. Process of data analysis. [Flowchart: the clinical data set (theatre registers, secretarial diaries, and personal logbooks; n = 500 confirmed cases) and the SMR-01 returns (ISD data; 2433 episodes of care for 1724 patients; n = 574 confirmed cases) were combined into the Reference data set of all procedures performed for AAA (n = 608). The SMR-01 returns were then re-examined after excluding cases with inconsistent diagnostic (ICD-10) and operative (OPCS-4) codes, and again after additionally excluding cases with an inconsistent hospital-assigned urgency of admission; see Tables 1 and 2.]

Table 1. Comparison of recorded outcome by data set (SMR-01 returns taken at face value). Values are number of procedures (deaths within 30 days, n, %).

Intervention, indication   | Reference data set | SMR-01 returns   | Clinical data set
Elective, open repair      | 148 (6, 4%)        | 139 (5, 3.5%)    | 130 (6, 4.6%)
Symptomatic, open repair   | 51 (6, 11.8%)      | 50 (6, 12%)      | 41 (4, 9.8%)
Rupture, open repair       | 148 (53, 35.8%)    | 136 (45, 33%)    | 117 (40, 34.2%)
Elective, EVAR             | 249 (3, 1.2%)      | 237 (3, 1.2%)    | 201 (3, 1.5%)
Symptomatic, EVAR          | 9 (0)              | 9 (0)            | 7 (0)
Rupture, EVAR              | 3 (1, 33%)         | 3 (1, 33%)       | 3 (1, 33%)
Total                      | 608 (69, 11.2%)    | 574 (60, 10.5%)  | 499 (54, 10.8%)

SMR-01 returns: analysis of procedures and outcome using administrative data
Interrogating locally held data using the described codes yielded 2433 episodes of care (1724 patients) for which infra-renal AAA was mentioned in a diagnostic or procedural field. Within this group there were 574 episodes for which primary AAA repair could be verified (Table 1). Sixty (10.5%) of the 574 cases in both the Reference data set and the SMR-01 returns died within 30 days. There were 34 cases not recorded in the SMR-01 returns; nine of these patients died within 30 days. Omission of these cases made a statistically significant improvement to perceived 30-day mortality in the SMR-01 returns (p < .05, chi-square test).

When SMR-01 returns were interrogated excluding cases where there was inconsistent ICD-10 and OPCS-4 coding, only 81.9% of cases were identified and only 30.9% of the deaths were captured (Table 2). The majority of elective open repairs (97.3%) and EVAR (95.2%) were identified, but only half (3/6) of the early deaths among the elective open cases would have been identified without case-by-case scrutiny. If the search was then performed with the additional exclusion of cases where there was conflict between the hospital-assigned urgency of admission and the ICD-10 and OPCS-4 coding, fewer cases were accurately identified (Table 2).
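The headline significance test can be reproduced from the counts reported above under one plausible construction of the 2 x 2 table (captured versus omitted cases, died versus survived within 30 days); the paper itself reports only "p < .05, chi-square test", so the exact table used is an assumption.

```python
# Sketch of the chi-square test using the counts reported in the text; the
# exact contingency table used by the authors is not stated, so captured vs
# omitted cases (died vs survived within 30 days) is an assumed construction.
from scipy.stats import chi2_contingency

captured = {"deaths": 60, "n": 574}   # cases present in the SMR-01 returns
omitted = {"deaths": 9, "n": 34}      # cases missing from the SMR-01 returns

table = [
    [captured["deaths"], captured["n"] - captured["deaths"]],
    [omitted["deaths"], omitted["n"] - omitted["deaths"]],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")  # p < .05, consistent with the reported result
```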





Table 2. SMR-01 returns described according to coding consistency and compared with the Reference data set. Values are number of procedures (deaths within 30 days, n, %).

Intervention, indication   | Reference data set | SMR-01 returns, consistent ICD/OPCS codes | SMR-01 returns, consistent urgency/ICD/OPCS codes
Elective, open repair      | 148 (6, 4%)        | 144 (3, 2%)       | 118 (2, 1.7%)
Symptomatic, open repair   | 51 (6, 11.8%)      | 61 (6, 9.8%)      | 36 (5, 13.8%)
Rupture, open repair       | 148 (53, 35.8%)    | 70 (9, 12.9%)     | 59 (6, 10.2%)
Elective, EVAR             | 249 (3, 1.2%)      | 223 (3, 1.3%)     | 205 (3, 1.4%)
Symptomatic, EVAR          | 9 (0)              | 0                 | 0
Rupture, EVAR              | 3 (1, 33%)         | 0                 | 0
Total                      | 608 (69, 11.2%)    | 498 (21, 4.2%)    | 418 (16, 3.8%)


DISCUSSION
The present data describe outcomes following AAA repair performed in a single Scottish health board. Outcomes appear comparable with trial and registry data.12–14 This conclusion could be drawn independently from both the Reference data set (validated data) and also from the SMR-01 returns (taken at face value). However, scrutiny of the unmodified administrative data against a verified data set reveals a number of cases recorded incorrectly or inadequately, with some episodes completely omitted. These incorrectly recorded or missing episodes are more likely to have ended in an adverse outcome. However, as the total numbers of episodes generated by interrogation of the administrative data are so close to actuality, a legitimate assumption of accuracy could be made.

HES generate national data sets containing large numbers of episodes and interventions from which researchers and auditors have drawn conclusions about care models, standards, and performance.1,2,7,8 It is therefore important to identify and, if possible, correct system errors and intrinsic sources of bias. Several authors have suggested satisfactory levels of agreement between administrative data and contemporaneous clinical data, concluding that it is appropriate to use these data for cross-validation purposes, thus consolidating this practice of dual data collection.1,15–18

Aylin and colleagues made a direct comparison between HES data and databases of various professional bodies (representing vascular, cardiothoracic, and colorectal surgery).15 The search of HES data was based on procedure codes (OPCS), linking episodes identified with specific admissions in the clinical data sets. Diagnostic codes were not used. Cases were excluded from analysis if data errors were found, notably in mode of admission (scheduled versus unscheduled care). It was concluded that HES data were comparable with clinical data on a variety of measures and could be used for monitoring healthcare performance. The data presented here would suggest that a model of scrutiny based on consistency of data will miss cases and accept systematic errors, thus reducing the validity of the derived conclusions.

After examining HES data relating to 20,290 episodes of AAA repair, Johal and colleagues reported that in 94.9% of episodes there was agreement between the diagnostic and procedural codes for elective procedures. However, 9.8% of patients classified as undergoing elective repair on the basis of their ICD and OPCS codes had been assigned an inappropriate urgency of admission category by the hospital.2 In 6.7% of cases recorded as having had an emergency repair by ICD and OPCS codes, the HES urgency code was in disagreement.

Holt and colleagues made a direct comparison between HES data and a reference standard data set gathered from case notes and theatre records of several hospitals performing elective AAA repair. There was no significant difference between the mortality rates derived from the HES data and the reference data set gathered from local records within the participating hospitals (5.3% versus 5.0%, respectively). Data for open repair and EVAR were not separated. Of note, 147 cases of elective repair were identified in local hospital records that were not recorded within the HES data set. No case records were found for 69 cases recorded in HES (7% of the sample), and in another 69 cases it was not possible to link a HES episode with a procedure found in the local data set. The summative effect of these conflicts and omissions was to give the appearance


of close correlation between the numbers of cases in the two data sets (967 versus 962 episodes).1 It is possible that these missing cases could have contained a disproportionate number of adverse outcomes.

Errors identified within clinical and administrative data could perhaps be considered to take three forms. Firstly, there are random transcription errors, particularly relating to binary data. These could easily cancel each other out, adding and excluding cases more or less equally, leading to an apparent agreement between the numbers of procedures described in administrative and clinical data sets, which may explain the relatively low rates of data conflict reported by some observers.2 Secondly, there may be errors, perhaps frequently repeated, caused by common misunderstandings or misclassifications of a clinical diagnosis or procedure: for example, confusion between open and endovascular repairs, or between symptomatic non-ruptured AAA and ruptured AAA (both of which are described within OPCS-4 as "emergency repair of AAA"). These errors could be reduced if coding were based on discharge summaries written by appropriately experienced medical staff. However, a reliance on the discharge process may itself be a weakness, as there is an inevitable error rate within these documents.3,18–20 Finally, deviation from the standard care pathway caused by adverse events may itself make these cases less likely to be captured within administrative data sets; the potential for bias is clear. Some patients may have died during the immediate perioperative period, or on the operating table, and as a result some documentation, particularly the discharge summary, is incorrectly processed. Other patients may have died under the care of an acute care team other than the vascular service, for instance intensive care or cardiology, and the subsequent discharge summary may not have specified the key procedure, namely repair of AAA.21 This may account for the significant proportion of patients operated on for ruptured AAA appearing in SMR-01 returns without a procedure code. Interpretation of these data at face value would lead to the incorrect conclusion that these patients were admitted with ruptured AAA and that treatment was palliative. If the patient's notes do not come to a surgeon for a discharge summary, the administrative and clinical data sets will be in agreement, leading to a false impression of accuracy. The same circumstances that lead to errors in an administrative data set may also mean that cases with adverse outcomes are not documented within a clinical data set.

Giving clinicians complete responsibility for the data presented to the public may be a double-edged sword. Randomised controlled trials are designed to make careful note of patient exclusions and have pre-defined, structured follow-up protocols. Self-reported data might lack such vigilant oversight.22 "Gaming" of outcomes is rarely mentioned explicitly, but Pouw and colleagues from Utrecht used this expression in a paper describing deaths occurring after discharge from hospital.23 Building mechanisms into the audit process to reduce the possibility, and even the risk of the accusation, of gaming seems appropriate. Adverse


events in the community can easily become "missing data," as can events occurring following a transfer of care within a single continuous period of in-patient care.

NHS Scotland (like many other national healthcare providers) generates a unique identifier for each individual.24 The CHI number is nationally recognised and offers the possibility of gathering data prospectively, and retrospectively, on every episode of care. From the starting point of a complete and accurate list of specified episodes or events, outcome data can be obtained by a process of case linkage (a simple linkage of this kind is sketched at the end of this discussion).25 Elective procedures in general, and prophylactic procedures in particular, such as AAA repair and carotid intervention for stroke prevention, seem ideally suited to such a process.

Publication of individual performance data for surgeons raises a number of issues. It is clear that the small number of AAA interventions performed by any individual surgeon means that identification of outlying performance is unlikely, even in a 3-year cycle.26,27 As well as reflecting what may be a better way of working, analysis of the larger numbers of cases at team or hospital level may deliver more meaningful data. The NVR analyses outcome at health board level, combining clinically validated data (entered by vascular clinicians) with administrative data returns. However, the omission of even a small number of cases with adverse outcomes will devalue the process, offering false reassurance to both healthcare providers and the public.

Another tactic to improve the value of procedure-based audit might be to examine mortality at 1 year. This would not only increase the number of deaths in the population of interest, but also give an indication of appropriate case selection for patients undergoing prophylactic procedures. A reduced number of variables may also encourage participation and reduce cost, without necessarily diminishing the potential for meaningful risk stratification.28

The current system of dual data collection has been described as "anachronistic."29 A single approach to data collection, from which accurate clinical and administrative performance can be derived, may be achieved by structuring data capture from a clearly defined point of care, for example the point of intervention. It seems reasonable to verify the description of the procedure at the end of every case to provide a corroborative stream of data for procedure-based audit, a process already advocated by the World Health Organization.30 This concept is not new, nor is the concept of case linkage to provide follow-up data.31,32 However, a level of independent audit is required that will demand appropriate resource and management; no single method of data collection appears adequate at present.33,34
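A minimal sketch of the kind of CHI-based case linkage suggested above is shown below. The file and column names are assumptions, and a real linkage would use ISD and National Records of Scotland extracts under appropriate information governance; the sketch simply joins a verified procedure list to a deaths register by CHI number and derives 30-day and 1-year mortality.

```python
# Minimal sketch of CHI-based case linkage for procedure-based audit. File and
# column names (chi, procedure_date, date_of_death, indication) are assumptions.
import pandas as pd

procedures = pd.read_csv("aaa_procedures.csv", parse_dates=["procedure_date"])  # one row per verified repair
deaths = pd.read_csv("deaths_register.csv", parse_dates=["date_of_death"])      # one row per registered death

linked = procedures.merge(deaths[["chi", "date_of_death"]], on="chi", how="left")
days = (linked["date_of_death"] - linked["procedure_date"]).dt.days

linked["died_30d"] = days.between(0, 30)
linked["died_1yr"] = days.between(0, 365)

# Mortality (%) at each horizon, overall and by indication (elective/symptomatic/ruptured)
print(linked[["died_30d", "died_1yr"]].mean().mul(100).round(1))
print(linked.groupby("indication")[["died_30d", "died_1yr"]].mean().mul(100).round(1))
```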


Conclusion
These Scottish administrative data generated for AAA intervention are incomplete and in their current form cannot define performance at either hospital or clinician level. These same intrinsic weaknesses may also be present in HES data. The clinical data alone are similarly flawed. It is of concern that the missing cases contain a disproportionate number of adverse outcomes.

CONFLICT OF INTEREST
None.

FUNDING
None.

REFERENCES
1 Holt PJ, Poloniecki JD, Thompson MM. Multicentre study of the quality of a large administrative data set and implications for comparing death rates. Br J Surg 2012;99:58–65.
2 Johal A, Mitchell D, Lees T, Cromwell D, van der Meulen J. Use of Hospital Episode Statistics to investigate abdominal aortic aneurysm surgery. Br J Surg 2012;99:66–72.
3 Dixon J, Sanderson C, Elliott P, Walls P, Jones J, Pettigrew M. Assessment of the reproducibility of clinical coding in routinely collected hospital activity data: a study in two hospitals. J Public Health Med 1998;20(1):63–9.
4 Hannan EL, Kilburn Jr H, Lindsey ML, Lewis R. Clinical versus administrative data bases for CABG: does it matter? Med Care 1992;30(10):892–907.
5 Fry DE, Pine MB, Jordan HS, Hoaglin DC, Jones B, Meimban R. The hazards of using administrative data to measure surgical quality. Am Surg 2006;72(11):1031–7.
6 Shahian DM, Silverstein T, Lovett AF, Wolf RE, Normand SL. Comparison of clinical and administrative data sources for hospital coronary artery bypass surgery report cards. Circulation 2007;115:1518–27.
7 Westaby S, Archer N, Manning N, Adwani S, Grebenik C, Ormerod O, et al. Comparison of hospital episode statistics and central cardiac audit database in public reporting of congenital heart surgery mortality. BMJ 2007;335:779.
8 Williams JG, Mann RY. Hospital episode statistics: time for clinicians to get involved? Clin Med 2002;2(1):34–7.
9 The Information Centre for Health and Social Care, Academy of Medical Royal Colleges. Hospital Episode Statistics (HES): improving the quality and value of hospital data. http://www.aomrc.org.uk/publications/statements/doc_view/9379hospital-episode-statistics-improvng-the-quality-and-value-ofhospital-data-discussion-document.html [accessed 22.10.14].
10 The Vascular Society of Great Britain and Ireland. Abdominal aortic aneurysm quality improvement framework. http://www.aaaqip.com/aaaqip/2011/02/scotland-smr01-v-nvd-comparisons-for-aaa-procedures-july-september-2010.html [accessed 22.10.14].
11 Aylin P, Lees T, Baker S, Prytherch D, Ashley S. Descriptive study comparing routine hospital administrative data with the Vascular Society of Great Britain and Ireland's national vascular database. Eur J Vasc Endovasc Surg 2007;33:461–5.
12 De Bruin JL, Baas AF, Buth J, Prinssen M, Verhoeven ELG, Cuypers PMG, et al., for the DREAM Study Group. Long-term outcome of open or endovascular repair of abdominal aortic aneurysm. N Engl J Med 2010;362:1881–9.
13 The United Kingdom EVAR Trial Investigators. Endovascular versus open repair of abdominal aortic aneurysm. N Engl J Med 2010;362:1863–71.
14 Ashton HA, Buxton MJ, Day NE, Kim LG, Marteau TM, Scott RA, et al., Multicentre Aneurysm Screening Study Group. The Multicentre Aneurysm Screening Study (MASS) into the effect of abdominal aortic aneurysm screening on mortality in men: a randomised controlled trial. Lancet 2002;360(9345):1531–9.


15 Aylin P, Bottle A, Majeed A. Use of administrative data or clinical databases as predictors of risk of death in hospital: comparison of models. BMJ 2007;334:1044.
16 Holt PJ, Poloniecki PD, Hofman D, Hinchliffe RJ, Loftus IM, Thompson MM. Re-interventions, readmissions and discharge destination: modern metrics for the assessment of the quality of care. Eur J Vasc Endovasc Surg 2010;39(1):49–54.
17 Tekkis PP, McCulloch P, Steger AC, Benjamin IS, Poloniecki JD. Mortality control charts for comparing performance of surgical units: validation study using hospital mortality data. BMJ 2003;327:786.
18 Mayor S. NHS data are not accurate enough for monitoring doctors' performance. BMJ 2006;333:937.
19 Macaulay EM, Cooper GG, Engeset J, Naylor AR. Prospective audit of discharge summary errors. Br J Surg 1996;83(6):788–90.
20 O'Dowd A. Coding errors in NHS cause up to £1bn worth of inaccurate payments. BMJ 2010;341:c4734.
21 McKee M, Coles J, James P. 'Failure to rescue' as a measure of quality of hospital care: the limitations of secondary diagnosis coding in English hospital data. J Public Health 1999;21(4):453–8.
22 The Vascular Society of Great Britain and Ireland. Outcomes after elective repair of infra-renal abdominal aortic aneurysm. http://www.hqip.org.uk/assets/NCAPOP-Library/NCAPOP-2013-14/Outcomes-after-Elective-Repair-of-Infra-renalAbdominal-Aortic-Aneurysm.pdf [accessed 12.10.14].
23 Pouw ME, Peelen LM, Moons KGM, Kalkman CJ, Lingsman HF. Including post-discharge mortality in calculation of hospital standardised mortality ratios: retrospective analysis of hospital episode statistics. BMJ 2013;347:f5913.
24 Troëng T, Malmstedt J, Björck M. External validation of the Swedvasc Registry: a first-time individual cross-matching with the unique personal identity number. Eur J Vasc Endovasc Surg 2008;36:705–12.

25 Stuart WP, Hussey KK, Eifell RKG, Drummond R, Ford I, Welch GH. Using centrally held data to validate carotid surgery outcome data. Stroke 2013;44(6):1670–5.
26 NHS Commissioning Board. Everyone counts: planning for patients 2013/14. 2012. www.commissioningboard.nh.uk/files/2012/12/everyonecounts-planning.pdf [accessed 12.10.14].
27 Walker K, Neuberger J, Groene O, Cromwell DA, van der Meulen J. Public reporting of surgical outcomes: low numbers of procedures lead to false complacency. Lancet 2013;382:1674–7.
28 Dimmik JB, Osbourne NH, Hall BL, Ko CY, Birkmeyer JD. Risk-adjustment for comparing hospital quality with surgery: how many variables are needed? J Am Coll Surg 2010;210(4):503–8.
29 Learning from Bristol: the Department of Health's response to the report of the public inquiry into children's heart surgery at the Bristol Royal Infirmary 1984–1995. Presented to Parliament by the Secretary of State for Health by Command of Her Majesty. January 2002.
30 World Health Organization. WHO surgical safety checklist and implementation manual. http://www.who.int/patientsafety/safesurgery/tools_resources/SSSL_Checklist_finalJun08.pdf?ua=1.
31 Newcombe HB, Kennedy JM, Axford SJ, James AP. Automatic linkage of vital records. Science 1959;130(3381):954–9.
32 Lawson EH, Ko CY, Louie R, Han L, Rapp M, Zingmond DS. Linkage of a clinical surgical registry with Medicare inpatient claims data using indirect identifiers. Surgery 2013;153(3):423–30.
33 Scottish Stroke Care Audit. 2014 national report: stroke services in Scottish hospitals. http://www.strokeaudit.scot.nhs.uk/Downloads/2014_report/SSCA-report-2014-web.pdf.
34 Flum DR, Fisher N, Thompson J, Marcus-Smith M, Florence M, Pellegrini CA. Washington State's approach to variability in surgical processes/outcomes: Surgical Clinical Outcomes Assessment Program (SCOAP). Surgery 2005;138(5):821–7.
