EMERGENCY MEDICAL SERVICES/ORIGINAL RESEARCH

Out-of-Hospital Stroke Screen Accuracy in a State With an Emergency Medical Services Protocol for Routing Patients to Acute Stroke Centers

Andrew W. Asimos, MD*; Shana Ward, MSPH; Jane H. Brice, MD, MPH; Wayne D. Rosamond, PhD; Larry B. Goldstein, MD; Jonathan Studnek, PhD, NREMT-P

*Corresponding Author. E-mail: [email protected].

Study objective: Emergency medical services (EMS) protocols, which route patients with suspected stroke to stroke centers, rely on the use of accurate stroke screening criteria. Our goal is to conduct a statewide EMS agency evaluation of the accuracies of the Cincinnati Prehospital Stroke Scale (CPSS) and the Los Angeles Prehospital Stroke Screen (LAPSS) for identifying acute stroke patients.

Methods: We conducted a retrospective study in North Carolina by linking a statewide EMS database to a hospital database, using validated deterministic matching. We compared EMS CPSS or LAPSS results (positive or negative) to the emergency department diagnosis International Classification of Diseases, Ninth Revision codes. We calculated sensitivity, specificity, and positive and negative likelihood ratios for the EMS diagnosis of stroke, using each screening tool.

Results: We included 1,217 CPSS patients and 1,225 LAPSS patients evaluated by 117 EMS agencies from 94 North Carolina counties. Most EMS agencies contributing data had high annual patient volumes and were governmental agencies with nonvolunteer, emergency medical technician–paramedic service level providers. The CPSS had a sensitivity of 80% (95% confidence interval [CI] 77% to 83%) versus 74% (95% CI 71% to 77%) for the LAPSS. Each had a specificity of 48% (CPSS 95% CI 44% to 52%; LAPSS 95% CI 43% to 53%).

Conclusion: The CPSS and LAPSS had similar test characteristics, with each having only limited specificity. Development of stroke screening scales that optimize both sensitivity and specificity is required if these are to be used to determine transport diversion to acute stroke centers. [Ann Emerg Med. 2014;-:1-7.]

Please see page XX for the Editor's Capsule Summary of this article.

0196-0644/$-see front matter
Copyright © 2014 by the American College of Emergency Physicians.
http://dx.doi.org/10.1016/j.annemergmed.2014.03.024

INTRODUCTION

Background
Reperfusion therapy and other advances in stroke therapy during the past 2 decades highlight the critical role of emergency medical services (EMS) in optimizing acute stroke care.1-5 Most important, early administration of intravenous tissue plasminogen activator in selected patients with acute ischemic stroke increases the likelihood of a favorable outcome, especially when administered within 90 minutes of symptom onset.3,6-8 EMS protocols routing patients with acute stroke to hospitals capable of delivering intravenous tissue plasminogen activator is one strategy to improve timely use of thrombolytic therapy.1,9-13 Such protocols rely on the use of sensitive and specific stroke screening scales to identify patients most likely to benefit from transport to a stroke center. The most commonly used stroke screening instruments are the Cincinnati Prehospital Stroke Scale (CPSS) and the Los Angeles Prehospital Stroke Screen (LAPSS).1,14,15 Although the CPSS and the LAPSS are widely


promoted and broadly adopted, the generalizability and overall accuracy of the LAPSS and CPSS are unclear.1,13,16-21

Importance
Many states, including North Carolina, have implemented statewide stroke patient EMS routing plans that specify that, within certain time constraints, patients with a positive stroke screen be transported to hospitals designated as acute stroke centers.11 Determining the accuracy of out-of-hospital stroke screens in such states is essential because the use of low-sensitivity screens can result in transport to hospitals unable to treat patients with thrombolytics, whereas bypassing hospitals according to false-positive screens can result in costly and inconvenient transport diversion. Among the challenges of studying stroke screen accuracy is the ability to link out-of-hospital databases containing EMS stroke screen results with hospital databases containing patient diagnostic data. PreMIS (Prehospital Medical Information System) is a statewide EMS database used


Editor’s Capsule Summary

What is already known on this topic
Accurate identification of out-of-hospital patients with acute stroke is difficult, and error can affect later health care options.

What question this study addressed
How well do 2 out-of-hospital scoring systems perform when seeking to identify patients with and without acute stroke?

What this study adds to our knowledge
In a retrospective analysis of 1,217 patients assessed with the Cincinnati Prehospital Stroke Scale and another 1,225 assessed with the Los Angeles Prehospital Stroke Screen, neither had adequate sensitivity (80% and 74%, respectively) or specificity (48% each) for detecting acute stroke.

How this is relevant to clinical practice
Either a better tool or improved training and implementation of existing tools is needed to optimize out-of-hospital stroke detection.

throughout North Carolina.22 It includes patient identifiers and data fields for CPSS and LAPSS results. The North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT) is a deidentified statewide hospital emergency department (ED) surveillance database that includes ED diagnoses.23 Access to both of these out-of-hospital and hospital databases is unique to North Carolina. We have been successful in using deterministic matching to link PreMIS to deidentified databases such as NC DETECT.24

Goals of This Investigation
The purpose of this study was to conduct a statewide assessment of the accuracy of the CPSS and LAPSS in identifying stroke patients by comparing the stroke screen results in PreMIS with the ED diagnostic information contained in NC DETECT.

MATERIALS AND METHODS

Study Design and Setting
As a substudy of a project evaluating North Carolina's EMS routing protocol for patients with suspected stroke, we conducted a retrospective study of patients from January 1, 2009, to March 31, 2011.

Data Collection and Processing
We used PreMIS, which is a National Emergency Medical Services Information System–compliant, Internet-based system

for documenting EMS service delivery and care for patients in North Carolina.22 EMS agencies collect and submit data into PreMIS by using either a Web-based interface provided at no cost or a commercial EMS data system certified compliant for all data elements required by the North Carolina Office of EMS.22 In 2008, 42% of EMS agencies entered 17% of the total records in PreMIS through the Web-based interface, whereas the remainder of records were submitted through use of commercial EMS data systems.22 We obtained PreMIS through a data use agreement with the North Carolina Office of EMS.

We categorized EMS agencies using the Credentialing Information System, a database used by the state regulatory offices to monitor and provide credentials to EMS personnel, ambulances, and EMS agencies. Each North Carolina EMS system administrator is responsible for entering descriptive EMS data into the system through an online form. Within the system, EMS agencies are categorized according to status (eg, volunteer versus nonvolunteer), type (eg, governmental versus nongovernmental), and service level (eg, paramedic versus nonparamedic). We obtained Credentialing Information System data through an information request filed with the North Carolina Department of Health and Human Services.

The NC DETECT database was developed to provide real-time surveillance of hospital ED visits across North Carolina and represents the first fully automated statewide hospital ED surveillance system.23 Through legislative mandate, all North Carolina acute care, hospital-affiliated civilian EDs that are open 24 hours a day, 7 days a week are required to provide data on all ED visits to the NC DETECT surveillance system at least daily. Hospitals perform their own medical coding and then securely transmit the data to a data aggregator, using standardized health record formats. Data are monitored regularly and data quality issues are communicated to hospitals for resolution.
NC DETECT is classified as a deidentified patient database but includes patient birth dates and sex. We filed a data use agreement with the North Carolina Division of Public Health to obtain NC DETECT data. We obtained approval from the Carolinas Medical Center's institutional review board to perform this study.

Selection of Participants
From among the patient records in PreMIS during our study period, we selected patients with a preliminary EMS impression of stroke (Figure). We excluded duplicate data records (eg, 1 patient, but multiple portions of the EMS transport in the data) and patients who were transferred between facilities. Using a validated method of deterministic matching,24 data analysts at the Emergency Medical Services Performance Improvement Center (EMSPIC) linked patient records in PreMIS to corresponding records in NC DETECT. EMSPIC data analysts used the following 4 match variables to link records generating unique 1-to-1 EMS transport-to-ED visit matches: (1) patient date of birth (month, day, and year), (2) patient sex, (3) facility (ED hospital facility to which EMS transported the patient and


hospital facility of the patient's ED visit), and (4) date and time (EMS arrival date and time and ED registration date and time are within 1 hour, regardless of which came first). Linkage verification performed by the EMSPIC reflects a plausible match rate of 94% for unique 1-to-1 EMS transport-to-ED visit matches.25 We next sequentially excluded patients without any ED diagnosis International Classification of Diseases, Ninth Revision (ICD-9) code in NC DETECT or without CPSS and LAPSS result data.

Outcome Measures
We compared the EMS stroke screen result (positive or negative) in PreMIS to the ED disposition diagnosis ICD-9 code(s) found in NC DETECT. Because we did not have access to any EMS agency–specific protocols, we do not know when local EMS protocols recommended performance of either the CPSS or LAPSS, or whether EMS providers received any CPSS- or LAPSS-specific training. Additionally, PreMIS includes a data field for only the final CPSS or LAPSS result and not for any of the individual components of the scales. Our criterion standard for a stroke or transient ischemic attack (TIA) was any of the following ICD-9 codes existing in any ICD-9 code field position within NC DETECT: ischemic stroke (433.01, 433.11, 433.21, 433.31, 433.81, 433.91, 434.01, 434.11, 434.91, and 436), hemorrhagic stroke (431 and 432.9), or TIA (433.10, 434.00, 434.10, 434.90, 435.0, 435.1, 435.3, 435.8, and 435.9).

From data within the Credentialing Information System, we categorized EMS agencies according to status (nonvolunteer versus mixed/volunteer), type (governmental non–fire department versus nongovernmental versus fire department), and service level (EMT-paramedic versus other, which includes any of the following: EMT-basic, EMT-intermediate, first responder, or nurse). In accordance with the overall distribution of yearly EMS run volumes, we categorized EMS agencies by size (small, moderate, and large), according to quartile distributions of the maximum EMS run volume in PreMIS from either of our 2 consecutive 12-month periods included in our combined data set. Additionally, we created a "very large" category for EMS agencies with call volumes at or above the 90th percentile. This resulted in the following size categorizations: 1 to 386 = small; 387 to 2,225 = moderate; 2,226 to 8,766 = large; >8,766 = very large.

Figure. Patient inclusion flow chart.
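The deterministic linkage described above can be sketched as follows. This is a simplified illustration, not the EMSPIC implementation: the in-memory record layout and field names (`dob`, `sex`, `facility`, `arrived`, `registered`) are hypothetical, and the actual linkage ran against the full statewide databases.

```python
from datetime import datetime, timedelta

def link_records(ems_runs, ed_visits, window=timedelta(hours=1)):
    """Deterministic matching on the 4 variables described above: date of
    birth, sex, facility, and EMS arrival vs. ED registration time within
    1 hour (regardless of which came first)."""
    # Collect all candidate pairs satisfying the 4 match criteria.
    pairs = [
        (i, j)
        for i, run in enumerate(ems_runs)
        for j, visit in enumerate(ed_visits)
        if run["dob"] == visit["dob"]
        and run["sex"] == visit["sex"]
        and run["facility"] == visit["facility"]
        and abs(run["arrived"] - visit["registered"]) <= window
    ]
    # Count how often each run and each visit appears among candidates,
    # then keep only unique 1-to-1 transport-to-visit matches.
    run_hits, visit_hits = {}, {}
    for i, j in pairs:
        run_hits[i] = run_hits.get(i, 0) + 1
        visit_hits[j] = visit_hits.get(j, 0) + 1
    return [(i, j) for i, j in pairs if run_hits[i] == 1 and visit_hits[j] == 1]
```

Any run or visit that matches more than one record on the other side is dropped entirely, which is the conservative behavior implied by "unique 1-to-1" matching.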

Primary Data Analysis
For both the CPSS and the LAPSS, we calculated sensitivity, specificity, and positive and negative likelihood ratios, with resultant 95% confidence intervals (CIs). For our primary analysis, we included TIA diagnostic codes in our stroke criterion standard. We performed 3 sensitivity analyses. First, we performed the same calculations excluding patients with TIA ICD-9 codes from the analysis. Additionally, because data made available from the EMSPIC after our study was designed indicated a possible 6% mismatch rate for PreMIS and NC DETECT records, we conducted 2 sensitivity analyses planned a posteriori by recalculating sensitivity and specificity, assuming best- and worst-case scenarios for a 6% mismatch rate. To perform this calculation, for the worst-case scenario we moved 6% of patients from the stroke criterion standard group to the nonstroke criterion standard group and moved 6% of patients with a negative stroke screen result from the nonstroke criterion standard group to the stroke criterion standard group. For the best-case scenario, we moved 6% of all patients with a positive stroke screen result from the nonstroke criterion standard group to the stroke criterion standard group and moved 6% of all patients with a negative stroke screen result from the stroke criterion standard group to the nonstroke criterion standard group. We used SAS Enterprise Guide (version 5.1; SAS Institute, Inc., Cary, NC) to perform all statistical analyses.

RESULTS

Characteristics of Study Subjects
The Figure provides information on the study sample. We were able to link 62% of PreMIS records to NC DETECT records. Our final study sample included CPSS or LAPSS data from 2,442 patients, generated by 117 EMS agencies from 94 of North Carolina's 100 counties. It included 1,217 patients with CPSS data and 1,225 patients with LAPSS data. Characteristics for patients screened with the CPSS versus the LAPSS are listed in Table 1. For both stroke screens, most EMS agencies contributing data had large to very large annual patient volumes and were governmental (non–fire department) agencies with nonvolunteer, EMT-paramedic service level providers, but a higher percentage of EMS agencies using the LAPSS were nongovernmental organizations with lower annual transport volumes and staffed by


Table 1. Demographic and transporting EMS agency characteristics for CPSS and LAPSS patients.*

Variable                          CPSS (n = 1,217)   LAPSS (n = 1,225)
Patient characteristics
  Age, mean (SD)                  66 (17)            69 (15)
  Sex
    Male                          505 (42)           561 (46)
    Female                        712 (59)           664 (54)
  Race
    White                         650 (53)           689 (56)
    Black                         388 (32)           282 (23)
    Other                         139 (12)           11 (0)
    Not recorded                  40 (3)             243 (21)
  Ethnicity
    Non-Hispanic                  1,085 (89)         932 (76)
    Hispanic                      20 (2)             13 (1)
    Not recorded                  112 (9)            280 (23)
EMS agency characteristics
  EMS annual transport volume
    Very large                    856 (70)           745 (61)
    Large                         304 (25)           341 (28)
    Moderate                      54 (4)             124 (10)
    Small                         3 (0)              15 (1)
  Organization type
    Governmental                  993 (82)           697 (57)
    Nongovernmental               202 (17)           476 (39)
    Fire department               20 (2)             49 (4)
  Organization status
    Nonvolunteer                  1,134 (93)         1,081 (88)
    Volunteer/mixed               81 (7)             141 (12)
  Service level
    EMT-paramedic                 1,173 (97)         1,146 (94)
    Other†                        42 (3)             76 (6)

*Data are presented as No. (%) unless otherwise indicated.
†Includes EMT-basic, EMT-intermediate, first responder, or nurse.

volunteer and nonparamedic providers. Of our total study population, 38% (N=935) had an ischemic stroke, 15% (N=366) had a TIA, and 7% (N=167) had a hemorrhagic stroke; 40% (N=974) received a nonstroke diagnosis.

Main Results
The CPSS had a sensitivity of 80% (95% CI 77% to 83%) and a specificity of 48% (95% CI 44% to 52%) (Table 2). When patients with TIA ICD-9 codes were excluded from the analysis (N=1,021), the sensitivity of the CPSS increased to 86% (95% CI 83% to 89%). The LAPSS had a sensitivity of 74% (95% CI 71% to 77%) and also had a specificity of 48% (95% CI 43% to 53%). Similar to the CPSS, when TIA ICD-9 code patients were excluded (N=989), the sensitivity increased to 85% (95% CI 80% to 87%). The 10 most common ICD-9 diagnosis codes for patients with false-positive stroke screen results are listed in Table 3. Table 4 gives the best- and worst-case scenarios, assuming a 6% mismatch rate between PreMIS and NC DETECT records. In the worst-case scenario, the sensitivities of the CPSS and LAPSS decreased to 69% (95% CI 66% to 73%) and 65% (95% CI 61% to 68%), respectively, whereas the specificities decreased to 35% (95% CI 31% to 39%) and 30% (95% CI 26% to 35%), respectively. Alternatively, for the best-case scenario, the sensitivities of the CPSS and LAPSS increased to 82% (95% CI 79% to 85%) and 76% (95% CI 73% to 79%), respectively, whereas the specificities increased to 50% (95% CI 46% to 55%) and 51% (95% CI 46% to 56%), respectively.
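Arithmetically, the worst-case cells in Table 4 can be reproduced by shifting k = round(0.06 x n) patients out of each correctly classified cell of the 2x2 table. This particular reading of the 6% shift is our assumption for illustration, not a description of the authors' SAS code, but it reproduces the reported CPSS worst-case counts.

```python
def worst_case(tp, fn, fp, tn, mismatch=0.06):
    """Worst-case 6% mismatch scenario: k records move from true positive
    to false positive and another k from true negative to false negative,
    where k is 6% of the full sample (an assumed reading of the shift that
    reproduces the worst-case CPSS cells in Table 4)."""
    n = tp + fn + fp + tn
    k = round(mismatch * n)
    tp, fp = tp - k, fp + k   # positive screens reclassified as nonstroke
    tn, fn = tn - k, fn + k   # negative screens reclassified as stroke
    return {"tp": tp, "fn": fn, "fp": fp, "tn": tn,
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (fp + tn)}

# CPSS counts from Table 2; the worst case lowers 80%/48% to 69%/35%.
cpss_worst = worst_case(533, 130, 287, 267)
```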

LIMITATIONS

Our study has several limitations. Our final study sample was limited by missing ICD-9 codes for some patients and the

Table 2. Accuracy of CPSS and LAPSS.

CPSS
ED Diagnosis      Positive   Negative   Total
Stroke and TIA*   533        130        663
No stroke/TIA     287        267        554
Total             820        397        1,217
Test characteristics (95% CI): Sensitivity 80 (77–83); Specificity 48 (44–52); Positive likelihood ratio 1.55 (1.42–1.70); Negative likelihood ratio 0.41 (0.35–0.48)

ED Diagnosis†     Positive   Negative   Total
Stroke only       403        64         467
No stroke         287        267        554
Total             690        331        1,021
Test characteristics, TIAs excluded (95% CI): Sensitivity 86 (83–89); Specificity 48 (44–52); Positive likelihood ratio 1.67 (1.52–1.82); Negative likelihood ratio 0.28 (0.22–0.36)

LAPSS
ED Diagnosis      Positive   Negative   Total
Stroke and TIA*   596        209        805
No stroke/TIA     219        201        420
Total             815        410        1,225
Test characteristics (95% CI): Sensitivity 74 (71–77); Specificity 48 (43–53); Positive likelihood ratio 1.42 (1.28–1.57); Negative likelihood ratio 0.54 (0.48–0.61)

ED Diagnosis†     Positive   Negative   Total
Stroke only       476        93         569
No stroke         219        201        420
Total             695        294        989
Test characteristics, TIAs excluded (95% CI): Sensitivity 85 (80–87); Specificity 48 (43–53); Positive likelihood ratio 1.60 (1.45–1.77); Negative likelihood ratio 0.34 (0.28–0.41)

TIA, transient ischemic attack.
*Includes both stroke and TIA ICD-9 diagnosis codes.
†Analysis excludes patients with TIA ICD-9 diagnosis codes.


Table 3. Top 10 ED ICD-9 code diagnoses for nonstroke patients with a positive CPSS or LAPSS result.

ICD-9 Code*   Description                        N (Total = 506)   %
401.9         Hypertension                       43                8.5
780.79        Malaise and fatigue                37                7.3
780.97        Altered mental status              34                6.7
599           Urinary tract infection            27                5.3
784           Headache                           23                4.5
345.9         Epilepsy, unspecified              16                3.2
584.9         Acute renal failure, unspecified   12                2.4
294.8         Mental disorder                    11                2.2
780.39        Convulsion                         10                2.0
250           Diabetes mellitus                  7                 1.4

*ICD-9 diagnosis codes in the first or second position fields in NC DETECT.

overall linkage rate. However, according to our previously published analysis, we have no reason to suspect systematic bias between linked and nonlinked records that would affect our stroke screen analysis. Although we found both the CPSS and LAPSS to be reasonably sensitive, our methodology favored overestimating sensitivity because we selected cases with a preliminary EMS impression of stroke. In particular, given the modest specificity of these screens in patients thought to have a stroke, one would expect an even lower specificity in patients with vague neurologic presentations (eg, altered mental status, weakness). The ICD-9 codes we included in our criterion standard, and their acceptance in any ED ICD-9 diagnosis field position within NC DETECT, also favored sensitivity. Nonetheless, the sensitivities we found for both the CPSS and LAPSS are within the ranges previously reported (66% to

95% for the CPSS15,17,19,26,27 and 68% to 93% for the LAPSS14,18,26,28). We relied on the final stroke screen result data existing in PreMIS. We did not have access to any historical or physical examination data used to classify a stroke screen result as either positive or negative. For example, by definition a CPSS result is positive if any one of its 3 component tests shows an abnormal finding, but we do not know whether EMS agencies appropriately applied this conventional definition of a positive CPSS result. Additionally, we included in our criterion standard any possible ED diagnosis TIA or stroke ICD-9 codes located in any ED diagnosis code field, making it possible that we included a diagnosis related to a comorbid process rather than a diagnosis specifically associated with the transport. However, 78% (N=953/1,222) of all of the TIA and stroke diagnoses were in the first or second ICD-9 diagnosis fields in NC DETECT, and none were beyond the fifth diagnosis field. Other published work from NC DETECT has found that the ICD-9 diagnosis codes in the first or second positions most frequently pertain to the ED visit rather than to comorbid processes not specifically associated with the visit.29,30 We relied on ED diagnoses for our criterion standard and lacked inpatient diagnostic data to confirm the accuracy of stroke diagnoses in the ED. Additionally, in our primary analysis, we included TIA in our stroke group, according to the rationale that patients may have had stroke signs in the field that resolved after ED arrival. Finally, in accordance with stipulations within our data use agreement with the North Carolina Division of Public Health, we were not provided EMS agency identifiers (linked data were released to us at the aggregate level of EMS agency

Table 4. Sensitivity analyses assuming a 6% incorrect match rate between PreMIS and NC DETECT records.

Worst-case scenario*
                  CPSS                             LAPSS
ED Diagnosis      Positive   Negative   Total      Positive   Negative   Total
Stroke and TIA    460        203        663        522        283        805
No stroke/TIA     360        194        554        293        127        420
Total             820        397        1,217      815        410        1,225
Test characteristics (95% CI): CPSS sensitivity 69 (66–73), specificity 35 (31–39); LAPSS sensitivity 65 (61–68), specificity 30 (26–35)

Best-case scenario†
                  CPSS                             LAPSS
ED Diagnosis      Positive   Negative   Total      Positive   Negative   Total
Stroke only       550        122        672        609        196        805
No stroke         270        275        545        206        214        420
Total             820        397        1,217      815        410        1,225
Test characteristics (95% CI): CPSS sensitivity 82 (79–85), specificity 50 (46–55); LAPSS sensitivity 76 (73–79), specificity 51 (46–56)

*To perform this calculation, we assumed a worst-case scenario in which 6% of all patients with a positive stroke screen result have been moved from the stroke criterion standard group to the nonstroke criterion standard group, and 6% of all patients with a negative stroke screen result have been moved from the nonstroke criterion standard group to the stroke criterion standard group.
†To perform this calculation, we assumed a best-case scenario in which 6% of all patients with a positive stroke screen result have been moved from the nonstroke criterion standard group to the stroke criterion standard group, and 6% of all patients with a negative stroke screen result have been moved from the stroke criterion standard group to the nonstroke criterion standard group.


categorization); therefore, we could not adjust for potential clustering by EMS provider.

DISCUSSION

We evaluated the accuracy of the CPSS and the LAPSS in the setting of a statewide EMS system. Although the sensitivities and specificities of the 2 scales were similar, the specificities were at best modest. Because poor specificity can result in "overtriage," with many nonstroke patients being diverted to stroke centers, specificity assumes particular importance when stroke screens are used for transport diversion. Accepting the precept that some level of overtriage is justified to improve overall access to thrombolytics and other acute stroke therapies, the question becomes, what rate of overtriage is acceptable within a statewide system of acute stroke care? Unfortunately, no overtriage or undertriage guidelines have been proposed for stroke systems of care. For regionalized trauma care, the American College of Surgeons Committee on Trauma has suggested that an overtriage rate of 30% to 50% may be acceptable.31 As with trauma, priority for stroke should be given to preventing undertriage to avoid morbidity from delays in definitive care, but this can result in an overuse of financial and human resources, can contribute to stroke center crowding, and can increase EMS transport times and hospital turnaround times. Additionally, nonstroke patients may be taken to hospitals within which their electronic medical record history cannot be accessed and their physicians do not have privileges, thus contributing to, rather than reducing, the fragmentation of care. This can frustrate patients, family members, and providers. Furthermore, when hospital bypass is undertaken in more rural settings with a limited number of ambulances, inappropriate diversion may result in the unavailability of an ambulance to transport an acute myocardial infarction patient to a percutaneous transluminal coronary angioplasty center or a major trauma victim to a regional trauma center.
Previous studies that also found suboptimal specificity of the CPSS and LAPSS were smaller and limited to one agency, county, or metropolitan region. A retrospective observational study similar to ours, conducted in the San Diego EMS system, found that of 477 patients with a paramedic assessment of stroke with the CPSS, 193 had a final discharge diagnosis of stroke (positive predictive value of 40%).17 Two prospective studies performed in Australia reported a specificity of the CPSS of only 33%20 and 54%.26 A study of the Durham County, NC, EMS agency found that of patients presenting with a CPSS abnormality, less than half received a final diagnosis of stroke or TIA.19 The specificity of the CPSS was even lower (24%) in the Mecklenburg County, NC, EMS system.21 Because of data use agreement restrictions associated with the current study, we do not know how many of our study patients came from these 2 counties, but given the overall low specificity and total number of counties and EMS agencies that contributed study data, our results suggest that poor specificity is not limited to just those 2 counties or their EMS agencies.

One of the key obstacles in studying stroke screen accuracy within regionalized and statewide acute stroke systems of care is the lack of linked data systems across the stroke disease course, especially to include the out-of-hospital phase of care.32 We were able to conduct our study by linking 2 large databases and relying on a deterministic linking strategy. In a feasibility study using the same linking methodology, we previously found a large number of valid matches among the 63% of successfully linked out-of-hospital and hospital records,24 a rate almost identical to the 62% in our current study. Additionally, an unpublished analysis of the PreMIS-NC DETECT data linkage file found a highly plausible EMS transport–ED visit match rate of 94%.25 As our sensitivity analyses showed, assuming a worst-case scenario from an incorrect stroke diagnosis in 6% of patients decreases our already low specificity and results in a sensitivity in the lower range of previously reported sensitivities. Alternatively, even the best-case scenario increases the specificities only to the 50% range.

In summary, understanding the accuracy of out-of-hospital stroke screens within systems of stroke care is important. Reasonably high sensitivity, such as that found in our study, allows out-of-hospital providers to appropriately route patients to stroke centers so they can begin to alert or activate resources while decreasing the probability of sending a stroke patient to a hospital unable to provide acute revascularization therapy. Our analysis, however, suggests that the specificity of the CPSS and LAPSS may be too low for use within acute systems of care with bypass protocols in place, especially when ambulance resources are limited.

Supervising editor: Donald M. Yealy, MD

Author affiliations: From the Department of Emergency Medicine, Carolinas Medical Center, Charlotte, NC (Asimos); the Dickson Advanced Analytics Group, Carolinas HealthCare System, Charlotte, NC (Ward); the Department of Emergency Medicine, School of Medicine (Brice), and Department of Epidemiology, School of Public Health (Rosamond), University of North Carolina, Chapel Hill, NC; the Department of Neurology, Duke University and Durham VA Medical Center, Durham, NC (Goldstein); and the Mecklenburg EMS Agency, Charlotte, NC (Studnek).

Author contributions: AWA conceived the study and obtained research funding. SW managed the data. SW, WDR, and JS provided statistical advice. SW analyzed the data. AWA drafted the article, and all authors contributed to its revision. AWA takes responsibility for the paper as a whole.

Funding and support: By Annals policy, all authors are required to disclose any and all commercial, financial, and other relationships in any way related to the subject of this article as per ICMJE conflict of interest guidelines (see www.icmje.org). The authors have stated that no such relationships exist. This work was funded by the 2011-2012 EMF/Genentech Regionalization and Stroke Care Grant.

Publication dates: Received for publication January 20, 2014. Revision received March 14, 2014. Accepted for publication March 26, 2014.


Presented at the International Stroke Conference, February 2013, Honolulu, HI.

The NC DETECT Data Oversight Committee does not take responsibility for the scientific validity or accuracy of methodology, results, statistical analyses, or conclusions presented.

REFERENCES
1. Jauch EC, Saver JL, Adams HP Jr, et al. Guidelines for the early management of patients with acute ischemic stroke: a guideline for healthcare professionals from the American Heart Association/American Stroke Association. Stroke. 2013;44:870-947.
2. National Institute of Neurological Disorders and Stroke rt-PA Stroke Study Group. Tissue plasminogen activator for acute ischemic stroke. N Engl J Med. 1995;333:1581-1587.
3. Hacke W, Donnan G, Fieschi C, et al. Association of outcome with early stroke treatment: pooled analysis of ATLANTIS, ECASS, and NINDS rt-PA stroke trials. Lancet. 2004;363:768-774.
4. Hacke W, Kaste M, Bluhmki E, et al. Thrombolysis with alteplase 3 to 4.5 hours after acute ischemic stroke. N Engl J Med. 2008;359:1317-1329.
5. Goldstein JN, Thomas SH, Frontiero V, et al. Timing of fresh frozen plasma administration and rapid correction of coagulopathy in warfarin-related intracerebral hemorrhage. Stroke. 2006;37:151-155.
6. Lees KR, Bluhmki E, von Kummer R, et al. Time to treatment with intravenous alteplase and outcome in stroke: an updated pooled analysis of ECASS, ATLANTIS, NINDS, and EPITHET trials. Lancet. 2010;375:1695-1703.
7. Marler JR, Tilley BC, Lu M, et al. Early stroke treatment associated with better outcome: the NINDS rt-PA stroke study. Neurology. 2000;55:1649-1655.
8. Saver JL, Fonarow GC, Smith EE, et al. Time to treatment with intravenous tissue plasminogen activator and outcome from acute ischemic stroke. JAMA. 2013;309:2480-2488.
9. Higashida R, Alberts MJ, Alexander DN, et al. Interactions within stroke systems of care: a policy statement from the American Heart Association/American Stroke Association. Stroke. 2013;44:2961-2984.
10. Acker JE III, Pancioli AM, Crocco TJ, et al. Implementation strategies for emergency medical services within stroke systems of care: a policy statement from the American Heart Association/American Stroke Association Expert Panel on Emergency Medical Services Systems and the Stroke Council. Stroke. 2007;38:3097-3115.
11. Song S, Saver J. Growth of regional acute stroke systems of care in the United States in the first decade of the 21st century. Stroke. 2012;43:1975-1978.
12. El Khoury R, Jung R, Nanda A, et al. Overview of key factors in improving access to acute stroke care. Neurology. 2012;79(13 suppl 1):S26-S34.
13. Fassbender K, Balucani C, Walter S, et al. Streamlining of prehospital stroke management: the golden hour. Lancet Neurol. 2013;12:585-596.
14. Kidwell CS, Starkman S, Eckstein M, et al. Identifying stroke in the field. Prospective validation of the Los Angeles Prehospital Stroke Screen (LAPSS). Stroke. 2000;31:71-76.

Annals of Emergency Medicine 7
