Practical Radiation Oncology (2012) 2, e155–e164

www.practicalradonc.org

Original Report

Radiation oncology information systems and clinical practice compatibility: Workflow evaluation and comprehensive assessment

Luis E. Fong de los Santos PhD⁎, Michael G. Herman PhD

Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota

Received 24 May 2011; revised 2 February 2012; accepted 6 February 2012

Abstract

Purpose: To map the level of clinical practice compatibility with a radiation oncology information system (ROIS) through a workflow- and clinical process-based method aimed at optimizing the safety, efficacy, and efficiency of patient care, and to improve the understanding of the critical relationship between the clinical practice and the ROIS.

Methods and materials: Clinic-specific workflow and infrastructure were classified into clinical processes, information management, and technological innovation integration. Clinical information systems-information technology infrastructure and process maps were generated by a team of experts representing clinical constituents. These maps served as the basis for evaluating connectivity and process flow, and they guided the development of a quantitative survey in which all clinical tasks and subprocesses were ranked according to importance in patient care and scored by the team of experts for performance. Process maps and survey output were used to measure ROIS compatibility with the practice and to guide practice improvement.

Results: Practice-specific process and infrastructure maps were generated. The developed survey was applied, and the results indicate a range of ROIS compatibility with clinical workflow and infrastructure. Survey results combined with experiential feedback provided specific, prioritized guidance to improve both ROIS performance and clinic-specific processes and infrastructure.

Conclusions: This work provides a systematic and customizable tool to understand and evaluate clinical information and workflow and its compatibility with a given ROIS. The analysis provides insight into workflow improvements and information systems and information technology infrastructure limitations. Participating in such a process provides the entire team with a deeper understanding of the critical relationship between the clinical practice and the ROIS.

Published by Elsevier Inc. on behalf of American Society for Radiation Oncology.

Sources of support: Supported in part by a grant from Varian Medical Systems, Inc.
Conflicts of interest: None.
⁎ Corresponding author. Department of Radiation Oncology, Mayo Clinic, 200 First Street SW, Rochester, MN 55905.
E-mail address: [email protected] (L.E. Fong de los Santos).

Introduction

In 1997, the Institute of Medicine issued a report suggesting that computer-based patient records are an essential technology for health care and that electronic records should be the standard for medical records.1 The increasing volume and complexity of clinical data demand that information technologies (IT) and information systems (IS) become an integral component in the delivery of patient care that is safe, effective, patient-centered, timely, efficient, and equitable.2,3 However, it has been demonstrated that successful implementation of IT requires a rigorous, structured evaluation to assure compatibility of IS with clinical workflow, technologic and social infrastructure, and the goals of a specific clinical environment.4-7 This is particularly true in radiation oncology, where computer-controlled complex treatments evolve rapidly.

Since the introduction of computerized record and verify systems8-10 over 40 years ago, radiation oncology information systems (ROIS) have evolved to become the bridge among information management, technologic innovations, and patient treatment in high-quality patient care.11 Consequently, radiation oncology departments have a critical interest in optimizing workflow and identifying the most compatible ROIS configuration for a specific practice. While considerable efforts have been invested in developing methodologies to evaluate health information systems, there have been no publications focused on the implementation of evaluation methods for information systems in radiation oncology and the critical integration of the ROIS in a given clinical workflow. In this manuscript, current knowledge for evaluation of health information systems is combined with practice process assessment to develop a practice-specific assessment methodology for information systems in radiation oncology.

Methods and materials

The model presented here relies on a systematic framework to map practice-specific clinical workflow and IS-IT infrastructure. It then develops into an assessment methodology using quantitative and qualitative analytical tools to combine ROIS performance metrics with importance to patient care, resulting in a hierarchical classification of the overall ROIS compatibility with site-specific clinical practice goals and infrastructure. This methodology provides patient care teams in radiation oncology departments with a road map to document clinical processes and infrastructure and subsequently measure the level of compatibility and performance of a given ROIS in that specific clinical environment.

Evaluation model

The methodology is based on the hierarchical relationship among the main components of a modern radiation oncology practice: clinical users, clinical processes, information, software, and hardware. Clinical users occupy the highest level in this classification scheme, as they are ultimately responsible for defining, developing, maintaining, and participating in the clinical practice. Based on the goals of the practice, clinical users develop practice-specific processes that will provide an efficient and safe workflow. Clinical processes generate, utilize, and exchange information throughout the episode of patient care. Software manages the exchange of information and facilitates workflow. Finally, software requires a reliable hardware infrastructure and stable connectivity to provide continuous availability and optimum performance.

Based on the previously described hierarchical classification, the implementation procedure for the workflow evaluation and ROIS assessment methodology comprised the following 4 steps: (1) definition and selection of clinical expert users; (2) mapping of clinical processes and information flow; (3) definition of IS-IT infrastructure (hardware) and configuration; and (4) ROIS (software) assessment and compatibility with clinical workflow.

Step (1): Clinical expert users

Successful ROIS and practice performance is intimately dependent on the users and their environment.12,13 Lorenzi and others14,15 have defined this component as Peopleware. Following the Hripcsak and Wilcox16 approach, a subset of the Peopleware was defined as a team of clinical expert users: 2 physicians, 3 physicists, 2 dosimetrists, 3 therapists, 1 IS staff member, and 1 administrator. An expert user was defined as an individual with a high level of experience in his or her area and a strong understanding of the overall clinical processes or infrastructure of the practice. The expert team was responsible for the initial characterization and mapping of processes and infrastructure, carrying out the clinical tasks using the functionality provided by the ROIS, and ultimately evaluating the level of compatibility between the ROIS applications and tools and the clinic-specific workflow and infrastructure.

Step (2): Clinical processes and information flow

Using standard process mapping techniques,17-20 each of the primary clinical processes and its corresponding information flow within the department, from initial patient encounter to treatment course completion, was mapped. The map components were the clinical process, the INPUT information required to initiate the process, the OUTPUT information at process completion, and the user(s) responsible for and involved in the process. In each case, the origin, flow direction, and physical form (paper or electronic) of the information were included in the mapping. Tasks, activities, and subprocesses necessary to take the specific process from start to finish, as well as the institutional and departmental systems used for patient information and treatment management, were documented.
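To make the structure of such a map entry concrete, the following is a minimal sketch in Python. It is illustrative only: the class names, fields, and example values are hypothetical shorthand for the components named above (process, INPUT, OUTPUT, responsible users, and the origin and physical form of each piece of information), not an artifact of the study.

```python
from dataclasses import dataclass, field
from typing import List, Literal

@dataclass
class InformationItem:
    name: str                             # eg, "treatment plan configuration"
    form: Literal["paper", "electronic"]  # physical form of the information
    origin: str                           # system or user where it originates

@dataclass
class ClinicalProcess:
    name: str                                                      # eg, "plan second check"
    inputs: List[InformationItem] = field(default_factory=list)    # required to initiate
    outputs: List[InformationItem] = field(default_factory=list)   # produced at completion
    responsible_users: List[str] = field(default_factory=list)     # eg, ["physicist"]
    subprocesses: List[str] = field(default_factory=list)          # tasks from start to finish

# One hypothetical entry of the process map:
second_check = ClinicalProcess(
    name="plan second check",
    inputs=[InformationItem("treatment plan configuration", "electronic", "treatment planning system")],
    outputs=[InformationItem("chart check report", "paper", "physicist")],
    responsible_users=["physicist"],
    subprocesses=["verification of wedge angle and orientation"],
)
```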

Figure 1  Five-level scoring scales and level definitions for (A) importance for patient care (IPC) and (B) performance index (PI).

(A) Importance for patient care scoring scale (5 = High, 3 = Medium, 1 = Low):
• High. The performance of this activity is essential to guarantee an accurate and safe patient treatment. It has a high effect on dose delivery as well as on the information and process flow.
• Medium. The performance of this activity helps in the process and information flow, and it has a medium dosimetric effect on the patient treatment.
• Low. The performance of this activity has a very small effect on process and information flow and does not have any dosimetric effect at all on the patient treatment.

(B) Performance index scoring scale (5 = Excellent, 4 = Above Adequate, 3 = Adequate, 2 = Below Adequate, 1 = Poor, 0 = Not Available):
• Excellent. The system performs the activity and has the capability of improving the efficiency and efficacy of our department's process flow.
• Above Adequate. The system performs the activity with some extra features that make the activity flow smoothly.
• Adequate. The system performs the activity adequately.
• Below Adequate. The system performs the activity with some minor difficulties that could be overcome.
• Poor. The system performs the activity in a way that could not be implemented in our department.
• Not Available. The system is unable to perform this activity.

Step (3): ROIS IS-IT infrastructure and configuration

IS-IT infrastructure maps provide a comprehensive blueprint of all relevant departmental and institutional systems (the backbone for information management) as well as a detailed practice-specific baseline map of all the systems and their corresponding interrelationships. The maps included the systems and hardware used throughout clinical operations to acquire, manipulate, store, and manage patient treatment information from simulation through treatment delivery. Specific hardware details about each component (eg, type of storage, in-house or vendor-provided connectivity services) were documented. The IS team members joined the physicists, ROIS vendor, and institutional IS group to lay out the connectivity map with the corresponding network infrastructure.

Multi-ACCESS (v 8.30; Elekta/IMPAC Medical Systems, Inc, Sunnyvale, CA) with Eclipse-VARIS (v 7.3.10; Varian Medical Systems, Palo Alto, CA) was the clinical information system configuration used for testing the assessment methodology. In this configuration, Multi-ACCESS functioned as the radiation oncology patient information management system and Eclipse-VARIS as the infrastructure supporting the treatment planning process and offline review analysis tools, using Eclipse and VARIS-Vision, respectively. The Eclipse and VARIS-Vision applications shared the same database infrastructure. The clinical treatment configuration involved both the EXCI and 4DITC (v 8.0.18; Varian Medical Systems) interfaces with a Varian 21EX Linac equipped with an On-Board kV imaging system. Additional systems performing supporting tasks across clinical operations were identified and added to the mapping.

A research single-vendor ROIS using the full suite of ARIA applications, including Eclipse for treatment planning (v 8.0; Varian Medical Systems), was also used to validate the functionality and performance of the assessment methodology. The research system configuration included 1 server and 2 workstations (1 for treatment planning and 1 running the ARIA applications). The research single-vendor system was configured to allow team members to perform all clinical processes in our current practice. Because effective evaluation required end-to-end testing, a mechanism to safely switch from the clinical production system to the research system at the treatment delivery unit was developed: 2 independent swappable hard drives containing the 4DITC (v 8.0.18; Varian Medical Systems) software were configured, one to be used with the clinical production system (Multi-ACCESS) and the other with the research system (ARIA).
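As a rough illustration of what such a connectivity map captures, the sketch below records systems as nodes and connectivity services as directed edges. The system and service names are hypothetical placeholders, not the actual configuration documented in this study.

```python
from collections import defaultdict

# system -> list of (connectivity service, target system)
connectivity = defaultdict(list)

def connect(source: str, service: str, target: str) -> None:
    """Document one connectivity service between two systems."""
    connectivity[source].append((service, target))

# Hypothetical edges loosely patterned on the kind of links described above:
connect("TPS workstation", "DICOM RT plan export", "ROIS database server")
connect("ROIS database server", "record-and-verify link", "Sequencer workstation")
connect("Sequencer workstation", "beam parameter transfer", "Linac console")
connect("Imaging system", "DICOM image transfer", "DFS storage")

# Walking the map yields the blueprint used to compare configurations:
for source, links in connectivity.items():
    for service, target in links:
        print(f"{source} --[{service}]--> {target}")
```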

e158

L.E. Fong de los Santos, M.G. Herman

Step (4): ROIS (software) assessment and compatibility with clinical workflow

Radiation oncology workflow and infrastructure were classified into 3 major categories: clinical processes (CP), information management (IM), and technologic innovations integration (TII). The CP category represented practice-specific clinical operations, including all tasks and activities related to patient management and treatment.21,22 The CP elements were defined by the clinical process flow mapping described in Step 2. The IM category is associated with processes related to patient and treatment information management, including all functions related to maintenance, connectivity, and services supporting the ROIS infrastructure. Elements for this category were taken from the clinic-specific information management systems and IS-IT infrastructure mapping performed in Step 3. The TII category was included to track the ability of the ROIS to accept and integrate current and future technologic advances.

Following the mapping of clinical processes and infrastructure, a survey was developed based on the elements previously defined in each of the 3 categories: CP, IM, and TII. The quantitative survey metrics and procedure described in Fig 1 followed the methodology suggested by Fowler.23 For the purpose of the assessment methodology and creating a site-specific survey, each task, activity, or subprocess (previously mapped) resulted in a single survey question and became an assessment variable. A total of 208 survey questions (ie, assessment variables) were identified: 169 associated with the CP category, 19 with the IM, and 20 with the TII.

Two independent parameters were defined to quantify the level of correspondence between the ROIS and the clinical environment for each assessment variable. The first parameter was importance for patient care (IPC), a measurement of the importance of each assessment variable as it relates to the quality, accuracy, and safety of the patient treatment. IPC was scored by the clinical expert team for each assessment variable (ie, survey question) on the 5-point scale shown in Fig 1A. As an example, under the CP category, for one of the tasks from the plan second check process, the assessment variable "Verification of wedge angle and orientation" generated the survey question for IPC data collection: "Evaluate the following activity or task according to its importance for patient care using the scoring scale: Verification of wedge angle and orientation."

The second parameter was the performance index (PI). This parameter represents the user's perception of the ROIS performance for a specific assessment variable when using the tools and applications provided by the ROIS. The PI was used as a surrogate metric for the level of compatibility of the ROIS tools and applications with the functional needs and infrastructure of the clinical practice.12,24-26 Using the same assessment variable from the IPC example,


the corresponding survey question for PI data collection was: "Evaluate the performance of the ROIS for: Verification of wedge angle and orientation." During performance evaluation, clinical expert team members stepped through each of their corresponding tasks for all the clinical processes from start to finish using the tool(s) and application(s) provided by the ROIS. Upon completion, clinical experts scored the performance of the system on the 5-point scale shown in Fig 1B. To avoid collecting unqualified data (ie, nonexpert opinion), team members who were not responsible for a specific task (as defined by the clinical process flow mapping) did not contribute to the IPC or PI for that task. In cases where an assessment variable was evaluated by more than one expert team member from a specific group (eg, more than one therapist), the average IPC and PI were taken for that variable.

The survey data were tabulated and graphed to create a 2-dimensional PI-IPC space. The PI-IPC space divided the results into 4 regions: 2 acceptance regions (AR-I and AR-II) and 2 rejection regions (RR-I and RR-II). The template for the PI-IPC space is depicted in Fig 2. The 4 regions provided a graphical overview of the performance of the ROIS relative to the clinic-specific patient care values. Following the scoring scale provided in Fig 1, the adequate level (score = 3) for PI represents the level where the system performance assessment changes from poor to adequate and thus was selected as the threshold for the regions on the PI axis. The medium level (score = 3) for IPC represents the level where the clinic-specific tasks or subprocesses have a medium dosimetric effect (ie, a dose discrepancy between ±5% and ±10% of the prescribed dose to the target or an organ at risk) and a medium impact on the practice workflow and information flow; the medium level was therefore selected as the threshold for the regions on the IPC axis.

Figure 2 Performance index-importance for patient care space description diagram.
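A minimal sketch of this quantitative procedure follows, assuming thresholds of 3 on each axis and region labels as defined in Fig 2 (high IPC with adequate PI in AR-I, high IPC with inadequate PI in RR-I). The survey entries, scores, and simple averaging across same-role experts shown here are illustrative, not the study data.

```python
from statistics import mean

PI_THRESHOLD = 3.0   # "adequate" level on the performance index axis
IPC_THRESHOLD = 3.0  # "medium" level on the importance for patient care axis

def classify(pi: float, ipc: float) -> str:
    """Map one assessment variable onto the 4 regions of the PI-IPC space."""
    if ipc >= IPC_THRESHOLD:
        return "AR-I" if pi >= PI_THRESHOLD else "RR-I"
    return "AR-II" if pi >= PI_THRESHOLD else "RR-II"

def aggregate(scores: list[float]) -> float:
    """Average the scores from several experts in the same role, as in the text."""
    return mean(scores)

# Hypothetical survey data: variable -> (IPC scores, PI scores)
survey = {
    "Verification of wedge angle and orientation": ([5, 5], [2, 3]),
    "Print weekly chart check report": ([2], [4]),
}

counts = {"AR-I": 0, "AR-II": 0, "RR-I": 0, "RR-II": 0}
for variable, (ipc_scores, pi_scores) in survey.items():
    region = classify(aggregate(pi_scores), aggregate(ipc_scores))
    counts[region] += 1
    print(f"{variable}: {region}")
print(counts)  # per-region totals, analogous to Table 1
```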


Figure 3 Global clinical practice process flow map with 13 major clinical processes, 5 electronic management systems (A to D and ROIS [radiation oncology information system]), and 6 paper-based documents (I to VI). Clinical users, responsible for each clinical process, were identified and added to the mapping.

In addition to the quantitative IPC-PI information provided by the survey, experiential feedback was also recorded using standard qualitative research techniques.27,28 During the PI scoring, clinical experts were asked to provide information regarding their reasoning when evaluating the performance of the system. The experiential feedback collected provided insight into the "why" behind the PI scoring and identified specific issues related to particular assessment variables, including recommendations on how to improve both the clinical processes and the applications provided by the ROIS.

Following data acquisition, all the assessment variables were sorted in descending order of IPC and grouped according to each of the 3 categories: CP, IM, and TII. This site-specific priority list, combined with the experiential feedback and the quantitative IPC-PI information, was used to develop a global report describing the compatibility level between the ROIS and the clinic-specific processes and infrastructure. This information was also used to guide improvement in clinical processes and ROIS functionality.
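The priority-list construction can be sketched in the same style: sort the variables by descending IPC within each category and flag those falling in RR-I. The records below are hypothetical examples, not the actual survey output; `classify` repeats the region logic of the PI-IPC space (thresholds of 3 on each axis, per Fig 2).

```python
def classify(pi: float, ipc: float) -> str:
    if ipc >= 3.0:
        return "AR-I" if pi >= 3.0 else "RR-I"
    return "AR-II" if pi >= 3.0 else "RR-II"

records = [
    # (category, assessment variable, mean IPC, mean PI) -- illustrative values
    ("CP", "Verification of wedge angle and orientation", 5.0, 2.5),
    ("CP", "Review of BID treatment schedule", 4.0, 4.0),
    ("IM", "Long-term archiving", 4.5, 3.5),
]

for category in ("CP", "IM", "TII"):
    group = [r for r in records if r[0] == category]
    # Descending IPC puts the variables most important to patient care first.
    for _, variable, ipc, pi in sorted(group, key=lambda r: -r[2]):
        region = classify(pi, ipc)
        flag = "  <-- high risk (low PI, high IPC)" if region == "RR-I" else ""
        print(f"[{category}] {variable}: IPC={ipc}, PI={pi}, {region}{flag}")
```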

Results

Clinical process mapping

A global clinical practice process map is shown in Fig 3. Thirteen primary processes were defined and classified by frequency for the CP category. Five major electronic patient information and process management systems were identified: (A) institutional electronic medical record; (B) radiation oncology electronic ordering form; (C) institutional schedule manager; (D) intensity modulated radiation therapy quality assurance (IMRT QA) manager; and the ROIS. Six paper-based treatment documents and reports were noted in the mapping: (I) physician intent; (II) treatment plan configuration; (III) isodose distribution and dose-volume histogram; (IV) IMRT QA; (V) chart check; and (VI) final treatment reports.

Reviewing the constraints and specifications of the information management infrastructure allowed 6 areas to be identified as the assessment variables for the ROIS evaluation of the IM category: ROIS database management, long-term archiving, ROIS availability (across the network, off-hours access), customer support (for ROIS software issues), ROIS integration with institutional infrastructure, and data migration during system upgrade. Similarly, for the TII category, 4 example technologies were identified: portal dosimetry and MV, kV, and cone-beam CT-based image guidance. Their corresponding assessment variables were implementation, connectivity with the ROIS, operability with the ROIS, ability to keep updating with the ROIS, and ability to perform offline review.

Assessment methodology and workflow compatibility

The PI-IPC spaces for the clinical multivendor and research single-vendor systems are shown in Fig 4A and B, respectively. Assessment variables with the same PI-IPC value were represented with a single cross or circle in the PI-IPC space; thus, some of the marked locations were associated with more than one assessment variable. Table 1 summarizes the results of the PI-IPC mapping and evaluation for the clinical and research configurations. Particular attention was given to the variables in RR-I because each of them had the potential to lead to inefficient or unsafe processes due to its high IPC factor.

The assessment variables were sorted by descending IPC and color-coded by their location in the PI-IPC space (ie, following the gray shading scheme defined in Fig 2), clearly highlighting critical assessment variables with high risk and low performance (ie, high IPC and low PI). Figure 5 shows an example of a priority list for the physics weekly chart check process.

Another component, gathered from the experiential feedback information, was the development of a set of recommendations meant to improve the implementation of the ROIS and, in turn, its performance. These recommendations are shown in Table 2, classified by the parties responsible for addressing them.

Table 1  Evaluation summary results for clinical and research configurations

Configuration    Acceptance          Rejection
                 AR-I      AR-II     RR-I      RR-II
Clinical         157       35        11        4
Research         143       33        24        7

Assessment variables were grouped by their corresponding acceptance (AR-I and AR-II) and rejection (RR-I and RR-II) regions.

Figure 4  Performance index-importance for patient care (PI-IPC) space plots for both radiation oncology information system configurations: (A) clinical and (B) research. AR-I, AR-II, acceptance regions I and II; RR-I, RR-II, rejection regions I and II (as defined in Fig 2).

IS-IT infrastructure

From the IS-IT infrastructure maps for the 2 clinical interfaces (ie, EXCI and 4DITC), 3 types of storage devices were documented: a server whose main purpose is hosting and managing the database for each system; institutionally provided distributed file systems (DFS) for rapidly and continuously growing data (eg, images); and local storage media for some workstations. Connectivity services developed institutionally or provided by the vendor were also documented in the mapping. The infrastructure maps were nearly identical from simulation to treatment planning. Differences identified between configurations were the following:

• Four connectivity services were documented in the EXCI interface and 3 in the 4DITC.
• The location of the record and verification process was different between the EXCI and 4DITC interfaces. In the EXCI, the role of verification and recording belonged to the IMPAC application at the Sequencer workstation, and it was done field-by-field. In the 4DITC interface, the verification occurred at the 4DITC workstation (a Varian application) and the recording, at the end of each treatment session, on the IMPAC Sequencer workstation. This was a major change from the previous "record and verify system" paradigm, where both tasks were associated with the IMPAC system. A subsection of the IS-IT infrastructure map is shown in Fig 6A and B to illustrate this change.

Figure 5  Example of a site-specific priority list showing the assessment variables for the physics weekly chart check process. Variables are sorted by descending IPC and color-coded by their location in the PI-IPC space (gray shading scheme and regions were defined in Fig 2). Gray shading provides a visual cue for areas with high risk (ie, low PI and high IPC). AR-I, acceptance region I; BID, twice daily; PI, performance index; IPC, importance for patient care; RR-I, rejection region I.

Discussion

The methodology described in this study provides a tool for understanding and systematically evaluating both the needs of a specific clinical practice and the level of compatibility with a given ROIS. A comprehensive analysis must consider the synergy and integration among the components of the modern radiation oncology practice (ie, clinical users, clinical processes, information, software, and hardware) to evaluate ROIS compatibility, system risks, and improvement opportunities for clinical processes and information systems.

While the examples and scope in this work relate to one individual practice, the methods and evaluation tools are general and can be applied to any clinic and ROIS combination. The clinical practice process flow map in Fig 3 provides a good example for those wishing to develop their own. Whether the goal is to move to an all-electronic external beam radiation therapy process, to improve the compatibility level between an ROIS and a practice, or to select the most appropriate ROIS to purchase, this methodology can help on a practical level. In each practice, the scope of the evaluation and the resources necessary can be as broad or as narrow as the clinical team desires and will provide guidance for improving processes as well as compatibility with ROIS functionality.

The clinical expert team, representing clinical constituents, performed the characterization and mapping of clinical processes and IS-IT infrastructure, carried out the clinical tasks using the functionality provided by the ROIS, and evaluated the level of compatibility between the ROIS applications and tools and the clinic-specific workflow and infrastructure using the developed survey. One member of the team was selected to be the project lead, responsible for coordinating the mapping process and for conducting, collecting, and compiling the survey and qualitative feedback data. The clinical team members committed about 0.1 full-time equivalent (FTE) each, and the project lead 0.5 FTE, for about 3 months to complete the project. Maintaining the motivation and commitment of the clinical team members was a key element in the success of the project. Ammenwerth et al4 identify "the motivation for the evaluation" as one of the main challenges of assessing health information systems. Participation in an innovative project, consideration of the clinical team members' perspective, and the departmental commitment to improve our systems and processes for the benefit of patients were the main factors that motivated the clinical team.

The radiation oncology practice is continuously evolving toward more sophisticated imaging, planning, and delivery technologies and electronic information management systems. These changes require an increasingly complex clinical environment and supporting IS-IT infrastructure.29-32 The initial outcome and information gathered from the implementation of the assessment methodology can be used to guide subsequent system upgrades and resource allocation, with minimal time involvement of the clinical team. Furthermore, one of the recommendations summarized in Table 2 highlights the responsibility that each practice has for creating a team (eg, physicists and in-house IS staff) with knowledge and understanding of the connectivity and infrastructure of clinically relevant systems. This is particularly challenging in multivendor environments, where the ROIS frequently lags behind advancements in the delivery system. As an example, upgrading from the EXCI to the 4DITC interface in the clinical multivendor system (Fig 6) imposed a change in workflow as well as in the information exchange between the ROIS and the delivery and image-guidance systems.


Table 2  Recommendations compiled from experiential feedback analysis and classified by responsible group (ie, clinical practice, vendor, or both)

• Clear definitions of the roles and responsibilities of vendor and user related to the management, service, and support of the ROIS need to be developed prior to its deployment. (Responsible: clinical practice and vendor)
• A successful, effective, and safe approach to ROIS issues should be conducted by a team formed by physicists, IS staff, and vendor service support. (Responsible: clinical practice and vendor)
• Effective channels of communication among users and between the user and the vendor need to be developed in relation to ROIS issues, configuration, and upgrades. (Responsible: clinical practice and vendor)
• An appropriate amount of resources needs to be allocated toward the development of a strong IS team. The main role of the IS team will be to provide support for the everyday maintenance and basic troubleshooting of the non-clinically relevant hardware and software (eg, virus configuration, operating system patches, workstations). (Responsible: clinical practice)
• Proficient knowledge and understanding of database management, DICOM RT, and the components, connectivity, and infrastructure of any clinically relevant information systems linked to the ROIS are needed by the physicist and IS team. (Responsible: clinical practice)
• Development of an efficient and effective customer service program as well as training resources. (Responsible: vendor)
• It is highly desirable to have the vendors provide a virtual environment (beyond a canned demo) ready for users, allowing them to test the tools provided by the ROIS while simulating their current practice. (Responsible: vendor)

IS, information system; ROIS, radiation oncology information system.

IMRT QA using portal dosimetry moved from an automatic to a manual transfer process, in which users had to import each of the images into the clinical Eclipse-VARIS database for analysis. The daily offline gold fiducial matching workflow for prostate localization moved from the Eclipse-VARIS to the Multi-ACCESS (IMPAC) system. While the initial assessment and mapping do not provide specific solutions for each issue, critical failures, connectivity issues, and major clinical process problems can be easily identified and addressed prior to implementing system or technology changes.

Figure 6  Subsection of the information systems-information technology infrastructure map for the 2 clinical configurations used in the evaluation: (A) EXCI and (B) 4DITC. These diagrams highlight connectivity and workflow differences when moving from an EXCI to a 4DITC Linac interface.

Comprehensive assessment of health information systems requires a combined approach of quantitative and qualitative methods.33 The quantitative evaluation in the current work was presented as the performance index-importance for patient care (PI-IPC) space. The PI-IPC space provided a clear display of areas where either the clinical processes or the ROIS needed improvement to raise clinic-specific performance. The PI-IPC space was mapped to a master priority list, providing guidance on resource allocation to address areas with low PI and high IPC. Table 2 summarizes the recommendations compiled from the clinical team's experiential feedback analysis (ie, the qualitative method), showing that some factors affecting the performance of the ROIS are driven individually by either the vendor or the clinical practice, while others depend heavily on close cooperation between the vendor and the members of the clinical team. This result corroborates the premise that ROIS development and implementation require close cooperation among the complete Peopleware group, which involves clinical users as well as developers (eg, vendors).14,34 The global report combining the qualitative experiential feedback and the quantitative IPC-PI information was the first systematic and comprehensive approach to the analysis of an information system in a radiation oncology clinical environment. These results guided vendor system developers in the improvement of their ROIS. Furthermore, the proposed assessment methodology and metrics have been adopted and used for continued development of the system.

Given the complexity of the overall radiation oncology practice, the scope of this analysis was limited to the evaluation of the ROIS for the external beam workflow. Other clinical processes and systems relevant to the patient's episode of care but outside the radiation oncology workflow (eg, chemotherapy, surgery, pathology, and clinical trials, among others) should be included as part of a cross-departmental and cross-institutional assessment study but are beyond the scope of this work. The focus of this work is to provide a mechanism for evaluating clinical processes and ROIS compatibility and for developing recommendations that are specific to and within the scope of interest of a given clinical practice.

As radiation oncology (and medical) practice continues to become more complex and computer-driven, the importance of the ROIS in a safe clinical practice grows. Medical errors attributed to information systems and technologies in radiation therapy have been reported in the literature.35-38 While there are some instances where the system itself or an incorrect configuration of a system led to an error in the treatment chart record or the treatment itself,37 there are many more cases where unsafe clinical processes or improper use of the ROIS were the direct cause of the error.36 A lack of understanding of how the ROIS maps into the clinical processes of a department, alone or combined with outdated or unsafe clinical processes, could potentially lead to negative clinical events and improper treatments. The assessment methodology discussed in this paper provides the groundwork for developing safer clinical processes and assigning the ROIS its proper role in patient treatment and information management. Indeed, according to Despont-Gros et al,12 the successful implementation and use of health information systems heavily depend on the correct matching between the system and the specific clinical workflow.

The dependence of the radiation oncology practice on information systems and technologies is constantly increasing. The described assessment methodology provides a taxonomy and common language that users and vendors can use to construct effective communication channels and improve clinical processes in parallel with clinical tools for the ROIS. Implemented in a systematic and iterative manner, the proposed assessment methodology presents a road map to properly match an ROIS to clinical processes, to improve those processes, and ultimately to provide a safer, more efficient, and more effective care delivery setting.

Acknowledgments

The authors thank Lori M. Buchholtz and James L. Sorenson for their contributions, and the Radiation Oncology team at the Mayo Clinic.

References

1. Dick RS, Steen EB, Detmer DE, eds. The computer-based patient record: an essential technology for health care. Washington, DC: National Academy Press; 1997.
2. Committee on Quality of Health Care in America, ed. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.
3. Chaudhry B, Wang J, Wu SY, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144:742-752.
4. Ammenwerth E, Gräber S, Herrmann G, Bürkle T, König J. Evaluation of health information systems: problems and challenges. Int J Med Inf. 2003;71:125-135.
5. Ammenwerth E, Brender J, Nykänen P, et al. Visions and strategies to improve evaluation of health information systems: reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int J Med Inf. 2004;73:479-491.
6. Garde S, Wolff AC, Kutscha U, Wetter T, Knaup P. CSI-ISC: concepts for smooth integration of health care information system components into established processes of patient care. Methods Inf Med. 2006;45:10-18.
7. Berg M. Patient care information systems and health care work: a sociotechnical approach. Int J Med Inf. 1999;55:87-101.
8. Rosenbloom ME, Killick LJ, Bentley RE. Verification and recording of radiotherapy treatments using a small computer. Br J Radiol. 1977;50:637-644.
9. Fredrickson DH, Karzmark CJ, Rust DC, Tuschman M. Experience with computer monitoring, verification and record keeping in radiotherapy procedures using a Clinac-4. Int J Radiat Oncol Biol Phys. 1979;5:415-418.
10. Morrey D, Smith CW, Belcher RA, Harding T, Sutherland WH. A microcomputer system for prescription, calculation, verification and recording of radiotherapy treatments. Br J Radiol. 1982;55:283-288.
11. Herman MG, Williams AL, Dicello JF. Management of information in radiation oncology: an integrated system for scheduling, treatment, billing, and verification. Semin Radiat Oncol. 1997;7:58-66.
12. Despont-Gros C, Mueller H, Lovis C. Evaluating user interactions with clinical information systems: a model based on human-computer interaction models. J Biomed Inform. 2005;38:244-255.
13. Gremy F, Fessler JM, Bonnin M. Information systems evaluation and subjectivity. Int J Med Inf. 1999;56:13-23.
14. Lorenzi NM, Riley RT. Organizational aspects of health informatics: managing technological change. New York, NY: Springer-Verlag; 1995.
15. Grémy F. Hardware, software, peopleware, subjectivity. A philosophical promenade. Methods Inf Med. 2005;44:352-358.
16. Hripcsak G, Wilcox A. Reference standards, judges, and comparison subjects: roles for experts in evaluating system performance. J Am Med Inform Assoc. 2002;9:1-15.
17. Madison D. Process mapping, process improvement and process management. Chico, CA: Paton Press; 2005.
18. Rummler GA, Brache AP. Improving performance: how to manage the white space in the organization chart. San Francisco, CA: Jossey-Bass; 1995.
19. Damelio R. The basics of process mapping. New York, NY: Productivity Press; 1996.
20. Sharp A. Workflow modeling: tools for process improvement and application development. London, UK: Artech House Publishers; 2008.
21. Brooks KW, Fox TH, Davis DL. Advanced therapy information management systems: an oncology information systems RFP toolkit. In: Hazle JD, Boyer AL, eds. Imaging in radiation therapy. American Association of Physicists in Medicine (AAPM) Monograph No. 24. Madison, WI: Medical Physics Publishing; 1998.
22. Brooks K. Radiation oncology information management system. In: Van Dyk J, ed. The modern technology of radiation oncology. 1st ed. Madison, WI: Medical Physics Publishing; 1999:509-520.
23. Fowler FJ. Improving survey questions: design and evaluation. Los Angeles, CA: SAGE Publications, Inc; 1995.
24. Goodhue DL, Klein BD, March ST. User evaluations of IS as surrogates for objective performance. Information & Management. 2000;38:87-101.
25. Goodhue DL. Understanding user evaluations of information systems. Management Science. 1995;41:1827-1844.
26. Goodhue DL, Thompson RL. Task-technology fit and individual performance. MIS Quarterly. 1995;19:213-236.
27. Strauss A, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Newbury Park, CA: SAGE; 1990.
28. Denzin N, Lincoln Y. Handbook of qualitative research. Thousand Oaks, CA: SAGE; 2000.
29. Fraass BA. QA issues for computer-controlled treatment delivery: this is not your old R/V system any more! Int J Radiat Oncol Biol Phys. 2008;71:S98-S102.
30. Marks LB, Light KL, Hubbs JL, et al. The impact of advanced technologies on treatment deviations in radiation treatment delivery. Int J Radiat Oncol Biol Phys. 2007;69:1579-1586.
31. Suit H. The Gray Lecture 2001: coming technical advances in radiation oncology. Int J Radiat Oncol Biol Phys. 2002;53:798-809.
32. Webb S, Evans PM. Innovative techniques in radiation therapy: editorial, overview, and crystal ball gaze to the future. Semin Radiat Oncol. 2006;16:193-198.
33. Stoop AP, Berg M. Integrating quantitative and qualitative methods in patient care information system evaluation: guidance for the organizational decision maker. Methods Inf Med. 2003;42:458-462.
34. Samaras GM, Horst RL. A systems engineering perspective on the human-centered design of health information systems. J Biomed Inform. 2005;38:61-74.
35. Patton GA, Gaffney DK, Moeller JH. Facilitation of radiotherapeutic error by computerized record and verify systems. Int J Radiat Oncol Biol Phys. 2003;56:50-57.
36. Klein EE, Drzymala RE, Purdy JA, Michalski J. Errors in radiation oncology: a study in pathways and dosimetric impact. J Appl Clin Med Phys. 2005;6:81-94.
37. Leveson NG, Turner CS. An investigation of the Therac-25 accidents. Computer. 1993;26:18-41.
38. Shafiq J, Barton M, Noble D, Lemer C, Donaldson LJ. An international review of patient safety measures in radiotherapy practice. Radiother Oncol. 2009;92:15-21.
