Laboratory Automation

Review Article

Clinical Chemistry Laboratory Automation in the 21st Century: Amat Victoria Curam (Victory loves careful preparation)

David A Armbruster,1,* David R Overcash,2 Jaime Reyes2

1Global Scientific Affairs, Abbott Diagnostics, Chicago, IL, USA; 2Global Marketing, Abbott Diagnostics, Chicago, IL, USA.
*For correspondence: David Armbruster, [email protected]

Abstract

The era of automation arrived with the introduction of the AutoAnalyzer, using continuous flow analysis, and the Robot Chemist, which automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with one group of analysers devoted to performing routine clinical chemistry tests and another dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing, and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), in which all of the modules involved are physically linked by some kind of track system that moves samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-tandem mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate it into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software controls the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation also applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.

Introduction

“In the early 1960s satellites were circling our planet, President Kennedy had committed the US to put a man on the moon by the end of the decade, and graduate students in the Department of Biochemistry of the University of Tennessee were making predictions on the future of analysis (mostly manual at the time) of body fluids. Over a pitcher of cold beer on a hot and humid summer day, one of us dared to predict that someday there would be instruments in which you could put serum or even whole blood at one end and get the results at the other; ‘printed too,’ concluded the optimist of the group”.1 This quote from Doumas is an appropriate opening for a paper on clinical chemistry automation. Also apt is an observation from a paper presenting an historical perspective of clinical chemistry by Kricka and Savory: “This discipline, which could originally be practised in small laboratories in which relatively few manual tests were performed, now requires highly automated

and integrated laboratories that perform millions of tests each year”.2 The present paper attempts a sweeping but concise review of clinical laboratory automation. Modern laboratory practice in developed nations has long moved beyond piecework manual procedures and uses fully automated systems, often integrated platforms coupling clinical chemistry and immunoassay analysers, pre- and post-analytical modular systems, and even total laboratory automation (TLA), which joins those pre- and post-analytical devices with several analytical modules. Overlying this automation, to whatever extent it exists, is informatics: comprehensive software to manage the diverse functions of the modern laboratory, ranging from reagent inventory management to optimised sample workflow to sophisticated result reporting that provides ‘value-added’ clinical interpretation. A book-sized publication would be required to do justice to all of these topics, and what is presented here is of necessity only a high-level overview.

Clin Biochem Rev 35 (3) 2014 143


Clinical Chemistry is a fairly new specialty, an amalgam of chemistry and medicine. As observed by Henry Bence Jones (1813-1873, of Bence Jones protein fame), “Whatever sets forth the union of chemistry and medicine tends to promote not only the good of science but also the welfare of mankind”.3 “By 1840, approximately 1400 organic compounds were known, and the period around 1840 is regarded as the point of origin of the discipline ‘clinical chemistry’, especially in the German-speaking world, because it was then that the first textbooks, handbooks, and journals appeared”.4 Early on, the discipline was called ‘pathological chemistry’ or ‘chemical pathology’, with ‘clinical chemistry’ eventually becoming the accepted term after it was introduced by the American Association for Clinical Chemistry (AACC) and the International Federation of Clinical Chemistry (IFCC).4 “Clinical Chemistry” (1883) by C.H. Ralfe of the London Hospital was the first book in English to carry the title clinical chemistry.3 Current estimates suggest that perhaps 200–300 analytes are routinely tested in clinical laboratories, with about 1,000 analytes in total being subject to analysis in various facilities.4 Early on, Clinical Chemistry relied on traditional analytical methods such as atomic absorption, flame emission photometry, gasometry, potentiometry and amperometry, spectrophotometry, immunonephelometry (for competitive and sandwich immunoassays) and electrophoresis, and the wide variety of analysers needed for these methodologies.5 A clinical laboratory might have consisted of a hodge-podge of different devices, each devoted to perhaps one or only a few analytes and each requiring a medical technologist to operate it.
Rosenfeld opined that “Folin’s use of the Duboscq-type colorimeter for colour comparison in the quantitative analysis of creatinine in urine in 1904 ushered in the modern era of clinical chemistry”.3 By the 1960s, colorimetric methods had been adapted to the photometer and the manual processes were being adapted to the continuous-flow autoanalyser, displacing some of the individual analysers.3 Clinical Chemistry is a polyglot discipline combining chemistry, biochemistry, immunochemistry, endocrinology, toxicology (abused and therapeutic drug testing), analytical chemistry, engineering, informatics and doubtless other specialties, to provide the necessary support to physicians and other healthcare providers to improve the diagnosis and treatment of patients.4 Modern technology includes colorimetry/spectrophotometry, nephelometry/immunonephelometry, turbidimetry/immunoturbidimetry, atomic absorption, flame emission photometry, potentiometry (ion-selective electrodes, ISE), and a variety of chromatographic techniques, including gas chromatography (GC), high performance liquid chromatography (HPLC), gas chromatography-mass spectrometry (GC-MS), and liquid chromatography-tandem mass spectrometry (LC-MS/MS), plus various ligand assay methodologies, including fluorescence polarisation immunoassay (FPIA), enzyme immunoassay (EIA), radioimmunoassay (RIA), and chemiluminescence immunoassay (CMIA).4 Olsen observed in an understatement that: “Laboratory automation today is a complex integration of robotics, computers, liquid handling, and numerous other technologies”.6 The purpose of automation is to save time and improve performance through the elimination of human error. Automated systems are built around the microprocessor and the computer. Stand-alone automated systems by and large did not appear until the 1950s, when they could be reliably produced by commercial laboratory equipment manufacturers. The marriage of clinical chemists working independently with a new in vitro diagnostics industry meant that “What was an engineering challenge for one generation of scientists frequently became off-the-shelf technology for the next”.6 Currently the challenge is to use automation to enhance early diagnosis and preventive medicine, caring not only for the individual patient but for society at large.4 We are preconditioned, having enjoyed many years of laboratory automation, not to question its value. But a word of caution is advisable. As Rosenfeld noted, “In the closing decades of the 20th century, automated devices produced an overabundance, and an overuse and misuse of testing to the detriment of careful history taking and bedside examination of the patient. This is attributable in part to a fascination with machine-produced data.”3 Undoubtedly this is a reference to the widespread use of screening by testing for a large number of analytes, not because the data are necessary for diagnosis/treatment but because the capability is readily available, and it may be easier to order an extensive panel of tests than to request only the specific analytes pertinent to a given patient’s case.
Early Automated Analysis

In 1956, Leonard Skeggs developed the first practical and completely automated system for measuring urea, glucose, and calcium, the AutoAnalyzer, an instrument designed to meet the specific needs of the clinical chemistry laboratory.4,7,8 It performed blood analysis from start to finish without manual intervention by a technologist. Skeggs developed several modules to automate the various steps, and his models were manufactured by the Technicon Corporation, Tarrytown, NY, USA. In that year, Skeggs and the Technicon Corporation fielded commercially available Technicon single- and multichannel continuous flow autoanalysers, the initial AutoAnalyzer model selling for $3,500.9 It consisted of a dialysis membrane for protein-free filtrate, Tygon tubing, air bubbles to separate patient samples and ‘scrub’ tubing, mixing coils, a flow-through cuvette with photometric monitoring, and a strip-chart recorder. Later designs offered simultaneous analysis of over 20 analytes at 150 samples per hour. The multichannel systems were non-selective batch analysers, that is, they performed an array of tests on every sample whether all tests were requested or not. Continuous flow analysis changed the character of clinical chemistry testing so that only minutes instead of hours (or even days) were needed to complete an analysis, and personnel were free to develop tests for subspecialties such as toxicology, endocrinology and molecular diagnostics.2 The autoanalyser reached its high-water mark with the SMAC (Sequential Multiple Analyzer with Computer) in 1974, which had a built-in computer. The AutoAnalyzer led to the widespread use of batch analysers, many of which only measured one analyte but could test up to 100 samples in continuous mode. In the early 1980s, the photodiode array for spectrophotometers with grating monochromators allowed a sample to be tested simultaneously for multiple analytes, each test detecting an analytical signal at a different wavelength.4 The advantages of the AutoAnalyzer were less labour, fast analysis, and the use of screening panels. A disadvantage was that it fed the ‘defensive medicine’ response to the threat of malpractice suits, unnecessarily increasing the laboratory workload.10 Not long after the introduction of the AutoAnalyzer and the ascendancy of continuous flow analysis in clinical chemistry, a different approach appeared in 1959 with the production of the Robot Chemist by the Research Specialties Company.2,7 The Robot Chemist used discrete analysis with conventional cuvettes and automatic pipetting and mixing, essentially automating all of the manual steps performed by medical technologists.
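The workload difference between non-selective batch analysis and selective discrete analysis can be sketched with a few lines of arithmetic. All numbers below (panel size, per-sample order counts) are purely hypothetical, chosen only to illustrate how many unrequested results a non-selective multichannel system generates:

```python
# Non-selective batch analysis (e.g. multichannel continuous-flow systems)
# runs every channel on every sample; selective discrete analysis runs only
# the tests actually ordered. Hypothetical, illustrative numbers only.

PANEL_SIZE = 20          # channels on a hypothetical multichannel analyser

# Hypothetical orders: number of tests actually requested per sample
orders = [3, 5, 20, 2, 8, 1, 12, 4]

batch_results = PANEL_SIZE * len(orders)   # non-selective: full panel on every sample
discrete_results = sum(orders)             # selective: only what was ordered

print(f"batch (non-selective): {batch_results} results")
print(f"discrete (selective):  {discrete_results} results")
print(f"unrequested results in batch mode: {batch_results - discrete_results}")
```

With these illustrative orders, the batch mode produces roughly three times as many results as were actually requested, which is precisely the overabundance of data Rosenfeld cautioned about.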
The Robot Chemist was not ultimately successful because it was too mechanically complex to be practical. It went out of production in 1969, but ironically the Robot Chemist proved to be the automation direction manufacturers would take, and the discrete analysis model ultimately replaced the AutoAnalyzer and continuous flow analysis. Another approach to automation was introduced with centrifugal analysis in 1968, developed by Norman Anderson.2 Centrifugal analysers are single-test-at-a-time batch analysers in which analysis is sequential, discrete, and parallel.11 Centrifugal analysis uses the motion of a spinning rotor for mixing, thermal equilibration, transport, and measurement. Sample and reagents are pipetted into rotor compartments and mixed when the rotor spins, flowing over the walls of the compartments into a reaction chamber. Several manufacturers introduced centrifugal analysers and these models proved successful for about 20 years in clinical laboratories. Just as important as the development of automated analytical systems was the introduction in the 1950s, by the Sigma Chemical Company in St. Louis, of clinical chemistry ‘kit’ methodology: prepackaged, ready-to-use assay reagents with instructions for use.7 This was a very significant innovation in the field. Automated analysers were a tremendous boon to clinical chemistry, but it must be recognised that automation would have been only partially successful if reagent kits had not been readily available to ‘feed’ them. Clinical chemists ‘of a certain age’ can remember when reagents were prepared manually in clinical laboratories, a practice that is now essentially defunct. The era of automation blossomed after the introduction of the AutoAnalyzer. Doumas observed that the 1976 edition of Tietz (Fundamentals of Clinical Chemistry) included a chapter on automation covering 13 automated analytical systems, one of them the DuPont ACA, a unique and revolutionary analyser that enjoyed a long life.1 The next edition of Tietz in 1986 (titled “Textbook of Clinical Chemistry”) had a chapter on instrumentation that included HPLC, mass spectrometry, and guidelines for the selection of instrumentation. The second edition of the Tietz textbook (1993) deleted from the automation chapter the Technicon SMA and another one-time standard system, the Beckman Astra, but included point-of-care testing (POCT) and specialised immunoassay analysers, reflecting the then-current situation.
Spectrophotometry, previously known as colorimetry, was now being used in a wide variety of photometric techniques: fluorescence, fluorescence polarisation, nephelometry, chemiluminescence, and electrochemiluminescence.2 Discrete analysers, otherwise known as random access analysers, are selective and perform only those assays ordered on each sample, and these systems came to the fore. By the 1990s, batch analysers, such as autoanalysers and centrifugal analysers that performed the same assay simultaneously on all samples, were on the wane. Batch analysis still has a role to play, but discrete analysers can be used in this mode in addition to performing any number of tests in the random access mode. It is instructive to review some terms to appreciate the various approaches to automation. Batch analysis means that multiple samples are tested in a ‘run’. In contrast, sequential analysis means samples are tested one after the other and results are reported in the same order.11 Continuous-flow analysis is a form of sequential analysis through a continuous stream at a constant rate, e.g. the AutoAnalyzer. With discrete analysis, each sample is tested in a separate cuvette or other reaction chamber, with reagents added to each individual sample container. Single-channel analysis uses an analytical ‘line’ or channel dedicated to a single test. Multiple-channel parallel analysis uses two or more ‘lines’ or channels, each dedicated to a single test, with analysis occurring simultaneously. With random access analysis, specimens are tested in or out of sequence with each other, as reaction vessels become available and without regard to accessioning order, although testing of designated specimens, such as stats, may be given priority. Assays are either end-point tests (the reaction is complete after a fixed time) or continuous-monitoring tests (multiple data points recorded over a specified time interval).11 The specimen throughput rate depends on the ‘cycle time’ of the analyser for a fixed sequence of events and the optical measurement cycle.11 Regardless of the type of analyser, laboratories must consider workload, manpower needs and costs, preventive maintenance, downtime, reagent and disposables costs, service costs, and the instrument cost when choosing a system. The AutoAnalyzer continuous flow, batch analysis paradigm was displaced by discrete systems using positive-displacement pipettes, of variable or fixed volume, for sample processing and reagent dispensing, with some kind of washing step between sample dispensings to avoid carryover contamination. Mixing is performed by forceful dispensing, magnetic stirring, or mechanical stirring (rods or piezo-electric mixers). Temperature is controlled by water baths, heating blocks, or heated air compartments (air baths). Cuvettes are glass or plastic; glass cuvettes are intended to be permanent, while plastic cuvettes are disposable after a single use or, with extended use, replaced after a set number of tests.11 Most discrete analysers use individual cuvettes, although some may use a type of flow cell. Various types of detectors are used, with a variety of lamp types (tungsten, quartz halogen, mercury, xenon, and lasers).
The monochromators use interference filters, prisms, or diffraction gratings.11 The analytical signal is typically detected using photodiode arrays, allowing a wide variety of wavelengths to be monitored. The majority of analysers use liquid reagents, either received as liquid, ready to use, or reconstituted by the laboratory after receipt. However, there have been some very successful analysers that employ ‘dry’ or ‘damp’ reagents. Some of these systems used a tablet form of reagent, for example, the DuPont ACA and the Paramax. Reagents were reconstituted with diluent and mixed with the specimen during analysis. This was a ‘unit dose’ concept that offered advantages such as avoiding carryover. The disadvantage was that the dry reagents were more expensive than liquid reagents, although they were more stable and minimised reagent wastage.11 The most successful analysers of this type are those from Ortho (originally Kodak). Ortho has offered generations of analysers that use a unique slide technology. The reagents are dispersed in emulsions on the slides and the reaction is activated as the sample diffuses through the layers on the slide. The spectrophotometric reading is taken using reflectance technology. The slides for the electrolytes contain miniature ion-selective electrodes. The slides are conveniently stored in refrigerators or freezers and are recognised for their convenience. However, it is difficult to develop slides for the wide variety of analytes and specimen matrices now tested, and slide systems have of necessity been supplemented by adding conventional open liquid channel options.11

Automated Stand-alone and Integrated Analysers

A central laboratory may contain a variety of analysers using a mixture of analytical techniques, including electrophoresis, LC-MS/MS, colorimetry/spectrophotometry, potentiometry, and immunoassay (using various methodologies).8 Some traditional methodologies, such as radioimmunoassay, atomic absorption, and flame photometry, may still be retained for reference purposes but are now essentially historical techniques. Typical modern clinical chemistry analysers are automated discrete systems, as opposed to batch analysis instrumentation, which allows for an almost unlimited mix of analyses on a single instrument, combining routine clinical chemistry and immunoassay tests on fewer analysers, requiring less floor space and greatly improving operational efficiency.8 Manual procedures, and even semiautomated procedures, are now relatively rare. Benefits of replacing manual procedures with automation include: replacing potentially dangerous, error-prone manual procedures with automated processes requiring minimal technician involvement; increasing productivity; decreasing turnaround time (TAT); improving safety; minimising error; improving sample handling; and allowing practical reallocation of laboratory staff who are freed from manual tasks.
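As noted above, specimen throughput ultimately depends on the analyser’s cycle time. A minimal sketch of that arithmetic, using purely hypothetical cycle times (real instruments fall short of this theoretical ceiling once stats, reruns, dilutions, and calibrations intervene):

```python
# Rough cycle-time arithmetic for a discrete analyser (hypothetical values).
# If the instrument completes one pipetting/measurement cycle every
# cycle_time_s seconds, its theoretical ceiling is 3600 / cycle_time_s
# tests per hour.

def theoretical_throughput(cycle_time_s: float) -> float:
    """Tests per hour if one test starts on every cycle."""
    return 3600.0 / cycle_time_s

for cycle in (4.5, 9.0, 18.0):   # hypothetical cycle times in seconds
    print(f"{cycle:>5.1f} s cycle -> {theoretical_throughput(cycle):.0f} tests/hour")
```

Vendors quote throughput under defined conditions for exactly this reason: the ceiling is fixed by the cycle, but the realised rate depends on the assay mix.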
As emphasised by Melanson et al., “Selecting clinical chemistry laboratory automation is a complex, time-consuming process” and “Automation is a customised process that may range from automating only a few steps of the analytical process to total laboratory automation, depending on the needs and resources of each laboratory”.12 Fully automated low, medium, and high volume analysers are available as independent units, and it is with these systems that a clinical laboratory typically begins the automation process. These analysers are designed for a wide range of sample workloads, ranging from small and modest-sized benchtop units to large and very large floor models. They routinely employ spectrophotometry for a wide variety of colorimetric and/or immunoassay tests. They often include an ion-selective electrode (ISE) module for electrolyte analysis (Na, K, Cl). They can offer a large menu of assays or, if used for high volume work in reference laboratories, may be dedicated to a relatively small number of assays, such as test profiles. Typically, the manufacturer of the analyser also provides the reagents for it. If only the proprietary reagents from the instrument manufacturer are suitable for use on the analyser, it is designated a ‘closed system’. If reagents from many vendors may be adapted to an analyser, it is designated an ‘open system’. Assay applications for closed systems are optimised and verified by the manufacturer. Laboratories need to perform method validation for every assay they adapt to an open system. It is usually considered desirable for an analyser to have open system capability. During the development and proliferation of fully automated, stand-alone analysers, a dichotomy emerged: mainstream clinical chemistry systems using colorimetry/spectrophotometry, and immunoassay systems using a variety of other analytical techniques. Radioimmunoassay (RIA) was developed by Yalow and Berson in 1959 and resulted in the award of the Nobel Prize in Medicine in 1977. They did not patent RIA, making it easier for others to further develop the technique. Monoclonal antibodies, developed by Milstein and Köhler (Nobel Prize in 1984), improved the specificity of antibodies.
Fusion of ‘immortal’ mouse myeloma cell lines with antibody-producing B lymphocytes resulted, after screening, in cell lines producing specific monoclonal antibodies indefinitely.13 Variations of immunoassay appeared, such as IRMA (immunoradiometric assay, with lower limits of detection than RIA and analyte concentrations directly proportional to increasing signal) and EIA (enzyme immunoassay), including EMIT (enzyme multiplied immunoassay technique, patented by the Syva Company), ELISA (enzyme-linked immunosorbent assay), cloned enzyme donor immunoassay (CEDIA), microparticle enzyme immunoassay (MEIA), and fluorescence polarisation immunoassay (FPIA).14 Enzyme tag immunoassays are still used effectively on general purpose clinical chemistry analysers, as with EMIT and CEDIA assays. Fluorescent and chemiluminescent tag assays are also available, allowing quantification at lower analyte concentrations, but they require specialised detection systems and typically separate analysers. RIA was automated by Becton-Dickinson with the ARIA and ARIA-II instruments, but these analysers were short-lived. EMIT, using non-isotopic labels, allowed for practical automation of immunoassay. Homogeneous assays such as EMIT do not separate bound from unbound constituents, but heterogeneous (sandwich) assays use a separation step to improve selectivity and sensitivity by separating the antibody-bound analyte from the other constituents of the assay mixture. Sandwich immunoassays use a capture antibody first and a second, labelled antibody to generate the analytical signal. Automated systems now use variations of both techniques.13 Fluorescent immunoassay (FIA) is perhaps best recognised in the automated fluorescence polarisation immunoassay (FPIA) methodology from Abbott, long used on the TDx analyser. Automated immunoassay is also available on the Elecsys, Modular, Hitachi, and Cobas analysers (Roche), the Immulite, Vista, Dimension, and Advia analysers (Siemens), the Access (Beckman-Coulter), the AIA (Tosoh), the AU analysers (Olympus), and the Architect systems (Abbott). Some analysers may even employ more than one immunoassay methodology, such as the Siemens Vista. Immunoassay moved away from dedicated special chemistry laboratories to the general core laboratory as fully automated systems were made available that allow both homogeneous and heterogeneous immunoassays to be performed on general purpose chemistry analysers.13 However, some immunoassays use methodologies, such as nephelometry and immunoturbidimetry, that stand in contrast to routine techniques.14 A variety of stand-alone immunoassay analysers were introduced and operated side by side with clinical chemistry instruments, nicely complementing each other, but at the cost of maintaining multiple independent systems. Examples included the Abbott TDx, a successful automated batch FPIA analyser, later replaced by the IMx, which could perform multiple immunoassay tests in a single analytical run; both are now retired. Batch immunoassay was replaced by random access immunoassay, such as the Ciba Corning ACS:180 and the Abbott AxSYM, and now systems such as the Abbott Architect and the Siemens Vista. Multiplex, or multivariate, analysis through various techniques allows two or more analytes to be measured simultaneously.
Multiplexing offers an obvious advantage in speed of analysis; however, a potential disadvantage is that reimbursement for clinical laboratory testing may not allow for payment of analyte test results that are not specifically ordered.13 Inevitably, clinical chemistry and immunoassay analysers were merged as ‘integrated’ systems, combining immunoassay with spectrophotometric and potentiometric assays.15 Clinical chemistry analysers already offered immunoassays such as EMIT tests, reading the analytical signal using a spectrophotometer. Adding an immunoassay unit with a different methodology dramatically increased the spectrum of assays that could be offered. Of course, a laboratory could maintain separate clinical chemistry and immunoassay analysers if it desired, instead of using integrated systems. Large integrated systems provide advantages over multiple smaller systems but, if they are disabled for any reason, a laboratory may lose the ability to perform testing unless it has back-up systems (although some can continue to operate either the clinical chemistry or the immunoassay module if the other is inoperable). Thus laboratories must carefully consider their options if they rely almost totally on a single integrated system. Multiple smaller systems provide for redundancy, but require more maintenance and cross-analyser method comparisons to establish and maintain equivalency of test results. The ability to load new reagents during analyser operation (‘on-the-fly’ loading), instead of pausing the analyser to replenish reagents, is an advantage. Open channels for assays not available from the instrument manufacturer are also desirable.12 Sample throughput and turnaround time (TAT) of automated systems vary with the size and capability of the analysers and the complexity of the assay mix and test volume. Marketed throughput and TAT values depend on the actual test orders and sample volume and represent only a ‘ballpark estimate’ under manufacturer-defined conditions. Manufacturer-suggested throughput/TAT figures are not disingenuous numbers touted by vendors to lure customers, but usually good-faith, rough estimates under likely real-world testing scenarios. The oft-reported caveat for automobiles, “Your actual mileage may vary,” paraphrased for automated analysers becomes “Your actual throughput/TAT may vary.” Integrated systems offer the advantage of consolidation. It is preferable to place as many assays as possible on a single analyser rather than to maintain two or more analysers, because each instrument will require separate QC, preventive maintenance, record keeping, etc. For a laboratory large enough to need multiple analysers, or for an integrated delivery system (a medical care network) consisting of two or more laboratories, it is highly desirable to use a family of analysers. This might consist of duplicate instruments or a mixture of analysers from the same manufacturer designed for different sized workloads.
Analysers belonging to the same family use common calibrators, controls, reagents, cuvettes or reaction vessels, disposables, methodology, and software, which is a distinct advantage. Most important, the commonality of analysers belonging to the same family should guarantee equivalent test results, so that it makes no difference which instrument is used to test a patient sample. The reference intervals and medical decision levels are equally suitable for all analysers. Automated analysers may be supplemented with stand-alone pre-analytical and/or post-analytical systems, which can be described as partial or task-targeted automation.

Laboratory Automation and Total Laboratory Automation

There are two approaches to automation: stand-alone or task-targeted automation, and total laboratory automation. TLA is distinguished by the presence of some sort of track system that connects the various components. Stand-alone automation provides most of the major benefits of TLA without the cost of a track. With stand-alone automation, sample transport is still performed manually by the laboratory staff, sometimes called the ‘sneaker network’. The stand-alone options offer a smaller footprint (some components are even benchtop modules) and lower cost, and thus may be good choices for laboratories with limited floor space. Beginning in the 1970s, the clinical laboratory saw the introduction of robotics and informatics, allowing for great leaps forward in automation and leading to the development of TLA.8 TLA is generally defined as laboratory automation that includes pre-analytical, analytical, and post-analytical operations.16 Automated systems that lack one of these components are subtotal. In the early 1980s, Dr. Masahide Sasaki at the Kochi Medical School, Kochi, Japan, developed conveyor belt systems, robots to load and unload analysers, and process control software, and is credited with the first attempt at TLA.10 By the 1990s a few companies began offering automation, but these systems were expensive, had limited functionality, and generally fell somewhat short of TLA. Some clinical chemists fielded home-made, in-house TLA systems, but these were limited to the facility in which they were created, and their sustainability was questionable without the key personnel who developed and supported them. TLA combines a wide variety of processes, including accessioning and sorting specimens, decapping tubes, centrifugation, aliquoting, delivery to analysers, recapping tubes, and storage and archiving of samples.8 TLA’s advantages include the standardisation of testing to improve patient care, eliminating the ever-present potential for human error in the handling and testing of samples. TLA also reduces the number of sample handling steps, decreasing the likelihood of handling errors and improving patient safety.
In addition, TLA increases productivity, decreases and standardises turnaround time (TAT), improves safety, and allows manpower to be reallocated to those tasks that cannot be automated. Laboratories adopt TLA to handle ever-increasing workloads and demands for quicker TATs and standardisation of laboratory operations. Highly efficient TLA can even eliminate the need for separate STAT laboratories, depending in part on the ability to get STAT samples to the central laboratory. TLA is ideally suited to core laboratories that conduct a wide spectrum of testing using highly automated systems coupled with sophisticated laboratory information systems (LIS). The challenge for laboratory directors is to balance cost with the goals of analytical quality, patient safety, and clinical service needs. TLA offers a potential solution and has been adopted in various forms by clinical laboratories world-wide.12 Given
that all modern analysers are fully automated, ‘automation’ here refers to the pre-analytical and post-analytical manual procedures of testing. The important considerations to be examined when evaluating TLA include: methods (provide for the existing assays and also allow for open channel applications); TAT (aim to achieve the desired TAT about 90% of the time); specimen handling (allow for stats, decrease manual procedures, check for haemolysis/icterus/lipaemia (HIL)); ability to locate, retrieve, and test a sample for dilution, repeat, or add-on testing; throughput (accommodate fluctuating loads through the day); cost (be affordable and even decrease operating costs over time); ability to operate with a laboratory information system (communicate with the LIS and provide middleware); environment and safety (decrease health hazards); and downtime and service (require minimal preventive maintenance and downtime, with service readily available).12 The first step towards TLA is to conduct a thorough, detailed analysis of the current laboratory processes, i.e. a workflow analysis. It will demonstrate the strengths and weaknesses of the current system, whether it be manual, semi-automated, or automated, and identify the steps that can be eliminated or improved by TLA. The old saw is true: applying automation to the current, suboptimal process only serves to automate a poor process.12 Moving from multiple automated systems to TLA is a big, complicated, and expensive step for a laboratory. Hawker very appropriately notes that failure to properly analyse the needs of the laboratory and to understand its current state and processes is the primary reason why automation projects are not successful, or at least do not live up to the initial expectations of the adopters.16 The workflow analysis completely maps the current processes, from specimen receipt through testing, reporting, and storage/disposal of specimens.
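One concrete output of such a workflow analysis is a TAT metric against the "about 90% of the time" target mentioned above. The sketch below, with invented timestamps and an assumed 60-minute target, shows how the fraction of samples meeting the target and a simple nearest-rank 90th-percentile TAT could be computed.

```python
# Sketch of one workflow-analysis metric: the fraction of samples meeting
# a target turnaround time (TAT) and the 90th-percentile TAT.
# The timestamps and the 60 min target are illustrative assumptions.
from datetime import datetime

def tat_minutes(received, reported):
    """TAT in minutes from received/reported clock times (same day)."""
    fmt = "%H:%M"
    return (datetime.strptime(reported, fmt) - datetime.strptime(received, fmt)).seconds / 60

samples = [("08:00", "08:40"), ("08:05", "09:10"), ("08:10", "08:55"),
           ("08:20", "09:00"), ("08:30", "09:05")]
tats = sorted(tat_minutes(r, p) for r, p in samples)

target = 60  # desired TAT in minutes (assumed)
fraction_met = sum(t <= target for t in tats) / len(tats)
p90 = tats[int(0.9 * (len(tats) - 1))]  # crude nearest-rank estimate
print(f"{fraction_met:.0%} met the {target} min target; 90th percentile = {p90:.0f} min")
```

A real analysis would of course use far more samples and stratify by priority (stat vs routine), but the same comparison of pre- and post-automation distributions applies.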
Hawker lists ten reasons why automation can fail to deliver on its promise: incomplete understanding of the current non-automated process; lack of flexibility due to fixed processes and limited throughput; unrealistic expectations; poorly executed workarounds to interface automated and manual processes; unclear expectations of system functionality; unnecessarily complicated designs; inadequate technical support; failure to conduct realistic impact analysis; hidden costs (labour, supplies, maintenance); and failure to optimise the current processes prior to automation (never automate a poor process). As a rule-of-thumb, a good TLA system should handle at least 80% of the workload. TLA involves some kind of track system that connects the pre-analytical, analytical, and post-analytical components. Melanson et al. have carefully described the key criteria: track design and analyser connection; unambiguous sample identification; aliquoting of primary sample tubes; ability

to process a wide variety of sample tubes; centrifugation capability as part of the track; stat handling capability; repeat testing and dilution functions; refrigerated storage and retrieval unit; decapping/re-capping of primary tubes; TAT; throughput; service availability; assay quality; potential for sample carryover; clot detection; detection of sample integrity or HIL; ability to handle small sample volumes (e.g. paediatrics);18 open system option;19 hands-on technologist time (e.g. preventive maintenance, reagent preparation, QC and calibration); environment, space, heat, noise, and water consumption considerations; and informatics (e.g. interface with the existing LIS/HIS, middleware to bridge the analysers and LIS).12 A track system is integral to TLA. ‘Laboratory streets’ or ‘laboratory assembly lines’ have sometimes been used as alternative descriptions for the track.4 The track may either be a dual lane circular, or ‘loop’, layout or a linear, or ‘unidirectional’, design.8 Circular tracks allow samples to return to various analysers attached to the track but can require more floor space. Specimen carriers should be standardised, and racks, typically with capacity for five samples, are routine. The sample handling module must provide error-free specimen identification using barcodes or RFID labels. Software must read the sample ID and obtain the associated patient information, including the tests requested, from the LIS.8 Such process control software is often called the laboratory automation system (LAS). The LAS should determine the number of aliquots and the sample volume for each specimen, route the samples to the analysers, recap the samples and store them for future testing or until disposal, and also control sample level detection and HIL/integrity evaluation. The LAS or middleware includes autoverification and autoretrieval capability for repeat, reflex, or dilution testing.
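The LAS duties just described (read the sample ID, pull the orders from the LIS, decide aliquots and routing) can be sketched as a small dispatch function. Everything here is hypothetical: the test-to-module mapping, the mocked LIS lookup, and the one-aliquot-per-destination rule are invented for illustration, not taken from any real LAS.

```python
# Minimal sketch of LAS process control: read a barcode, look up orders
# from a (mocked) LIS, and decide aliquots and the analyser route.
# All names, mappings, and rules are hypothetical.

# Which analyser family performs each ordered test (illustrative mapping).
TEST_TO_MODULE = {"Na": "chemistry", "K": "chemistry", "TSH": "immunoassay"}

MOCK_LIS = {  # stand-in for a real LIS query keyed by sample barcode
    "S123": {"patient": "P42", "tests": ["Na", "K", "TSH"]},
}

def plan_sample(barcode):
    orders = MOCK_LIS[barcode]                       # LIS lookup by sample ID
    modules = sorted({TEST_TO_MODULE[t] for t in orders["tests"]})
    return {
        "barcode": barcode,
        "route": modules,      # analysers the sample (or aliquots) must visit
        "aliquots": len(modules),  # assumed rule: one aliquot per destination
    }

plan = plan_sample("S123")
print(plan["route"], plan["aliquots"])  # ['chemistry', 'immunoassay'] 2
```

A production LAS layers onto this skeleton the sample-volume check, HIL evaluation, and storage/retrieval logic described in the text.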
The racks and the track must be able to handle a wide variety of primary collection tube sizes and types, as well as sample cups placed in tube adapters. The major instrument manufacturers provide track systems, and tracks are also available from independent automation vendors. Laboratories must confirm that a track is compatible with analysers other than those from the track manufacturer. The theoretical throughput claimed by a manufacturer may not match reality. This is not necessarily a false claim by the manufacturer, as throughput and TAT will logically be affected by the actual sample volume and the exact number and type of tests requested for each sample. Very efficient integrated analysers combining clinical chemistry and immunoassay systems are attached separately to tracks, and some advantages of the stand-alone integrated clinical chemistry/IA systems may be lost when this happens.12 Although not considered here, haematology, coagulation, urinalysis, and microbiology analysers may also be part of a track system.

Expert rules-based software allows for automatic release of test results without issues and flags results that require human attention and review.17 The goal is hands-off, walk-away operation, but technologists are still required for analyser maintenance and operation, replenishing reagents and disposables, responding to analyser messages, and loading specimens onto the initial sample processing module. Laboratories must be realistic and recognise that TLA is not a panacea. They should use only as much automation as is necessary and appropriate. Stand-alone sample processing units seem to make sense for laboratories with workloads of 500–2,000 samples/day. Laboratories with larger workloads can expand automation to include a track to interface with analysers and provide specimen storage/retrieval capability. A laboratory can mix and match modules to achieve the desired configuration. If TLA appears to be the answer, it is important to know if a system is ‘closed’, that is, only compatible with analysers from the same vendor as the processing units, or ‘open’, meaning analysers from other manufacturers can be attached to it. Does TLA deliver on its promise or is it overkill? That is a question each laboratory must answer, and the conclusion should be based on hard data. Sarkozi et al. presented a compelling argument in favour of TLA at their facility.10 They calculated that, expressed in constant 1965 dollars, the total cost per test decreased from $0.79 to $0.15 between 1965 and 2000, and that TAT decreased to the point that stat samples requiring a TAT of <1 hour no longer needed to be prioritised and tested separately. The introduction of a robotics system for perianalytical automation brought large improvements in productivity together with decreased operational costs, even though the workload increased significantly and the number of personnel decreased. In fact, despite a dramatic increase in productivity, the staff was reduced by 24% over time.
In their analysis, productivity increased by 58.2% as measured by tests per employee, and by 82% as measured by specimens per employee. The dramatic reduction in total cost per test was due almost entirely to the reduction in labour from increased productivity, and spare capacity in the system allowed for a significant increase in volume with minimal or no increase in personnel. This is the kind of hard data required to objectively evaluate the return on investment of TLA.

LC-MS/MS

A review of current developments in laboratory automation would not be complete without inclusion of liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS), or LC tandem mass spectrometry.18 Related chromatographic methods, such as gas chromatography (GC), high performance liquid chromatography (HPLC), and gas chromatography-mass spectrometry (GC-MS), will not be discussed. These methodologies are widely used, but often in more specialised laboratories, and LC-MS/MS is the analytical technique that has gained the most attention in recent years. As explained by Greaves, “The relatively new implementation of liquid chromatography coupled with tandem mass spectrometry (LC-MSMS) techniques as routine assays in diagnostic laboratories provides the unique opportunity to harmonise, and in many cases standardise, methods from an early stage”.19 LC has been used in clinical laboratories in the form of HPLC since about the 1980s and was then coupled with MS.20 LC separates analytes, and MS breaks them down, identifies the resulting ion fragments, and quantitates them. Currently, LC-MS/MS represents the analytical state of the art because the dual MS technique provides better selectivity and resolution of analytes than a single MS unit. LC-MS/MS is now routine in many large, sophisticated laboratories, providing significant analytical improvements and potential cost savings.20 It offers advantages specifically for drugs of abuse testing, therapeutic drug monitoring (TDM), and endocrinology. Immunoassays made it possible to measure dozens of individual proteins and other analytes, but the results are sometimes misleading due to a lack of concordance across multiple methods/platforms, and due to autoantibodies, anti-reagent antibodies (heterophile antibodies), anti-animal antibodies (human anti-mouse antibodies, HAMAs), and the high-dose hook effect. In addition, IA suffers from the high cost of antibodies, lot-to-lot differences, calibration bias, and cross-reactivity.21 These issues can be avoided with LC-MS/MS.22 The strengths of LC-MS/MS include the wide range of analytes that can be measured, high sensitivity (exceptional limit of detection), specificity, precision, accuracy, and the capability of multiplexed analysis.21 Automated IA using chemiluminescence provides low LoDs (e.g. 10–100 pg/mL), but LC-MS/MS tests can detect substances down to about 1 pg/mL and are suitable for drugs and hormones such as testosterone and estradiol.5 IA antibodies can measure multiple analytes, such as the members of a large drug family, but may also bind to many unwanted, interfering substances and lack the specificity of LC-MS/MS. Of course LC-MS/MS has its drawbacks. The initial cost of equipment is high, although the cost of reagents for extraction and analysis is typically lower than for IA.5 The lack of universal, routine, standardised LC-MS/MS methods means individual laboratories must develop their own, which is difficult and time-consuming. Ready-to-use methods from vendors are not yet widely available. LC-MS/MS suffers from interference caused by ‘ion suppression’, by which interfering compounds in the sample matrix decrease
the signal of the target analyte(s). IA offers 24-hour testing capability, while LC-MS/MS is not always available round the clock due to its complexity. Tandem MS cannot distinguish compounds of the same mass and may require adjustment of the chromatographic settings to separate such compounds before they enter the mass spectrometer. The typical clinical laboratory workforce is not trained to operate sophisticated mass spectrometry systems, which generally require highly trained operators, although newer systems may allow general medical technologists to run them. Other potential pitfalls include the complexity of serum and plasma samples, post-translational modifications of analytes, molecular polymorphisms, and the multiple isoforms of many proteins.22 In short, detecting a portion of a protein in a sample does not mean that the intact protein is measured. Another limitation of LC-MS/MS is that parent compounds can be fragmented by MS in such a way as to produce different fragments that have the same molecular weight, thus producing inaccurate results for the intended analyte.20 Other disadvantages include the lack of standardisation and the lack of assay ‘kits’ similar to those for routine automated procedures.5 LC-MS/MS procedures are typically ‘laboratory developed tests’ (LDTs).5 They are not approved by regulatory agencies such as the FDA but are validated by individual laboratories for clinical use. Validation requires determination of analytical sensitivity and specificity and of clinical performance, parameters not required of regulatory-approved assays. MS procedures are intended for use only by the laboratory that developed them and are not to be used to test samples from other clinical laboratories.
MS assays are unique to each facility’s needs and lack lab-to-lab standardisation/harmonisation.20 LC-MS/MS faces the same challenges as routine clinical chemistry tests and immunoassays, namely metrological traceability to reference materials and/or methods that allow for equivalent patient test results regardless of the laboratory that performs the analysis. It is now recognised that comparability of patient test results is a key desideratum. Some LC-MS/MS assays demonstrate a large degree of variability, as observed for LC-MS/MS methods for a major immunosuppressant drug such as tacrolimus, in contrast to the relative consistency in accuracy and precision observed for standard automated immunoassays.23 Although progress may occur rapidly, at present much work remains to make LC-MS/MS a routine methodology in clinical laboratories.21 Industry support is still in its infancy, and IVD companies must invest in LC-MS/MS to produce analysers fit for routine use in the clinical laboratory. LC-MS/MS currently appears to be a complementary methodology, although there are many fervent supporters who may disagree.
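One reason comparability depends so heavily on each laboratory's method is that LC-MS/MS results are commonly quantitated against a locally prepared calibration curve, typically using the ratio of the analyte's peak area to that of a stable-isotope-labelled internal standard. The sketch below illustrates the response-ratio idea with invented peak areas and calibrator concentrations; it is not any laboratory's actual procedure.

```python
# Illustrative LC-MS/MS quantitation via internal-standard response ratios:
# concentration is read off a linear fit of (analyte area / IS area) versus
# calibrator concentration. All peak areas and concentrations are invented.

calibrators = [(1.0, 0.10), (5.0, 0.50), (10.0, 1.00)]  # (conc ng/mL, area ratio)

# Least-squares slope/intercept for the calibration line: ratio = m*conc + b.
n = len(calibrators)
sx = sum(c for c, _ in calibrators); sy = sum(r for _, r in calibrators)
sxx = sum(c * c for c, _ in calibrators); sxy = sum(c * r for c, r in calibrators)
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - m * sx) / n

def quantitate(analyte_area, is_area):
    """Convert a measured peak-area ratio to concentration via the line."""
    ratio = analyte_area / is_area
    return (ratio - b) / m  # invert the calibration line

conc = quantitate(analyte_area=42000, is_area=60000)
print(f"{conc:.2f} ng/mL")  # 7.00 ng/mL
```

Because the calibrators, the internal standard, and even the fit model differ between laboratories, two technically sound LDTs can disagree on the same sample, which is exactly the traceability problem described above.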

Partnering immunoassays with LC-MS/MS has proven a happy marriage in drugs of abuse testing.20 Screening immunoassays reveal specimens that contain a wide variety of drugs, producing putative positives, and confirmation and quantitation of the screen-positive samples is performed using LC-MS/MS. In specialised TDM for immunosuppressant drugs, LC-MS/MS offers the ability to reliably quantitate very low drug concentrations. LC-MS/MS is also used for analysis of steroid and thyroid hormones, and of vitamins such as vitamin D, a common application.20 The method also has the unique ability to perform ‘multiplex testing’, that is, it can measure a variety of different analytes simultaneously. LC-MS/MS’s exquisite specificity allows it to be used as the accepted reference method (the ‘gold standard’) for many analytes, but because such a variety of LC-MS/MS assays are available for the same analyte, there can be multiple putative reference methods. It is currently a challenge to successfully integrate MS with TLA. However, as noted at the beginning of this paper, there was a time when completely automated analysis of patient specimens was only a speculative dream. That is now a reality, and there is reason to believe that LC-MS/MS units can be attached to track systems and/or other automated modules.

Informatics

Hawker emphasises that information technology is a critical element of any automation solution.16 Solid metrics should be used to assess pre- and post-automation performance to objectively prove automation effectiveness. Such metrics include: stat and routine TAT; the number of process failures (lost, mishandled, mislabelled, or mis-delivered specimens; aliquoting and pour-off errors); client complaints per billed unit; and productivity parameters (billed units reported per employee). Process control software should read the specimen ID and access from the LIS the specimen type and orders for each sample.16 It then determines the necessary processes for each sample, e.g.
centrifugation, decapping, aliquoting, and the exact analytical route the sample will take through the automated system. Additionally, it should monitor analysers for in-control production status and make decisions to accept or reject test results using rules-based algorithms. This includes evaluation of sample integrity (HIL) and autoverification. Process control software should also determine the need for repeats and dilutions, apply QC rules, and provide custom comments.12 All of these activities and processes fall under the heading of informatics. Over the last two decades, laboratory process management software (middleware) has evolved significantly. Initially middleware was a simple interface engine, but today it is a process control and management tool. It allows the laboratorian to be the expert orchestrator of the process,
focused on supervising and managing the execution of laboratory operations and resolving exceptions beyond the capabilities of the automated system. Middleware enables the laboratorian by integrating the available relevant data and tools to access, visualise, and act on information, and by automating data handling tasks. This allows efficiency gains by lowering unit (test result) costs and providing greater productivity. Quality improvements include less process variability, more attention to detail, and improved levels of service, with fewer human-induced delays for results and better TAT. Some specific advantages include: robust, high-bandwidth interfaces; comprehensive data schemas and flexible data integration, filtering, and visualisation; work order and test result review rules (autoverification); and Average-of-Normals tracking and QC data manager integration. Informatics allows robust, comprehensive interfaces that automatically and reliably transmit data to and from various systems. Interfaces with laboratory and hospital information systems go well beyond medical work orders and include full patient and ordering physician demographics, diagnosis codes, order priority, and more. Interfaces with instruments include graphs and images, and instrument exception and error reports. This allows middleware to speed the reliable transcription of all relevant information the laboratory scientist needs and to present it in one place, avoiding information loss, printouts and handwritten notes, manual transcription, delays and errors, and related rework. Flexible data integration, filtering, and visualisation provided by middleware help the laboratory scientist make sense of, and take action on, the analytical information.
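The autoverification rules mentioned above can be thought of as a list of predicates applied to each result: a result failing any rule is held for human review, otherwise it is auto-released to the LIS. The sketch below shows this shape; the rule names, limits, and result fields are illustrative assumptions, not any middleware product's actual rule set.

```python
# Sketch of a middleware autoverification rule engine: each rule is a
# named predicate; a result failing any rule is held for human review,
# otherwise it is auto-released. Rules and limits are illustrative only.

RULES = [
    ("within auto-release limits", lambda r: r["low"] <= r["value"] <= r["high"]),
    ("no HIL flag",                lambda r: not r.get("hil_flag", False)),
    ("QC in control",              lambda r: r.get("qc_ok", True)),
]

def autoverify(result):
    """Return ('release', []) or ('hold', [names of failed rules])."""
    failed = [name for name, rule in RULES if not rule(result)]
    return ("hold" if failed else "release", failed)

ok = {"test": "K", "value": 4.2, "low": 3.5, "high": 5.1}
bad = {"test": "K", "value": 7.9, "low": 3.5, "high": 5.1, "hil_flag": True}
print(autoverify(ok))   # ('release', [])
print(autoverify(bad))  # ('hold', ['within auto-release limits', 'no HIL flag'])
```

Real rule engines add actions beyond hold/release (reflex tests, dilutions, rerouting), but the pattern of declarative rules evaluated per result is the same.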
Middleware can collate and present in a single screen all the information related to a patient and the related work orders, physicians, and results, integrated with tools to obtain additional information, make annotations, order reprocessing or new tests, and much more. Filters, context-sensitive colouring, and adaptable screen/field layouts enable the laboratory scientist to focus on relevant information while minimising information overload. Interactive real-time dashboards with graphical information displays are available to enable process status awareness and management engagement. Middleware allows laboratorians to manage orders and results for any instrument and any workstation in the lab, not only those closest to the instrument. Most middleware can automate routine tasks normally performed by a laboratorian using rule engines that are flexible (capable of sophisticated logic and decisions); powerful (able to consult patient health status, history, and demographics and initiate a wide range of actions, such as computations, adding/cancelling tests, or routing specimens); and relatively easy to use (can be programmed using visual drag-and-drop tools). It is
also possible to automate work order optimisation protocols, e.g. hold a requested test if it is not appropriate given the current patient status, history, or demographics. This ease of use makes middleware accessible to a broad spectrum of laboratories. For example, the automation of routine protocols enables laboratories to handle escalating demand for services while meeting access, productivity, and cost targets. Furthermore, the automation of routine protocols allows laboratorians to focus their limited time on tasks requiring their special knowledge and experience, such as advising physicians and patients and resolving complex exceptions. Quality control is an area in which middleware information, integration, and process automation can yield important improvements in quality and productivity. The integration of QC data managers enables automated transcription of QC test results. Consequently, manual transcription (non-value-added work) and its associated delays and errors are avoided. Bi-directional integration with QC data managers enables real-time communication and integration of QC status. A laboratorian can see test results in a single screen along with relevant QC events, such as Westgard rule violations, in a way that improves the ability to take appropriate corrective action. Middleware also provides Average-of-Normals (AoN) modules that continuously evaluate the analytical performance of an instrument, providing near real-time detection of performance degradation. Through automated responses to QC events via a middleware rules engine (e.g. automatically holding results release and routing specimens for testing on a different analyser), laboratories can develop novel best practices in QC.

Conclusion

The history of automation in the clinical laboratory is long and varied.
The Latin aphorism at the beginning of this review, loosely translated as “Victory loves careful preparation,” is apt for laboratories picking and choosing among the various automation options available in the 21st century. For a modern laboratory, manual testing clearly belongs to the past century, except for a few very specialised tests. Even if a laboratory’s workload is low and TAT is not a concern, the inherent variability of manual procedures makes them non-viable in comparison with modern automated methods, and it is a given that laboratories will inevitably adopt automation. The dilemma for clinical chemists is to decide what kind and what extent of automation is suitable for a given facility. The advantages of automation are undeniable. The challenge is for laboratories to embrace the right kind of automation to best meet their specific patient testing needs. This starts with a careful analysis of a laboratory’s current process and optimising it before making any automation
decisions. It can be tempting to make the jump to some level of automation without first performing a process analysis, but it is a risky move. It is better to fully understand what kind of automation is needed and adopt the right kind of automation to meet the need. There are myriad options available, from stand-alone integrated systems, to pre- and post-analytical modules that can be mixed and matched with analytical units, to total laboratory automation. This is definitely a situation in which “one size does not fit all.” Automated systems must match the specific work volumes and needs of each laboratory. Blindly copying the automation solution that is acceptable for another laboratory is discouraged. At the same time, standardisation of automation throughout a network of laboratories is highly desirable. TLA is the ultimate automation development, but not necessarily the right solution for every laboratory. A facility may find specific, targeted automation to be a better option. Not to be overlooked is the availability of middleware, the sophisticated software that links analysers and the LIS and offers the ability to take test results, combine them with patient demographic data and principles of evidence-based laboratory medicine, and offer ‘value-added’ clinically useful information to healthcare providers. Middleware is applicable to every level of automation.

Competing Interests: The authors are employees of Abbott Diagnostics.

References

1. Doumas BT. The evolution of clinical chemistry as reflected in textbooks published in the United States. Clin Chem 1998;44:2231-3.
2. Kricka LJ, Savory J. International Year of Chemistry 2011. A guide to the history of clinical chemistry. Clin Chem 2011;57:1118-26.
3. Rosenfeld L. Clinical chemistry since 1800: growth and development. Clin Chem 2002;48:186-97.
4. Durner J. Clinical chemistry: challenges for analytical chemistry and the nanosciences from medicine. Angew Chem Int Ed Engl 2010;49:1026-51.
5. Wu AH, French D.
Implementation of liquid chromatography/mass spectrometry into the clinical laboratory. Clin Chim Acta 2013;420:4-10.
6. Olsen K. The first 110 years of laboratory automation: technologies, applications, and the creative scientist. J Lab Autom 2012;17:469-80.
7. Rosenfeld L. A golden age of clinical chemistry: 1948-1960. Clin Chem 2000;46:1705-14.
8. Streitberg GS, Angel L, Sikaris KA, Bwititi PT. Automation in clinical biochemistry: core, peripheral, STAT, and specialist laboratories in Australia. J Lab Autom 2012;17:387-94.
9. Davis JE. Automation. In: Kaplan LA, Pesce AJ, editors. Clinical Chemistry: Theory, Analysis, and Correlation. St. Louis, MO: CV Mosby Co.; 1984. pp. 261-72.

10. Sarkozi L, Simson E, Ramanathan L. The effects of total laboratory automation on the management of a clinical chemistry laboratory. Retrospective analysis of 36 years. Clin Chim Acta 2003;329:89-94.
11. Maclin E, Young DS. Automation in the clinical laboratory. In: Tietz NW, editor. Fundamentals of Clinical Chemistry. 3rd ed. Philadelphia, PA: WB Saunders Co.; 1987. pp. 160-92.
12. Melanson SE, Lindeman NI, Jarolim P. Selecting automation for the clinical chemistry laboratory. Arch Pathol Lab Med 2007;131:1063-9.
13. Wu AH. A selected history and future of immunoassay development and applications in clinical chemistry. Clin Chim Acta 2006;369:119-24.
14. Wild D, Sheehan C, Binder S. Introduction to immunoassay product technology in clinical diagnostic testing. In: Wild D, editor. Immunoassay Handbook: Theory and Applications of Ligand Binding, ELISA and Related Techniques. 4th ed. Oxford, UK: Elsevier; 2013.
15. Wood WG. Immunoassays & co.: past, present, future?—A review and outlook from personal experience and involvement over the past 35 years. Clin Lab 2008;54:423-38.
16. Hawker CD. Laboratory automation: total and subtotal. Clin Lab Med 2007;27:749-70, vi.
17. Boyd JC, Hawker CD. Automation in the clinical laboratory. In: Burtis CA, Ashwood ER, Bruns DE, editors. Tietz Textbook of Clinical Chemistry and Molecular Diagnostics. 5th ed. St. Louis, MO: Elsevier Saunders; 2012. pp. 478-83.
18. Himmelsbach M. 10 years of MS instrumental developments—impact on LC-MS/MS in clinical chemistry. J Chromatogr B Analyt Technol Biomed Life Sci 2012;883-884:3-17.
19. Greaves RF. A guide to harmonisation and standardisation of measurands determined by liquid chromatography-tandem mass spectrometry in routine clinical biochemistry. Clin Biochem Rev 2012;33:123-32.
20. Seger C. Usage and limitations of liquid chromatography-tandem mass spectrometry (LC-MS/MS) in clinical routine laboratories. Wien Med Wochenschr 2012;162:499-504.
21. Vogeser M, Seger C.
LC-MS/MS in clinical chemistry. J Chromatogr B Analyt Technol Biomed Life Sci 2012;883-884:1-2.
22. Hoofnagle AN, Wener MH. The fundamental flaws of immunoassays and potential solutions using tandem mass spectrometry. J Immunol Methods 2009;347:3-11.
23. Levine DM, Maine GT, Armbruster DA, Mussell C, Buchholz C, O’Connor G, et al. The need for standardization of tacrolimus assays. Clin Chem 2011;57:1739-47.
