GATA 8(7): 217-220, 1991

PERSPECTIVE

The Future of Laboratory Automation

Tony J. Beugelsdijk

Among the many factors that will define the laboratory of the future are the development of advanced computer communications systems, artificial intelligence, robotic systems, and material storage and retrieval systems. This article examines some of these factors and challenges current automation justification procedures in light of the greater competitive environment of today.

From the Human Genome Center, Los Alamos National Laboratory, Mechanical and Electronic Engineering Division, Los Alamos, New Mexico, USA. Address correspondence to Dr. T. J. Beugelsdijk, section leader, Human Genome Center, Los Alamos National Laboratory, Mechanical and Electronic Engineering Division, Los Alamos, NM 87545, USA. Received 7 June 1991; revised and accepted 8 October 1991.

The Laboratory Environment for the 1990s

A strategic perspective on the future of laboratory instrumentation requires an explicit awareness of the larger environment. With competitors operating in global markets, strategic questions must be addressed. Questions such as "Do we want to be a player in this market in the future?" and, if so, "Do we want to be a market leader?" need to be answered. Global competition requires rapid new product development, high product quality, and world-class productivity and profitability. In addition, increasing challenges are being posed by government regulations that require more analyses, greater sensitivity, improved precision, and documented quality assurance. In this climate, changes in traditional economic justification procedures for instrumentation development are mandated. More than 90% of business managers and executives use conventional justification procedures as their major consideration in approving automation [1, 2]. However, traditional financial models are no longer adequate in today's competitive environment. Concepts such as return on investment, payback period, internal rate of return, and cost savings have provided useful models in the past, but these are tactical weapons aimed at short-range profitability targets and have little or no applicability in the strategic long-range arena [3]. The predominance of these tools is related to the traditional concepts of financial (rather than program) management, a focus on a respectable bottom line, and responsibility to investors and stockholders. Unfortunately, these indicators force a short-range decision-making mentality.
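To make the traditional measures concrete, the following sketch works through payback period, simple return on investment, and net present value for an automation purchase. All figures are hypothetical and serve only to show the short-range, bottom-line character of these calculations.

```python
# Hypothetical figures for an automation purchase; all values are illustrative only.
capital_cost = 250_000.0      # instrument plus integration cost ($)
annual_savings = 60_000.0     # labor and consumable savings per year ($)
lifetime_years = 7            # assumed service life of the system

# Payback period: years until cumulative savings cover the investment.
payback_years = capital_cost / annual_savings                              # ~4.2 years

# Simple return on investment over the system's lifetime.
roi = (annual_savings * lifetime_years - capital_cost) / capital_cost      # ~68%

# Net present value at a 10% discount rate; under the traditional model a
# positive NPV "justifies" the purchase, whatever its strategic value.
discount_rate = 0.10
npv = -capital_cost + sum(
    annual_savings / (1 + discount_rate) ** year
    for year in range(1, lifetime_years + 1)
)

print(f"Payback: {payback_years:.1f} y, ROI: {roi:.0%}, NPV: ${npv:,.0f}")
```

None of these quantities capture strategic considerations such as market position, regulatory readiness, or work-force trends, which is precisely the limitation argued above.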

Forces Driving Automation and Technology Development

Many forces drive technology and push automation development and improvements in productivity. Two of the major forces are global competition and changing demographics. The former is well known. The latter is more insidious: the graying of today's work force without a sufficiently populous and adequately educated follow-on generation. Some of the other contributing factors are discussed below.

Temporal Compression

There is an urgent need for more data to be acquired and processed both in shorter times and in real time. Large, complex systems require real-time status information so that control can be effected on a system not significantly different from the one sampled and measured. Information distilled from these data serves as input to such higher levels of decision making as stochastic models and decision trees, or it can provide feedback that suggests new experiments or additional data that must be acquired. Automation of the activities of sampling, data gathering, information processing, and the like allows for more timely responses to external forces in the marketplace and can result in real competitive advantages.

User-Driven Innovation

Products for automation will always be used in ways not thought of or anticipated by their developers. Users are most aware of their own needs, and hence they send a clear message to automators: be sensitive to innovation in the user community and move rapidly into the areas suggested by its adaptations. In many instances, users will modify existing protocols to take better advantage of an instrument's throughput capabilities. In addition, manufacturers and developers should not focus solely on selected target markets and industries for the application of their technologies. Often larger markets entirely different from those originally targeted exist or can be created. Swings in the fortunes of current user bases can be mitigated by an interdisciplinary focus and by greater diversity.

"Data-Rich" Experiments It is not uncommon for individual experiments today to yield megabytes of data. More and higherquality data generally result in a more complete knowledge of the problem. Biological systems, in particular, yield vast quantities of data that are unparalleled in physical or chemical systems. Increasing demands for preprocessing, storing, retrieving, combining, editing, and other actions must be met and will serve as motivators behind continued automation development.

Sophisticated But Costly Methods

Complex hardware is often required for data-rich experiments. Instrumentation with capital costs in excess of $500,000 is becoming commonplace. Duplication of such resources at many user locations soon becomes prohibitive. Centralized user facilities and centers of expertise have been and are being established in a host of areas: supercomputing, robotics, magnetic resonance and imaging, and surface science, to name a few. These centers have data repository and communications needs of their own and can be used to leverage the resources at other research laboratories to greater advantage.

Information Overload

The acquisition of data and the extraction of information from them are reaching the point at which human interception, interpretation, and intervention will no longer be possible. Expert systems will increasingly assume these responsibilities and act as referees and traffic coordinators for evaluating and interacting with large data repositories. Suspect or inconsistent data will automatically be flagged for reevaluation or regeneration.
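As a minimal sketch of this kind of automated screening, the rules below check each incoming result and flag anything suspect for reevaluation. The record fields and thresholds are invented for illustration and are not drawn from any particular system.

```python
# Hypothetical rule-based screening of incoming results; fields and limits are illustrative.

def screen(record: dict) -> list[str]:
    """Return the reasons, if any, that a measurement record looks suspect."""
    flags = []
    if record["replicate_rsd"] > 0.15:                # poor precision between replicates
        flags.append("replicate RSD exceeds 15%")
    if not (0.0 <= record["recovery"] <= 1.2):        # spike recovery outside 0-120%
        flags.append("spike recovery out of range")
    if record["value"] < record["detection_limit"]:   # reported value below detection limit
        flags.append("value below detection limit")
    return flags

results = [
    {"sample_id": "A-101", "value": 4.2, "detection_limit": 0.5, "recovery": 0.98, "replicate_rsd": 0.03},
    {"sample_id": "A-102", "value": 0.1, "detection_limit": 0.5, "recovery": 1.35, "replicate_rsd": 0.22},
]

for rec in results:
    reasons = screen(rec)
    if reasons:
        # In the scenario described above, flagged samples would be queued
        # automatically for reevaluation or reanalysis.
        print(rec["sample_id"], "flagged:", "; ".join(reasons))
```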


Technologies for the Future Laboratory

Connectivity of Experiments and Data in Space and Time

Informatics technology must be coupled directly with the automated laboratory. In addition, individual data-rich experiments must be linked both in parallel and in time. No longer can we afford to focus solely on islands of automation or productivity, as these isolated systems cannot continue to exist.

To achieve this connectivity, there is an overwhelming need for a communication standard for laboratory data and instrumentation along the lines of the Open Systems Interconnect (OSI) model. The current computer communications technology in the laboratory arena is comparable to the national road system in the 1940s, before the construction of the interstate highway system. Data paths and exchange protocols must be established and constructed to superhighway status. Moreover, the logistics of data networking and flow should be transparent to the end user, much the same way as all the technology that brings 110 V to a receptacle is transparent to the users of power equipment. Tying into this conduit should be no more complex than the mating of two connectors. Such an effort has been started at the National Institute of Standards and Technology (NIST) under the Consortium for Automated Analytical Laboratory Systems (CAALS). This consortium has members from instrument manufacturers, the industrial user community, and government organizations. Once established, this interconnect standard will enable the quick assembly of automated systems using equipment and software supplied by different vendors.

The concept of modularity in chemical function must be carefully defined. Recognizing this need, the US Department of Energy, in a multilaboratory effort, has been developing the Standard Laboratory Module (SLM) concept since 1989. The SLM is defined as a logical grouping of individual laboratory unit operations (for example, weighing, separation, and data analysis) that performs a subtask within an experimental protocol. These SLMs become the building blocks for the modular laboratory. When linked together in an object-oriented programming environment, complete standard analysis methods (SAMs) can be quickly constructed, as sketched below.
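The SLM and SAM concepts lend themselves naturally to an object-oriented sketch. The outline below is purely illustrative and is not the DOE specification: it assumes that each module exposes one uniform interface and that a standard analysis method is simply an ordered composition of modules.

```python
# Illustrative sketch only; the real SLM/SAM interface definitions are not reproduced here.
from typing import Protocol

class StandardLaboratoryModule(Protocol):
    """A logical grouping of laboratory unit operations with a uniform interface."""
    def run(self, sample: dict) -> dict: ...

class Weigh:
    def run(self, sample: dict) -> dict:
        sample["mass_g"] = 1.000            # stand-in for a balance reading
        return sample

class Separate:
    def run(self, sample: dict) -> dict:
        sample["fraction"] = "analyte"      # stand-in for an extraction or separation step
        return sample

class Analyze:
    def run(self, sample: dict) -> dict:
        sample["result_ppm"] = 12.3         # stand-in for an instrumental measurement
        return sample

class StandardAnalysisMethod:
    """A SAM assembled from SLM building blocks and executed in order."""
    def __init__(self, modules: list[StandardLaboratoryModule]):
        self.modules = modules

    def run(self, sample: dict) -> dict:
        for module in self.modules:
            sample = module.run(sample)
        return sample

method = StandardAnalysisMethod([Weigh(), Separate(), Analyze()])
print(method.run({"sample_id": "S-001"}))
```

Because every module presents the same interface, new methods can be assembled by rearranging or substituting modules without touching the modules themselves, which is the essence of the modular laboratory described above.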



In the past, users made many concessions to computers: learning their internal workings and idiosyncrasies, becoming proficient in several "optimum" languages, mastering the intricacies and subtleties of interfacing, and managing data flow. In the future, end users will be less and less willing to accommodate individual machines and to interact with them on their individual levels. It is neither productive nor efficient for highly specialized natural and physical scientists to divert their energies to becoming skilled in computer science as well. In doing so, they abstract their creativity to a level removed from the problems of their science.

To realize these gains in productivity and efficiency, the human interface to computers must be able to communicate in some kind of natural language such as English. Although 3-year-old children can speak and understand their native language, we still have not produced a computer program with an overall linguistic performance comparable to that of infants. Computers will not be able to perform many of the tasks people do every day until they, too, share in the ability to use language.

Typical of the problems encountered in speech recognition is the miscomprehension of "How to recognize speech" as "How to wreck a nice beach." The understanding resulting from the mapping of these two sentences into knowledge representations is markedly different. Language understanding is further compounded by complex phonetic, lexical, syntactic, and semantic rules. Understanding natural language thus requires both linguistic knowledge of the particular language and world knowledge relating to the particular topic being discussed.

Artificial Intelligence

The field of artificial intelligence (AI) will contribute much in the future. Currently, however, AI is still relatively primitive in its art and has yet to produce impressive technologies. The field is defined primarily by its problems rather than by its technology, and it has many more ambitions than solutions. AI has also been generally oversold, but this is common with new technologies, particularly ones with profound implications. The problem with overselling now is that it leads to underselling later, making it difficult to generate interest and support for the technology once it has been developed further.

At present we are not close to creating true learning machines, and it is difficult even to know what breakthroughs such a machine would require. This is the central problem of AI. The fundamental challenge is to create programs that can actually learn, yet we have only a fragmentary understanding, in very specialized cases, of how learning occurs. The deeper aspects may take decades or even centuries to solve. For example, we do not know how to decompose programs into truly independent fragments that can recombine automatically into something useful; the ordinary result when this is attempted is a combinatorial explosion of possibilities that even the most powerful computers cannot handle.

Expert systems represent a drastic cutback in the generality of symbolic manipulation systems. Moreover, many of the packages touted today as expert systems are not expert systems at all, much less profound AI applications. Many expert systems differ from ordinary programmed systems only in the style of programming: they are rule oriented rather than procedure oriented. However, the early applications of AI in the laboratory of the future will likely be in the area of expert systems, as this is the area that has seen most of the early success.

The practical significance of neural networks and neurocomputers is difficult to assess, and indeed much of the current excitement seems excessive. The neural network exemplifies the paradox that the enthusiasm of ardent advocates can set up a significant barrier to further development [4]. This barrier results from confusing neural networks' long-term potential with unrealistic short-term goals. However, neural networks have substantial long-term potential and valuable applications today. Their function comprises much of the intelligence we associate with biological systems and shares many of those systems' attributes, such as high connectivity, nonalgorithmic processing, adaptation, fault tolerance, useful outputs from fuzzy inputs, nonlinear transfer, generalization, self-organization, parallel processing, and simple processing elements.

In the future, we will see continued progress in all areas of AI, including logic and language translation. Much careful work will be done on the detailed problems, with a better understanding of computational limitations. We will see an integration of classical science (such as physical phenomena, sensing, and reasoning) with AI, especially in the field of robotics. Interesting applications of computer science to AI will also develop; for example, we will see the use of computational geometry in robotic planning.
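To make the distinction between rule-oriented and procedure-oriented programming concrete, the toy forward-chaining loop below holds its knowledge as if-then rules over a set of facts; the rules and facts are invented purely for illustration.

```python
# Toy forward-chaining rule engine; the rules and facts are invented for illustration.

# Each rule is (premises, conclusion): if every premise is a known fact, assert the conclusion.
rules = [
    ({"baseline drift", "high blank signal"}, "detector contamination suspected"),
    ({"detector contamination suspected"}, "schedule detector cleaning"),
    ({"calibration expired"}, "rerun calibration standards"),
]

facts = {"baseline drift", "high blank signal"}

# Keep applying rules until no new conclusions appear (forward chaining).
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)
```

The point of the rule-oriented style is that the loop above never changes: adding expertise means adding rules, not rewriting the procedure.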

Robotics

The significant experiments of the future will be designed, but not performed, by humans. Historically, automated devices for the laboratory have been developed to be operable by technicians. We will see an emergence of instruments that are designed to be operable by other instruments. The entire function of the laboratory is to convert raw material (samples) into information (data); accordingly, a transition must be made from material handling to data handling. The intelligent functions of data generators (that is, instruments) will reside in a local or host computer, where local control and data handling will take place. Physical operation of the generators (maintenance, supplying disposables and reagents, and so on) and material handling will be relegated to intelligent transport and robotic systems.
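One way to picture this division of labor is sketched below; all class and method names are invented for illustration. The host computer holds the protocol and the data, while a transport robot moves material between instruments that never interpret their own results.

```python
# Illustrative only; the names and interfaces are invented, not from any real system.

class Instrument:
    """A data generator whose 'intelligence' lives in the host computer, not the box."""
    def __init__(self, name: str):
        self.name = name

    def measure(self, sample_id: str) -> dict:
        return {"instrument": self.name, "sample": sample_id, "reading": 0.0}  # stand-in value

class TransportRobot:
    """Handles the material; it never interprets the data."""
    def move(self, sample_id: str, destination: str) -> None:
        print(f"moving {sample_id} to {destination}")

class HostComputer:
    """Holds the protocol: decides where samples go and collects the results."""
    def __init__(self, robot: TransportRobot, instruments: list[Instrument]):
        self.robot = robot
        self.instruments = instruments
        self.results: list[dict] = []

    def run_protocol(self, sample_id: str) -> None:
        for instrument in self.instruments:
            self.robot.move(sample_id, instrument.name)          # material handling
            self.results.append(instrument.measure(sample_id))   # data handling

host = HostComputer(TransportRobot(), [Instrument("balance"), Instrument("chromatograph")])
host.run_protocol("S-042")
print(host.results)
```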

Material Management

Developers of automation for the laboratory have traditionally chased the bottleneck in improving laboratory throughput. With this reasoning, we have seen the sequential development of specialized automated analyzers, computer-controlled instrumentation, and laboratory robotic systems. A future development for the laboratory will be in the area of sample management. The state of the art for most laboratories is still the shoebox method of sample transport to the lab. By integrating the existing technologies of stacker-retriever systems, pneumatic transfer, and keyless data entry through bar-code autoidentification systems, it will be possible to make this facet of laboratory operation much more efficient. Indeed, some development of these systems is already under way. The real benefits of this technology include real-time inventory control, a complete sample transaction audit trail, reduced sample contamination, productivity gains through improved and immediate access, and controlled access.

More attention will also be given to overall load balancing. To gain maximum efficiency, use, and throughput from all these automated systems, it is important to balance the capacities of the components carefully so that no single component sits idle while others work through backlogs.
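A minimal sketch of the sample transaction audit trail mentioned above is given below; the record fields and bar-code format are invented for illustration. Every movement of a bar-coded sample is logged, so its history and current location can be reconstructed at any time.

```python
# Minimal sketch of a sample transaction audit trail; all fields are invented for illustration.
from datetime import datetime, timezone

audit_trail: list[dict] = []

def log_transaction(barcode: str, action: str, location: str) -> None:
    """Record one movement or action for a bar-coded sample."""
    audit_trail.append({
        "barcode": barcode,
        "action": action,
        "location": location,
        "time": datetime.now(timezone.utc).isoformat(),
    })

def current_location(barcode: str) -> str | None:
    """Latest known location of a sample, derived entirely from the audit trail."""
    entries = [e for e in audit_trail if e["barcode"] == barcode]
    return entries[-1]["location"] if entries else None

log_transaction("LIMS-000123", "received", "loading dock")
log_transaction("LIMS-000123", "stored", "stacker-retriever bay 7")
log_transaction("LIMS-000123", "retrieved", "robot prep station")

print(current_location("LIMS-000123"))   # -> robot prep station
print(len(audit_trail), "transactions on record")
```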

Conclusion

The true cybernetic laboratory is still only a vision of the future. However, many forces are at work that will make this future more of a reality. By looking ahead and speculating about the form and structure of that laboratory, we shape the future and ultimately make the vision a reality. The needs of the future are extraordinary; the problems of the future are likewise extraordinary. Without such extraordinary challenges, the future would hold only ordinary promise. We can look forward with great anticipation to an exponential increase in real knowledge, to work that is truly worthy of human intelligence, and to solutions to problems that will benefit all of humankind.

References
1. 1984 Survey of the National Electrical Manufacturers Association
2. O'Guin MC: Beyond Financial Analysis: Justifying CIM. Los Angeles, Kearney, 1987
3. Rosen J: Mech Eng 109(11):48-54, 1987
4. Jansson PA: Anal Chem 63:357A-362A, 1991

