A Path to Integration in an Academic Health Science Center*

Walter B. Panko, Ph.D. & Wayne Wilson, M.S.
Office of Health Sciences Information Technology & Networking
University of Michigan Medical Center
Ann Arbor, MI 48109-0704

*Supported in part by G08-LM05329

ABSTRACT


This article describes a networking and integration strategy in use at the University of Michigan Medical Center. This strategy builds upon the existing technology base and is designed to provide a roadmap that will direct short-term development along a productive, long-term path. It offers a way to permit the short-term development of incremental solutions to current problems while at the same time maximizing the likelihood that these incremental efforts can be recycled into a more comprehensive approach.

In this paper, we outline a sequence of steps for "getting from here to there." These steps represent a reasonable compromise between what is technically possible today with the old technology base and what is possible with the new. More importantly, they remain focused on the future and on the hard, important changes that must be made to achieve that future. We also describe some of our experiences in attempting to take these steps.

INTRODUCTION

The vision of the optimal information technology architecture for an academic health science center before the end of this century is clear. Whether for scientific, educational or clinical purposes, the environment will consist of a distributed network of powerful workstations interacting with each other and with specialized servers. These specialized servers will include not only the file and mail servers so common today, but also important additional types such as database servers and transaction processing servers. The interaction between workstations and servers will be enabled by a very limited set of widely accepted standards. While still not approaching the ideal of a "seamless" environment, information will be accessible more freely via a relatively consistent set of graphical user interfaces (GUIs).

While there is a surprising degree of unanimity about this future vision, there is much less agreement about how to reach it starting from today's typical environment. Not surprisingly, there is also little progress towards reaching that environment. Too many plans for reaching the future vision include as their first step ignoring or discarding the existing technology base -- along with the operational demands which have helped shape it. Those with a vested interest in the current technology or information architecture are not particularly anxious to adapt their existing procedures or systems in a way which enables the partial application of the new technologies. They recognize the costs associated with these changes, as well as real risks in the operational domain. As a result, a great deal of effort is directed towards conflicting or at least orthogonal systems, which results in inefficient utilization of resources and information systems that meet no one's needs.


FROM MANY-TO-MANY TO MANY-TO-ONE

The typical academic health science center (AHSC) today is plagued by the many-to-many problem. Many different information systems, implemented in many different operating systems, using many different application approaches, are accessed by many different devices (terminals or workstations) using many different communications paths. Too often, the access methods resemble an IQ test rather than a robust and simple way to access information. It is little wonder that, even in those AHSCs which have worked hard to ensure widespread connectivity, users remain bewildered and, often, angry. This type of environment is illustrated in Figure 1.

One approach to this problem is to eliminate heterogeneity by forcing everyone to use the same hardware/software/networking base. It is clear that in an organization as diverse as an AHSC, which values innovation and initiative so highly, this approach is always unsuccessful. The heterogeneous problems of an AHSC cannot be solved with a homogeneous approach to computing. To attempt to do so drives conflict and the Balkanization of the information environment. Another "solution" is to ignore the problem and hope it does not cause too many derivative problems. This is the most widely used solution to the many-to-many problem today. Because of the patience and ingenuity of the users, as well as their strong motivation to solve their information-based problems, this approach, if not successful, is at least accepted in many organizations. However, the increasingly sophisticated user community is showing signs of less tolerance for this situation.

Figure 1. A many-to-many environment

The best approach is to drive towards a many-to-one environment (Figure 2). In this environment, the desktop systems do not interact directly with the resources, but rather with an integrative layer, consisting of a set of networking-gateway technologies and standards designed to ensure interoperability, which is interposed between them and the resources. For the user of a particular desktop system, the integrative layer becomes the single target. Through it, the user or developer can access the information resources of an institution using a single approach and a single set of tools. However, when examined closely, the many-to-one model is an illusion. Given the current state of standards development and incorporation into products, the most that can be achieved today is the presentation of a single target within a single architecture. Thus, users who use PCs may well interact with a different integrative layer than those who use Macs or UNIX workstations. While less than perfect, this environment represents a clear advance over those common today.
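As an illustration of the many-to-one idea, the following sketch (written in present-day Python, with hypothetical system names and a hypothetical query interface, not the UMMC design) shows how an integrative layer can present a single target that routes requests to several heterogeneous back-end resources:

```python
# Illustrative sketch only: a toy "integrative layer" that gives desktop
# clients a single target for several heterogeneous back-end systems.
# System names and the query interface are hypothetical.

class LabSystem:
    def query(self, request: str) -> str:
        return f"lab result for {request}"

class RadiologySystem:
    def query(self, request: str) -> str:
        return f"radiology report for {request}"

class HospitalInformationSystem:
    def query(self, request: str) -> str:
        return f"HIS record for {request}"

class IntegrativeLayer:
    """Single access point: clients speak one interface; the layer
    routes each request to the appropriate back-end resource."""

    def __init__(self):
        self._resources = {
            "lab": LabSystem(),
            "radiology": RadiologySystem(),
            "his": HospitalInformationSystem(),
        }

    def query(self, resource: str, request: str) -> str:
        backend = self._resources.get(resource)
        if backend is None:
            raise KeyError(f"unknown resource: {resource}")
        return backend.query(request)

if __name__ == "__main__":
    layer = IntegrativeLayer()
    # A desktop client uses one approach and one set of tools for everything.
    print(layer.query("lab", "patient 123"))
    print(layer.query("radiology", "patient 123"))
```

In practice, of course, the integrative layer is realized with networking-gateway technologies and standards rather than in application code; the sketch is only meant to make the "single target" idea concrete.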

The conversion from a many-to-many to a many-to-one environment is hardly a new idea. Its worth has been recognized for some time. Yet, it has not been achieved very often. Why? Clearly, it is hard to achieve. The set of standards necessary to achieve a many-to-one environment is neither complete nor stable. The cost is high when the human resources needed to research and implement the interoperability strategies are tallied. Most importantly, the lack of a clear, multi-step migration strategy makes this appear to be yet another "pie-in-the-sky" approach which is likely never to be complete or to yield short-term benefits.

DUAL-PROTOCOL STRATEGY

In the University of Michigan Medical Center (UMMC) experience, it seems that a key feature in building a consensus in favor of the cost, effort and compromises necessary to move to a many-to-one environment is the articulation of a clear and credible migration strategy. This strategy must provide short-term gains for users while nudging the momentum of the information technology architecture towards more productive paths. At the UMMC, we have developed such a strategy. For reasons that will be clear below, we call it the Dual Protocol Strategy.


Figure 2. A many-to-one environment

Figure 3. UMMC in 1990

In examining the information technology environment of the UMMC in early 1990, we found the situation very similar to the one described in Figure 3. The driver behind this fragmented networking environment was the very heterogeneous computing environment at the UMMC. In addition to approximately 6,000 PCs (Macintosh and IBM-compatible), there was a large installed base of VAX-VMS equipment, 1,200 dumb 3270 terminals, and a growing number of UNIX workstations. There were several local area networks (LANs) in place, but they tended to be isolated both physically and by protocol. There were several other "networks" which turned out to be distributed switches used for terminal-to-host communication. Many desktop devices had multiple connections -- e.g., to a LAN and to a distributed switch. Despite this, access to the full range of information resources was often limited. Access, when achieved, was limited to character-based interfaces, and the potential for client-server computing was virtually non-existent.

The original plan called for the development of a unified networking environment for this heterogeneous computing environment based upon the use of TCP/IP and its associated suite of protocols (FTP, telnet, SMTP, etc.). It soon became apparent that this plan was not going to be successful. The reason for this was quite simple. The gains in interoperability derived from the exclusive use of TCP/IP were offset by the loss of the functionality and richness of the native network operating system (e.g., Mac OS). The gains from interoperability were potential and in the future, but the loss in functionality was real and immediate.
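The contrast between character-based terminal access and the client-server computing that the TCP/IP suite makes possible can be made concrete with a small sketch. The following example is illustrative only (written in present-day Python; the host, port, and message contents are arbitrary and not part of the UMMC implementation): a server answers a structured request over a TCP socket, so only the query and the result cross the network rather than a character-by-character terminal session.

```python
# Minimal sketch of a client-server exchange over TCP/IP.
# Host, port, and message contents are arbitrary illustration values.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050  # hypothetical local test values
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # signal that the server is ready to accept
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            # The server performs the data access; only the result crosses the wire.
            conn.sendall(f"result for: {request}".encode())

def client():
    ready.wait()  # wait until the server is listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"glucose, patient 123")
        print(cli.recv(1024).decode())

if __name__ == "__main__":
    t = threading.Thread(target=server, daemon=True)
    t.start()
    client()
    t.join()
```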

