Presidential Address

Methods for Medical Decision Making

MARGARET HOLMES-ROVNER, PhD

Presented at the thirteenth annual meeting of the Society for Medical Decision Making, October 21, 1991, Rochester, New York. Address correspondence and reprint requests to Dr. Holmes-Rovner: Michigan State University, B 220 Life Sciences Building, Department of Medicine, East Lansing, MI 48824.

The Society for Medical Decision Making (SMDM) is consistently at the forefront of methodology development in medical decision making. At the same time, the Society has an empirical goal, to improve medical decision making at the individual and social levels. As we consider the impact of the Society, and look toward the demands of the future, we must consider the relative effectiveness of our methodologic and empirical imperatives and ways they must come together in the remainder of the twentieth century.

The goals of the Society are clearly laid out by Dr. Eugene Saenger in vol. 1, no. 1 of Medical Decision Making: "To promote the theory and practice of medical decision making through scholarly activities, including research on and application of analytic methods to medical decisions affecting the care of individual patients, and to health policy decisions affecting the health of larger populations; and To promote greater understanding of medical decision making in all branches of medicine and allied sciences."1 He points out that while many scientific societies have some of the same goals, none of them engage in this quest for new analytic methods both for individual patient care and also for the health of large populations. Saenger also takes on the challenge of the technological imperative and points out that physicians are ill prepared by culture or training to resist the fabulous offerings of biomedical advances, thus establishing the important interdisciplinary imperative the Society for Medical Decision Making has supported and enjoyed. In the same issue, the second president of the Society, Harvey Fineberg, challenges us to address the issues of analyzing new scientific knowledge and medical technology in an era of limited resources in a fashion that promotes medical decisions that are consonant with patients' preferences and values. Pursuit of these goals has led inevitably to present involvement in medical effectiveness research, broadly construed.2

In 1980, the pursuit of better methods to study and improve medical decisions was not a field in which investigators could count on either publication or funding. Thus, the inauguration of the journal Medical Decision Making was an exciting venture and an absolute necessity. As a measure of how far we have come, it is worth noting that now our journal editor has to remind us "not to forget Medical Decision Making" when we have an important paper to submit. The methods and the results of studies of medical decision making appear to have moved from maverick to mainstream. However, the ranks of the practitioners of decision making research are inevitably limited by the level of sophistication required to participate in the dialog. In vol. 1, no. 1 of Medical Decision Making, Dennis Fryback reviews the then-new book, Clinical Decision Analysis, by Weinstein, Fineberg, Elstein, et al., the "red book" on which many members of SMDM cut their decision-analytic teeth. Fryback begins by saying, "Finally! A decision analysis textbook specifically aimed at clinicians" and then goes on to say, "Students should be aware that this is not necessarily an easy book; it is not a quick and painless way to learn decision analysis, nor a book that can be read lightly."3

An important aspect of the Society continues to be the pursuit of difficult methods with rigor and, for most, pleasure, across disciplines. The consequence, we recognize with some ambivalence, is that the extent to which we are involved in the intricacies of increasingly sophisticated methods of analysis limits the extent of our appeal to a mass audience. I suggest that as we become central in producing the tools to shape health care and health care decision making, we must also address creatively the problem of making the knowledge produced and reported accessible to broader audiences. How to do that is a strategic question important to the Society.

To address a broader audience, we need first to understand who has heard us to date, and what the Society has embraced from the broader health care arena. Where have we found success? Who has been our audience? As an organization, we have not undertaken any programmatic agenda beyond the promotion of theory and practice of medical decision making. As individuals, we have done a great deal to apply these methods in important arenas. Members of SMDM have been influential in the creation of the Agency for Health Care Policy and Research (AHCPR) and the work of the Medical Effectiveness program. Many members of the Society are participating in AHCPR-supported outcomes research and guidelines work. As an example of the role members of the Society have played in medical outcomes work, it is instructive to cite the recent Report to Congress of the AHCPR on the feasibility of linking research-related databases to federal and nonfederal medical administrative databases.4 The reason for the Agency's supporting data linkage is to fulfill its mission of supporting research in medical effectiveness. An appendix to the report presents abstracts of exemplary papers that represent a taxonomy of the work that makes up medical effectiveness research. There are five categories: Geographic Variations in Health Care Delivery and Utilization, Appropriateness of Care, Medical Decision Making, Patient Outcomes Assessment, and Quality of Care. In each category, members of the Society are prominent among the authors of the exemplary papers cited. Two things appear to be going on. One is that there are people in the Society who define the field. The second is that the Society, its meeting, and its journal continue to be important to the people who define these fields, providing an opportunity for synergism among related fields.

Before we rejoice that we have established a voice, and set out simply to turn up the volume, however, we need to reflect on the danger that there is a received wisdom developing that may be confining in its narrowness. To understand our preferences, we need to examine innovations that have not had an immediate audience among members of the Society. A primary example, which is somewhat puzzling, is the introduction of quality improvement. This work had a powerful and eloquent introduction at the Society for Medical Decision Making: Don Berwick devoted his presidential address to it four years ago. This year we had a short course on continuous quality improvement, but it hasn't really "caught on." We need to think about why that is, if we are to retain the excitement of experimentation with new methods even as we move on to make our collective expertise in what are becoming standard methods available to others. Why is it that we have been terrifically excited about outcomes and guidelines for two years, but less so about quality improvement? I suspect it may be that there has emerged a kind of dogma, of which guidelines are the logical end-point. As you may remember, Al Tarlov suggested, in the 1990 SMDM symposium, that his sense, coming to address the annual meeting, was that he was attending a meeting of a religion in process of being developed, and he wasn't exactly sure what it was.5 It is probably fair to say that there is a fundamental objective, if not a dogma, to most of the work presented. It is to establish what is the most efficacious diagnostic or therapeutic approach to a

clinical problem to produce a measurable effect on patient outcome. The factors considered are usually technical, that is, properties of the laboratory studies and clinical procedures and, for the most part, related to one immediate medical decision. Patient utilities are measured in the context of a decision analysis. Physician cognitive strategies are analyzed from the perspective of how they may or may not deviate from the optimal model. The standard approach in an SMDM meeting for a new treatment or diagnostic problem is as follows: You do a decision tree, do a literature synthesis, apply techniques of meta-analysis to accumulate probabilities, assess utilities (sometimes of real patients), crunch the tree on available software, do a cost-effectiveness analysis, and, voila!, a guideline, whose logic we are convinced will be compelling to all our rational colleagues. Related work on clinical prediction rules begins to address what happens in some actual clinical setting, but does not try to understand at a deep organizational or human perceptual level the sources of variability and the functions they may serve that have only indirect effects on the patient morbidity and mortality associated with that illness episode.

How do we meet the challenge to broaden our perspectives and still keep honing the methodologic tools we use, and sometimes develop, in the important pursuit of optimal models and of effectiveness research? I offer two suggestions. One is to anticipate and encourage a broadening of access to methodologic tools for study of health care decision making. The other is to broaden the context within which we embed our understanding of decision-making problems.

How should we broaden access to methods that are already in the public domain? One thing that produces a narrow focus is the necessity to get deeply into a rather complicated methodology in order to execute a reasonable study. That means, by definition, we have to focus narrowly on the intricacies of the method, leaving a small audience for its application and replication. However, that doesn't have to be the case if we use creatively the power of microcomputers and the flexibility that they really provide. Many of the very complex analytic tools we use could be embedded in software that structures the approach to the problem as well as performing the calculations required. Rather than encourage naive approaches to problems, carefully structured analytic software can avoid analytic traps known to the most sophisticated methodologists. An example was discussed at the 1991 annual meeting by Mark Roberts.6 His paper, "Monte Carlo Sensitivity Analysis: Be Careful Where You Roll the Dice," discusses a subtle error in the analysis, which, it was pointed out in discussion, is possible with some decision-analytic software, and not others. Carefully developed methodologic tools can structure the approaches to problems to bring to bear the prior knowledge and experience of the best theorists in the field.
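The standard approach described above (fold back a tree, attach probabilities and utilities, then test the result against parameter uncertainty) can be sketched in a few lines of Python. All numbers, strategy names, and distributions below are invented for illustration; the one methodological point carried over from the Roberts abstract is that in a Monte Carlo sensitivity analysis the dice should be rolled once per replication, at the parameter level, rather than independently inside each branch.

```python
import random

# Hypothetical one-stage decision tree: "treat" vs. "watch." All
# probabilities, QALY values, and costs are invented for illustration.
def expected_values(p_cure_treat, p_cure_watch,
                    qaly_cure=20.0, qaly_no_cure=8.0,
                    cost_treat=12000.0, cost_watch=2000.0):
    """Fold back the tree: expected QALYs and cost for each strategy."""
    eq_treat = p_cure_treat * qaly_cure + (1 - p_cure_treat) * qaly_no_cure
    eq_watch = p_cure_watch * qaly_cure + (1 - p_cure_watch) * qaly_no_cure
    return (eq_treat, cost_treat), (eq_watch, cost_watch)

# Base case: point estimates from a (hypothetical) literature synthesis.
(treat_q, treat_c), (watch_q, watch_c) = expected_values(0.70, 0.45)
icer = (treat_c - watch_c) / (treat_q - watch_q)  # dollars per QALY gained

# Monte Carlo sensitivity analysis: draw each uncertain parameter ONCE
# per replication and feed the same draw to every branch that uses it,
# rather than re-rolling the dice separately inside each branch.
def psa(n=5000, seed=1):
    rng = random.Random(seed)
    treat_preferred = 0
    for _ in range(n):
        p_t = rng.betavariate(70, 30)  # uncertainty about P(cure | treat)
        p_w = rng.betavariate(45, 55)  # uncertainty about P(cure | watch)
        (tq, _), (wq, _) = expected_values(p_t, p_w)
        treat_preferred += tq > wq
    return treat_preferred / n  # fraction of draws in which treat wins
```

With these made-up inputs the base-case ICER works out to roughly $3,333 per QALY gained, and the sensitivity-analysis fraction summarizes how often treatment remains preferred when the cure probabilities are allowed to vary.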


There is great potential to develop software that can be used by content specialists whose methodologic expertise has been added to primary training in content and different methods of analysis. That will allow a much broader range of people to engage in effectiveness research, people who may have a unique perspective to bring to bear on clinical or policy problems. For example, hierarchical statistical models may very well take small area variation studies to a new level in explaining variability as a function of variability embedded in characteristics of patients as members of families and then other groups. Another example comes from work being done in medical geography at Michigan State University. The medical geographers are creating software that will map the state by county and by township with all the descriptive demographic characteristics they can accumulate from the census and other sources. One can generate summaries of the urban/rural characteristics of a place, tell age structure, ethnicity. With the data in an organized, but flexible, form, we should soon be able to import a series of fields from existing medical claims databases. It should be possible to investigate small area variations and generate hypotheses for careful investigation without being a demographer or a geographer.

If, eventually, artificial intelligence shells can be written that let people in hospitals write decision rules or reminder systems based on their own priorities and analysis of local claims or clinical data, that would hold great potential for the continual improvement of decision making. There is a great need to create software that retains the sophistication of the method, but lets someone who understands the social structure or the clinical problem better than the decision methodologist at least participate in, and probably do, the entire analysis of the problem. It should not be necessary to have a world-class interdisciplinary team in every hospital, university, or HMO to do first-rate studies of health care decision making.

There is, of course, the caveat that intelligent users of methods must come to understand the limitations of the tools and their data. It is certainly possible for analytic tools to be used naively and to simply allow people to reach foolish conclusions quickly. Using my demographic software without a medical geographer, I may reach a conclusion about the obvious relationship between poverty and utilization of mammography, only to have a geographer show that you can't tell anything about poverty at the county level, but that you have to use the township as the unit of analysis. However, the need to provide a level of training that produces intelligent users of computer-assisted analysis is an area in which members of the Society for Medical Decision Making, and perhaps the Society as an organization, have a clear potential. We have an obligation to train intelligent users of our methodologic armamentarium. But then we must encourage the developing sophistication of the resulting studies. We have to count on the collegial interaction at the SMDM and other annual meetings and in journals to pick up errors and instruct investigators in productive directions. The scientific community has a reasonable track record in this regard. There was a time when it was possible to throw a lot of data into a regression analysis and write an article whose main finding was that there were a lot of significant variables. For the most part, that is no longer possible.

I would hope that people in the Society will embed their finest thoughts into vehicles that other people can use. This will be a very important process if it can be accomplished. So far, we have mostly thought about exporting our well-received short courses to other meetings and other campuses. Let me remind you again that we are relatively few and that our numbers are now fairly stable. While a few training institutions produce the methodologists of tomorrow, we need a more intensive effort to train the trainers who are already in the field in related areas in the intelligent use of methodologic tools we should increasingly create in user-friendly packages.

Besides creating and using a new generation of analytic tools, the other major way to avoid a narrow dogma is to start really broadening the context of the problems we analyze. The Society for Medical Decision Making has been devoted to the analysis of medical decisions case by case, developing in each case the optimum resource use and information use necessary to maximize life expectancy and quality of life in a fashion consistent with patient preferences and values. While this is clearly an important undertaking, it falls short of producing public policy. We have to develop models that embed medical decisions into the context of the patient-provider interaction, and into the context of the culture and the resource limitations that clearly will continue to impinge on even optimal models.

How do we turn decision making research into public policy? The success of the medical effectiveness work provides a good beginning and a good model for how to move to the next level. One of the things the medical effectiveness work has done right is to start with good descriptive research. So long as we started with models that we all agreed did not capture human behavior, there was no real reason for policy makers to listen. Beginning with an observation of variability makes it possible to discuss alternatives that exist in reality before suggesting what may or may not be improvements. The other thing the medical effectiveness field did right was to engage policy makers in a dialog about what the important issues were. Bill Roper began his 1990 symposium presentation5 by saying that the history of the medical effectiveness movement had surged ahead when the Health Care Financing Administration (HCFA) invited John Wennberg, Robert Brook, and David Eddy to come to Washington to give seminars to help policy makers at HCFA catch up in techniques for determining what the issues surrounding effectiveness in health care were. That was a very important dialog.

The agenda for what to study was really set by HCFA. The guidelines panels focus on those procedures that cost HCFA the most money. That is surely not the only arena in which medical decision making can support rational public policy. We need to find the entry points for which there are similar matches between the burning questions and the research we do. We also need to broaden the range of persons with whom we engage in the dialog. The Agency for Health Care Policy and Research has taken a creative approach to supporting the effectiveness research. If it remains the only constituency for effectiveness research, however, the whole enterprise is at risk. Further, we need to speak in a language that is meaningful and that makes it very clear that we are not strictly self-interested. That is a very tall order, for a variety of reasons. In the first place, the politics of mistrust between physicians and policy makers handicaps discussion from the outset. It may be that multidisciplinary research efforts at least increase the perceived possibility that the public good is the end being sought.

The medical effectiveness movement has set an important precedent in approaching problems with an intellectual integrity that absolutely must be maintained as we move into broader arenas. Even then, in the present climate, the task will not be easy. Cognizance of inevitable mistrust must be part of the context we consider when we undertake to speak honestly and analytically with policy makers about the relevance of our analysis to public policy.

Our research is relevant to public policy, but we need to broaden the context of our analysis and our discussions. Decision analysis has allowed us to clarify some small trade-offs in years gained for resources spent in a series of decisions. We have found ready dialog with patients, clinicians, and payers about the trade-offs. To a great degree, we have been able to build consensus about how to get the most from health care resources in individual cases. Now, in addition, we need to start to clarify the trade-offs for the whole system if we pursue alternative sets of arrangements. If we can begin to engage patients, physicians, and policy makers in creating the alternatives, the Society for Medical Decision Making can make major contributions to the creation of the consensus that is viable public policy.

For policy makers in particular, we have a lot to offer. We can help quantify areas of uncertainty. We can clarify where there is evidence about what works and where there is no good evidence. Negative results are still not fashionable. That has to change. We have to draw macro trees that include spectra of preventive and treatment strategies. We have to do very good descriptive research. The paper by Hershey et al. describing the "bandwagoning" and "altruism" phenomena that influence people's willingness to take flu shots provides information that is at least as important to policy as determining the optimal timing of the shot.7 In doing utility assessment, it is not enough to tell patients to give us their utilities and we will give them back a decision. Utility assessment alone is not good enough in a continuous quality environment. It should be only a step in an iterative process between recipients of care and providers. The dialog must clarify for patients, clinicians, and payers the consequences of the aggregate of decisions in which we all participate all the time. If we can start the dialog in many places, and provide the tools and the data to shed light on difficult choices, then we will begin to contribute to public policy.

References

1. Saenger EL. Once to every man ... an introduction? (editorial). Med Decis Making. 1981;1:1-3.
2. Fineberg HV. Medical decision making and the future of medical practice. Med Decis Making. 1981;1:4-6.
3. Fryback DJ. Book review of Clinical Decision Analysis. Med Decis Making. 1981;1:94-7.
4. Agency for Health Care Policy and Research. Report to Congress: The feasibility of linking research-related data bases to federal and non-federal medical administrative data bases, April 1991.
5. Tarlov AR. Health outcomes research and its interfaces with medical decision making. Symposium, twelfth annual meeting of the Society for Medical Decision Making. November 30, 1990; 523.
6. Roberts MS. Monte Carlo sensitivity analysis: be careful where you roll the dice (abstr). Med Decis Making. 1991;11:328.
7. Hershey JC, Asch DA, Thumasathit T, Meszaros J, Waters W. Decisions to vaccinate against contagious disease: the roles of altruism, free riding, and bandwagoning (abstr). Med Decis Making. 1991;11:323.
