Clinical Utility of Serum Tests for Iron Deficiency in Hospitalized Patients EDWARD R. BURNS, M.D., S. NAHUM GOLDBERG, B.A., CHRISTINE LAWRENCE, M.D., AND BARRY WENZ, M.D.

Departments of Laboratory Medicine and Medicine, Albert Einstein College of Medicine, Bronx, New York

Serum iron and ferritin measurements lack the requisite sensitivity and/or specificity to accurately diagnose iron deficiency. To determine their utility in hospitalized patients, the authors compared the results of these tests with the presence of stainable iron in bone marrow aspirates of 301 patients. Forty (13.3%) had absent marrow iron. The serum diagnosis of iron deficiency was accepted on the basis of the following: iron < 11 μmol/L, total iron-binding capacity (TIBC) > 45 μmol/L, transferrin saturation (%Sat) < 0.20, and ferritin < 13 μg/L for females and < 25 μg/L for males. Using these criteria, iron deficiency was correctly diagnosed by serum iron in 41%, TIBC in 84%, %Sat in 50%, and ferritin in 90% of the patients. The serum ferritin is clearly the only useful serum test for diagnosing iron deficiency in hospitalized patients but is limited by a low sensitivity. The bone marrow examination is the most sensitive test for diagnosing iron deficiency in hospitalized patients. (Key words: Anemia; Iron deficiency; Ferritin; Transferrin) Am J Clin Pathol 1990;93:240-245

Received March 3, 1989; received revised manuscript and accepted for publication July 24, 1989. Address reprint requests to Dr. Burns: Department of Laboratory Medicine, Albert Einstein College of Medicine, 1400 Pelham Parkway South-6N, Bronx, New York 10461.

IRON DEPLETION is mankind's most common nutritional deficiency.13 The most reliable means of diagnosing iron deficiency is by assessing reticuloendothelial (RE) stores on a Prussian Blue-stained bone marrow smear.9 Bone marrow examinations provide data that are sensitive and specific, but such examinations are uncomfortable, expensive, and burdensome to the physician and patient. A biochemical test, or a battery of tests, that accurately diagnoses iron deficiency has long been sought, but the single best laboratory test for making the diagnosis of iron deficiency anemia has not been determined. All of the tests commonly used to screen for iron deficiency have significant limitations. Neither the mean corpuscular volume (MCV)7 nor expert evaluation of a peripheral blood smear14 possesses sufficient sensitivity, and the total iron-binding capacity (TIBC) and transferrin saturation (%Sat) lack specificity.25 The serum ferritin is well correlated with iron stores.18 Low ferritin levels are generally accepted as being diagnostic of iron deficiency,20 but, as an acute phase reactant, ferritin levels may be nonspecifically elevated by inflammatory conditions.8 This may occur even in the presence of iron deficiency, yielding false-negative information.

The combined use of serum tests for the diagnosis of iron deficiency generally does not increase the tests' efficiencies. The exception to this observation is the use of the serum ferritin in conjunction with either the MCV3 or the erythrocyte sedimentation rate.33 However, these algorithms are difficult to use and continue to provide less than ideal sensitivities and specificities. Hospitalized patients are frequently anemic because of multifactorial and complicated problems. It is specifically this group of patients whose clinical conditions adversely influence the accuracy of serum tests for iron deficiency. Paradoxically, the use of these tests in such patients is common. The present study was designed to evaluate the correlations between the four most common serum indicators of iron deficiency and absent bone marrow iron stores in hospitalized patients.

Methods

Records of all bone marrow examinations performed at the affiliated hospitals of the Albert Einstein College of Medicine during a five-year period were reviewed, and the clinical laboratory data on all of these patients were abstracted. A total of 301 anemic patients, simultaneously evaluated with bone marrow aspirates, serum ferritin, serum iron, TIBC, and %Sat, were selected for this study. Bone marrow evaluations were performed within one week of obtaining the serum screening tests for iron status. Hospital policy does not allow elective red blood cell transfusions in nonhemorrhaging patients without a hematologic diagnosis; as such, patients did not receive transfusions before bone marrow examination. Diagnoses varied and included inflammatory and infectious diseases, alcoholism, gastrointestinal hemorrhage, trauma, and solid tumors. A diagnosis of iron deficiency was assigned to all patients who had no visible iron in Prussian Blue stains of their bone marrow aspirates.

Iron Staining

All marrow slides originally reported as lacking stainable iron were reevaluated at the time of this study. A minimum of ten stained particles of marrow squash preparations was examined at 1,000× magnification. Storage iron was distinguished from artifact by identifying and considering only granular-appearing stained material located within marrow spicules. The stability of the iron stain and the reproducibility of quantifying iron stores were prospectively evaluated by staining a series of ten replicate thick marrow smears at weekly intervals with a single batch of hydrochloric acid-potassium ferricyanide stain. These replicates were coded, sorted at random, and simultaneously examined at the end of one month. Marrows were scored for iron content on a scale that ranged from 0 to 4+. Patients whose marrow specimens contained no stainable iron were considered to have iron deficiency.
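The serologic criteria used in this study can be illustrated with a short sketch. The cut-off values are those stated in the abstract; the classify() helper and the example patient values are hypothetical, added here only for illustration.

```python
# Apply the study's serum cut-offs for iron deficiency (values from the
# abstract). The helper and the example patient below are illustrative.

def classify(iron_umol_l, tibc_umol_l, ferritin_ug_l, male):
    """Return which serum tests call iron deficiency for one patient."""
    pct_sat = iron_umol_l / tibc_umol_l   # transferrin saturation = iron / TIBC
    ferritin_cutoff = 25 if male else 13  # ug/L; sex-specific cut-off
    return {
        "iron": iron_umol_l < 11,         # umol/L
        "tibc": tibc_umol_l > 45,         # umol/L
        "pct_sat": pct_sat < 0.20,
        "ferritin": ferritin_ug_l < ferritin_cutoff,
    }

# Hypothetical patient: low iron, high TIBC, low ferritin.
flags = classify(iron_umol_l=9, tibc_umol_l=80, ferritin_ug_l=10, male=False)
print(flags)  # all four criteria met for this example patient
```

Note that %Sat is not an independent measurement: it is the ratio of serum iron to TIBC, which is one reason the tests are correlated.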

Statistics

A serum diagnosis of iron deficiency was accepted on the basis of one or more of the following test results: serum iron < 11 μmol/L, serum ferritin < 13 μg/L for females and < 25 μg/L for males, TIBC > 45 μmol/L, and %Sat < 0.20. Serum test results were classified as true or false on the basis of the corresponding bone marrow iron stores. Sensitivity, specificity, and predictive values were calculated according to Bayes' theorem.15 The data were also assessed as test pairs by combining different tests for Bayesian analysis to determine which combination of tests provided maximum sensitivity and specificity. Correlations between test results were calculated using Spearman's correlation coefficient for ranked data. Attempts to improve the test efficiencies were made by adjusting the cut-off levels for each of the tests. Sensitivity and specificity for the tests were redetermined using values above and below the initial cut-off values based on the normal ranges listed in Table 1.

Results

Accuracy and Reproducibility of Morphologic Examination

Coded replicate slides stained over the course of one month showed excellent correlations when scored for iron content from 0 to 4+ and evaluated as unknowns by a single trained observer. In more than a third of the specimens (21 of 58), no iron was seen in the first seven particles examined; however, stainable iron could be observed when more particles were examined. Based on these preliminary observations, a minimum of ten particles was examined for each patient included in the analysis.

Iron Deficiency

Of the 301 patients included in the study, 157 were male and 144 were female. None of the patients had received parenteral preparations of iron dextran. Forty patients (13.3%) had no stainable iron in their bone marrow and were therefore considered to be iron deficient. When compared with this standard, iron deficiency was correctly diagnosed by the serum ferritin in 90% of patients, by the TIBC in 84%, by the %Sat in 50%, and by serum iron in 41%. Although the serum iron was the most sensitive test

Table 1. Normal Ranges and Criteria for the Serologic Diagnosis of Iron Deficiency

Analyte             Normal Range                   Published Normal Range         Cut-Off
Iron                8-24 μmol/L (46-132 μg/dL)     11-27 μmol/L (60-150 μg/dL)    < 11 μmol/L
TIBC                44-85 μmol/L (246-474 μg/dL)   45-72 μmol/L (250-400 μg/dL)   > 45 μmol/L
%Sat                0.12-0.46                      0.20-0.55                      < 0.20
Ferritin, males                                                                   < 25 μg/L
Ferritin, females                                                                 < 13 μg/L
