Perceptual and Motor Skills, 1992, 74, 1010.

© Perceptual and Motor Skills 1992

ASSESSING COGNITIVE DOMAIN LEVELS OF INSTRUCTIONAL MATERIALS¹

ROBERT J. BIERSNER

Silver Spring, MD

Summary.—A method is described which improves on readability formulas in accounting for the cognitive complexity of instructional materials.

Ideally, education courses are designed systematically to progress within the cognitive domain from simple levels (i.e., acquisition and retention of basic knowledge) to complex levels (i.e., application and analysis) (1). If a course is improperly designed, students may become overwhelmed by the instructional material, resulting in high attrition rates. The cognitive complexity of instructional material often is assessed in terms of reading grade. Readability formulas usually include a constant that adjusts the variable terms (e.g., sentence length, syllables per word) for comprehension difficulty. These constants typically are derived from reading comprehension tests administered to selected persons who read sample passages.

Instructional materials developed for a Navy correspondence course were used to test the validity of two formulas as measures of cognitive complexity. Materials were 145 written passages and the multiple-choice/true-false question that accompanied each passage, selected in a quasirandom fashion from the course manual. The format of these items had to conform to the readability criteria (2). These items then were classified into nonerror and error samples according to answers provided by a randomly selected sample of 30 Navy sailors who completed the course during off-duty training. The nonerror sample (n = 80) consisted of items answered correctly by the 30 sailors, while the error sample (n = 65) contained items answered incorrectly by at least five of these sailors. The reading grades of the material were based on one of two formulas (2, 3). Cognitive complexity was assessed on a four-point rating scale based on the four cognitive domain levels described by Bloom (1); two college seniors read the material and rated the cognitive complexity required to answer a question using information provided in the passage (1: little thinking required to answer the question; 4: much thinking and organization required to answer the question). The raters, unfamiliar with these materials, rated practice items until both agreed on 14 of 15 items. Cognitive complexity was the average of the two ratings assigned to each of the 145 items. Interrater reliability for the 145 items was 0.84.

The Flesch and the Smith and Kincaid reading grades for the nonerror and error passages and questions did not differ significantly. Cognitive complexity, however, was significantly higher for error items than for nonerror items (error M = 2.53, SD = 0.88; nonerror M = 2.11, SD = 0.76; t(143) = 3.06, p < .005). Results show that reading grades do not account entirely for the cognitive complexity of instructional material. Also, this assessment of cognitive complexity may assist syllabus developers in assessing the cognitive complexity of instructional material and identifying an optimum course sequence.

REFERENCES

1. BLOOM, B. S. (Ed.) (1956) Taxonomy of educational objectives: the classification of educational goals. Handbook I: Cognitive domain. New York: David McKay.
2. FLESCH, R. (1951) How to test readability. New York: Harper.
3. SMITH, E. A., & KINCAID, J. P. (1970) Derivation and validation of the Automated Readability Index for use with technical manuals. Human Factors, 12, 457-464.
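[Editorial note, not part of the original report: the reported t value can be approximately reproduced from the summary statistics above. The minimal sketch below assumes an independent-samples t test with pooled variance, which is consistent with the 143 degrees of freedom implied by the group sizes; it is illustrative only.]

```python
import math

# Summary statistics reported in the article.
error_mean, error_sd, error_n = 2.53, 0.88, 65          # error items
nonerror_mean, nonerror_sd, nonerror_n = 2.11, 0.76, 80  # nonerror items

# Pooled-variance independent-samples t test (assumed design).
df = error_n + nonerror_n - 2                             # 143
pooled_var = ((error_n - 1) * error_sd ** 2 +
              (nonerror_n - 1) * nonerror_sd ** 2) / df
se = math.sqrt(pooled_var * (1 / error_n + 1 / nonerror_n))

t = (error_mean - nonerror_mean) / se
print(f"t({df}) = {t:.2f}")
# Prints roughly t(143) = 3.08; the article reports t(143) = 3.06.
# The small gap is consistent with the means and SDs being rounded
# to two decimals. For df = 143, a t near 3.06 gives two-tailed p < .005.
```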

Accepted May 5, 1992

¹Request reprints from R. J. Biersner, 9914 Capitol View Avenue, Silver Spring, MD 20910
