Multiple‐Choice Tests and Student Understanding: What Is the Connection?
Authors: Mark G. Simkin, William L. Kuechler
Institution: College of Business Administration, University of Nevada, Reno, Nevada 89557
Abstract: Instructors can use both “multiple‐choice” (MC) and “constructed response” (CR) questions (such as short answer, essay, or problem‐solving questions) to evaluate student understanding of course materials and principles. This article begins by discussing the advantages of, and concerns about, these alternative test formats, and reviews the studies conducted to test the hypothesis (perhaps better described as the hope) that MC tests, by themselves, adequately evaluate student understanding of course materials. Despite research from educational psychology demonstrating that MC tests can measure the same levels of student mastery as CR tests, recent studies in specific educational domains find imperfect relationships between these two performance measures. We suggest that a significant confound in prior experiments has been the treatment of MC questions as homogeneous entities when, in fact, MC questions may test widely varying levels of student understanding. The primary contribution of this article is a modified research model for CR/MC research based on knowledge‐level analyses of MC test banks and CR question sets from basic computer language programming. These analyses rest on an operationalization of Bloom's Taxonomy of Learning Goals for the domain, which we use to develop a skills‐focused taxonomy of MC questions. We propose that these analyses readily generalize to similar teaching domains of interest to decision sciences educators, such as modeling and simulation programming.
Keywords: Constructed-Response Tests; Student Assessment; Multiple-Choice Tests