Teacher Research Project


As part of my mission to develop a reflective and data-driven practice, I conducted a Teacher Research Project on the question: "Can the explicit practice of metacognition improve students' performance on multiple-choice assessments?" The project also investigated how to identify causes of errors made on multiple-choice assessments, and how to use such assessments formatively to improve student understanding and testing skills.


Download the PowerPoint presentation for this project here: Pang TRP print.ppt


The impetus for this project arose when I discovered an apparent disconnect between the level of understanding I saw in my classroom on a daily basis and the test results that emerged when I gave my students a cumulative, mostly multiple-choice midterm exam. Despite showing a good grasp of concepts on formative assessments, many students had a hard time choosing the right answer on multiple-choice items. It was also hard for me to pinpoint the reasons behind their poor performance, reasons that may have differed between students and between individual questions. With these issues in mind, I searched the primary literature for ideas on how to develop multiple-choice assessments that would teach my students test-taking skills while simultaneously allowing me to give specific formative feedback.


Download my annotated bibliography here: TRP annotated bibliography.doc


Inspired by the variety of modified multiple-choice assessments that have been attempted, and by the idea that metacognitive skills must be explicitly taught and contextualized, I developed "more than multiple choice" quizzes in which each "authentic" question is accompanied by prompts designed to cue the recall or synthesis of certain ideas and/or the use of certain thought processes or strategies. Students must still rely on their own content knowledge, but they are guided to read and think carefully and in discrete steps, and to take advantage of the testing format to maximize their chances of answering correctly. In addition, seeing students' reasoning gave me the opportunity to provide feedback specifically targeting each student's individual misconceptions or mistakes.


Download my first "more than multiple choice" quiz here: cell quiz.doc


After a few iterations of quizzes with such "scaffolding," I gave my students a similar quiz in which they could free-write explanations of the thought processes or strategies they used to answer each question. In analyzing the results of this quiz, I found that students who consistently wrote out metacognitive explanations scored higher on average than those who did not. Since both groups were heterogeneous mixtures of normally high-performing and low-performing students and had similar averages on the midterm exam, these results suggested a positive link between the explicit practice of metacognition and improved performance on multiple-choice assessments. This finding was corroborated by the many positive reactions I received from students, who said that the format helped them think more carefully and choose the right answer more often.


In the future, I hope to provide my students with an improved and extended course of "more than multiple choice" assessments, repeatedly and explicitly training specific strategies to increase their transfer to future and authentic testing situations. In this age of standardized testing, the ability to demonstrate understanding on a multiple-choice assessment is an essential academic skill, and my approach shows promise as a valuable part of an education that helps students develop higher-order reasoning skills and strategic test-taking skills simultaneously.
