Exploring the Impact of Required Justifications in Multiple-Choice Elaboration Questions on Student Experiences and Performance
Wed, 04 Feb 2026
This study investigated a hybrid assessment format called Multiple-Choice with Elaboration Questions (MCEQs). In these questions, students must not only select a multiple-choice answer but also justify their choice in writing. The research was conducted across four sections of an upper-division psychology research methods course at a large public university. The researcher found that students performed better on questions that required elaboration than on traditional multiple-choice questions, and students also rated them more positively, viewing them as better assessments of their learning. The study suggests that MCEQs can capture deeper student understanding than traditional multiple-choice items and may support better memory and engagement with the material.

This research offers several actionable insights for faculty:

  1. MCEQs combine efficiency and depth: they retain the quick grading benefits of multiple-choice items while also requiring students to demonstrate their reasoning.
  2. Requiring justification can help students engage more critically with content, potentially improving conceptual understanding and reducing surface guessing.
  3. Written responses provide richer data for instructors to identify student misconceptions and tailor instruction or feedback accordingly.
  4. Students may view assessments as fairer and more authentic when they can explain their answers, which might support motivation and reduce test anxiety.
  5. Implementing MCEQs requires careful rubric design and possibly more grading time than standard multiple-choice items. However, the educational payoff may justify the added effort.

Read the full article here:

Overono, A. (2025). Exploring the impact of required justifications in multiple-choice elaboration questions on student experiences and performance. Journal of the Scholarship of Teaching and Learning, 25(4). 

Eight Ways to Promote Generative Learning
Wed, 21 Jan 2026
Fiorella and Mayer argue that learning is generative—students learn best when they actively make sense of new information by selecting, organizing, and integrating it with prior knowledge. They synthesize research identifying eight evidence-based strategies that consistently promote deeper understanding and transfer across contexts. These strategies shift learners from passive reception to active sense-making.

The 8 generative learning strategies (with applied examples)

  1. Summarizing: Example (History, face-to-face): After a mini-lecture on Reconstruction, students write a 3-sentence summary explaining its goals, challenges, and outcomes in their own words.
  2. Mapping (concept maps / graphic organizers): Example (Biology, online asynchronous): Students create a concept map linking cellular respiration stages (glycolysis, Krebs cycle, ETC) using a shared digital mapping tool.
  3. Drawing: Example (Physics, hybrid lab): Students draw a free-body diagram of forces acting on an object before running a simulation on motion.
  4. Imagining (mental imagery): Example (Anatomy & Physiology, online synchronous): While reading about blood flow, students mentally visualize the path of oxygenated blood through the heart chambers, guided by instructor prompts.
  5. Self-Testing (retrieval practice): Example (Psychology, online asynchronous): Students complete low-stakes quiz questions from memory (no notes) after a module on classical conditioning, followed by immediate feedback.
  6. Self-Explaining: Example (Mathematics, face-to-face): While solving worked examples, students explain aloud or in writing why each step is taken in solving a system of equations.
  7. Teaching (explaining to others): Example (Education, hybrid): Students record a short video teaching a learning theory (e.g., constructivism) to a hypothetical first-year teacher audience.
  8. Enacting (gestures or physical manipulation): Example (Chemistry, in-person lab): Students use hand gestures to model electron movement during covalent bonding before writing structural formulas.

Key Takeaway: The most powerful learning gains occur not from what instructors present, but from what learners actively generate—and the effectiveness of each strategy depends on matching it to the content, learner prior knowledge, and learning context. Thoughtful selection and scaffolding of generative strategies can reliably improve comprehension and transfer.

Read the full article here:

Fiorella, L., & Mayer, R. E. (2016). Eight ways to promote generative learning. Educational Psychology Review, 28(4), 717–741.
