Student AI Bill of Rights
The National Student Legal Defense Network recently released a Student AI Bill of Rights, a document outlining considerations for higher education as generative AI becomes more prevalent in learning and the workplace.
Three weekly tidbits from the Center for Teaching & Learning plus upcoming events.
This 2026 study investigated whether the order in which students complete generative tasks (like generating their own examples of concepts) and retrieval tasks (like cued recall) affects learning outcomes. Using a 3×2 experimental design with 208 university students, the researchers compared three task sequences — generative-before-retrieval, retrieval-before-generative, and restudy-before-generative — under two timing conditions: completing tasks immediately after an initial study phase or completing them two days later. Students were tested one week after finishing the learning tasks on both retention and comprehension of four psychology concepts.
As the semester winds down, resist the urge to fill every remaining class session with new content. Instead, dedicate at least one class period to a low-stakes retrieval activity such as asking students to recall key concepts, work through application problems from memory, or generate their own examples of course ideas without their notes.
From faculty member Ethan Mollick at the Wharton School of Business, here is a collection of prompts you can use with an AI chatbot to help you and your learners get better results. Prompts are grouped into three main categories:
This research examined how 226 undergraduate students learned using concept maps under different conditions, comparing task types (fill-in-the-blanks, shuffled concepts, self-constructed, and summaries) with activity structures (individual only, individual-then-collaborative, and collaborative-then-individual). The study measured learning outcomes through comprehension and recall tests while analyzing nearly 4,200 verbal exchanges during collaborative activities. Results revealed a significant interaction between task type and activity structure: students who individually self-constructed concept maps and then discussed them collaboratively (I+C) achieved the strongest learning outcomes, particularly for delayed recall.
Reading content in Brightspace does not guarantee your learners will remember all of it. For the material to stick, students have to actively do something with the information they are reading.
You can prompt students to retrieve an important piece of information, explain a concept as applied in a different context, or consider how they might use a new skill in their practice. These activities take mental energy, and students are likely to move along without doing them unless you grab their attention and make the content interactive. But how?
This free, short, self-paced course was co-created with the American Association of Colleges and Universities, and it includes short videos, activities, and discussion prompts. When you complete it, you will be able to:
This open-access systematic literature review, published in Frontiers in Education, synthesizes current research on HyFlex (Hybrid-Flexible) course models in higher education — a format in which students choose, session by session, whether to attend in person, join synchronously online, or engage asynchronously. The review draws on studies from across institutional contexts to examine how this radical flexibility affects student engagement, attendance, and learning outcomes. Rather than advocating for one modality over another, the authors investigate what conditions make flexible course designs succeed or fail, and the findings challenge some widely held assumptions about what students actually do when given a choice.
We routinely ask students to use a formal citation style when referencing sources in their work, but have you ever explicitly explained to them why?
In a post-truth information landscape, it is increasingly difficult to distinguish credible information from cherry-picked facts and polished, convincing interpretations, especially as generative AI makes sophisticated-sounding misinformation easier to produce and harder to detect. Now more than ever, our students need to be able to question the veracity of claims and follow evidence back to its source. Citation practices are a foundational skill for doing exactly that, yet we often assign them without explanation.
Yesterday, the U.S. Department of Labor announced the launch of “Make America AI-Ready,” a free artificial intelligence literacy course that will help American workers learn the basics of AI simply by texting “READY” to 20202.