“If ChatGPT Writes It, Did the Student Learn It?” - 4 Practical Ways to Keep Essay Assessment Authentic

Generative AI has changed what we can expect student writing to show us; these concise, research-informed ideas help essays remain a window into genuine student learning.
by LSA Learning & Teaching Technology Consultants

Generative AI has made many of us revisit a core question: what exactly are we assessing when we assign essays? Rather than banning AI tools or policing student work with unreliable detectors, approaches that rarely align with the values of a liberal arts classroom, many instructors are redesigning assessment so the learning process is visible. These research-informed strategies invite students to demonstrate their thinking at multiple points in the writing cycle, with the aim of reducing the appeal of AI outsourcing while reinforcing habits of inquiry and reflection.

1. Grade the Process as Well as the Product

Students who hand in a polished final essay with no visible process give us little evidence of how they arrived at it. Building graded milestones into essay assignments, such as a proposal, an annotated bibliography, a draft, and a short reflection, turns those invisible steps into evidence of learning.
A helpful resource here is Process Feedback, a Google Docs extension that can collect short, timed writing samples and compare drafts to document growth over time. Its emphasis is pedagogical rather than punitive, showing how process-based assessment can promote revision habits. Research on formative assessment supports this approach: giving feedback and grades across stages deepens learning and discourages academic dishonesty (Carless & Boud, 2018; Nicol & Macfarlane-Dick, 2006).

2. Short, Timed In-Class Writing as a Diagnostic Baseline

A brief (say, 15–30 minutes) in-class writing task tied to the essay prompt creates a reference point for each student’s voice and reasoning. It’s a quick, low-stakes way to see how students frame arguments before heavy editing or outside help. The writing can become an early checkpoint for feedback and later comparison with the final submission.
Studies suggest that short, timed writing tasks designed to get thoughts down rather than polish a conclusion can effectively measure fluency and cognitive engagement when used as one piece of a broader assessment process (Huot & O'Neill, 2009; Weigle, 2002). While these exercises may feel old-fashioned, they can anchor assessment in observed learning rather than suspicion.

3. Add a Brief Oral “Mini-Defense” or Short Video Reflection

A short conversation or a two-minute video in which students explain their arguments, key source choices, or main revision decisions helps make thinking visible. These oral reflections often reveal how students connect sources, interpret evidence, and refine claims. They are not meant as oral exams but as windows into reasoning that text alone may hide.
Research on oral reflections shows that such dialogues reveal conceptual understanding, promote metacognition, and strengthen ownership of ideas (Sambell et al., 2012). When students articulate why they made certain choices, instructors gain richer insight into learning, and fabricated work becomes easier to spot naturally. To this end, Harmonize, an LSA-supported discussion tool, offers a platform for students to record and upload video. Zoom may also be a good option for short video reflections.

4. Make Peer Review a Structured, Credit-Bearing Step

Peer review becomes more meaningful when guided by a clear rubric focused on argument, evidence, and clarity. Assigning modest credit for both giving and responding to feedback helps students take the process seriously. With sample drafts or model reviews for calibration, peer review turns into a collaborative space where students learn to read and reason together. Peerceptive is another LSA-supported external tool that can be used for peer review.
Empirical studies consistently show that structured peer review enhances writing quality and self-regulation (Double et al., 2020; Huisman et al., 2018). It also models collaborative critique that defines academic communities, an outcome worth highlighting in liberal arts contexts.

Each of these moves is modest on its own, but together they make student learning more visible and meaningful, even in an era when AI can produce polished prose in seconds. In doing so, essays regain what makes them valuable in the first place: an authentic record of learning in progress.
If you would like to speak with an instructional consultant about essay assessment design, or about GenAI in the classroom, you can request a consultation here. We’re always happy to help!


References/Additional Resources:

Carless, D., & Boud, D. (2018). The development of student feedback literacy: Enabling uptake of feedback. Assessment & Evaluation in Higher Education, 43(8), 1315–1325.

Double, K. S., McGrane, J. A., & Hopfenbeck, T. N. (2020). The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educational Psychology Review, 32, 481–509.

Huot, B., & O’Neill, P. (2009). Assessing writing: A critical sourcebook. Bedford/St. Martin’s.

Huisman, B., Saab, N., van den Broek, P., & van Driel, J. (2018). The impact of formative peer feedback on higher education students’ academic writing: A meta-analysis. Assessment & Evaluation in Higher Education, 44(6), 863–880.

Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.

Sambell, K., McDowell, L., & Montgomery, C. (2012). Assessment for Learning in Higher Education (1st ed.). Routledge.

Weigle, S. C. (2002). Assessing writing. Cambridge University Press.

Release Date: 01/29/2026
Category: Learning & Teaching Consulting; Teaching Tips
Tags: Technology Services
