Adventures in Corequisite STEM Calculus: Grading for Growth



Kelly Spoon

Kelly Spoon is a Professor of Mathematics at San Diego Mesa College in the San Diego Community College District. She has been at the forefront of her institution’s response to AB705 and AB1705, playing a key role in coordinating Mesa’s corequisite classes for Intermediate Algebra, Statistics, and, most recently, Calculus.

Welcome back! In this blog series we’ve been exploring instructional practices that help students with varying levels of preparation to succeed in a corequisite-supported STEM calculus class. This post is about assessment and grading, which brings us back full circle to our discussion of my syllabus. In that first post, I wrote about the importance of building your students’ trust in your commitment to them and to their learning. Nowhere is this more clearly communicated than through your grading policies and assessment practices. 

Before we delve into the specifics, let’s take a look at the grade distribution from my Fall 2023 supported calculus class: 31 of my 39 students (80%) passed the course, compared to our campus pass rate of 60%, and 29 of those 31 earned an A or B. 

Are these results evidence of an easy class or evidence of a grading and assessment schema that fosters persistence and learning? In this post, I will discuss how I maintain rigor in my supported calculus course, motivate students to persist, and ensure that the grades they earn reflect their learning. 

In both my unsupported and supported calculus classes, I have made two big changes to my approach to assessment and grading: (1) the grade is based on evidence of learning, not on effort, and (2) I grade for growth, allowing opportunities for skill attainment over time. 

I’ll give an overview of my approach, share trouble-shooting strategies, and end with a list of resources that informed and motivated me to change the way I grade.

My goal is to eventually move toward ungrading practices. As an interim step, I am adjusting my assessment practices to determine how best to gauge a student’s understanding. So for now, I still calculate the course grade by weighting assignment types. Here is my current grading breakdown: exams (two midterms and a final) are 40% of the grade, weekly written quizzes are 30%, frequent online Knowledge Checks are 20%, and a portfolio is 10%.
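As a minimal sketch, the weighted-mean calculation behind that breakdown looks like the following. The category scores here are hypothetical, not from a real student:

```python
# Grade category weights from the breakdown above.
weights = {"exams": 0.40, "quizzes": 0.30, "knowledge_checks": 0.20, "portfolio": 0.10}

# Hypothetical category averages (0-100) for one student.
scores = {"exams": 82, "quizzes": 95, "knowledge_checks": 100, "portfolio": 90}

# Weighted mean: each category's average scaled by its weight.
course_grade = sum(weights[c] * scores[c] for c in weights)
print(round(course_grade, 1))  # 90.3
```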

Even though the grade is based on a weighted mean, I have taken the first step toward standards-based grading: defining the standards. I participated in a Faculty Inquiry Group (FIG) on Standards-Based Grading in Calculus where my colleagues and I identified a core set of learning objectives. We adapted Maria Andersen’s ESIL framework to evaluate the depth of engagement required for each topic in our department’s course outline.

  • Existence: Does the learner know it exists?
  • Supported: Can the learner do it with help from notes and peers?
  • Independent: Can the learner do it independently?
  • Lifetime: Can the learner maintain the skills for success in future STEM courses?

While we did not come to a full consensus, the conversations clarified for me how I would assess each topic. For example, a topic at the Existence level might only occur within a group activity in class, such as a Card Sort to derive d/dx sin(x) using the limit definition. A topic at the Supported level might appear in a class activity and be assessed with a challenging problem in the portfolio, such as a related rates problem. A topic at the Independent level might be assessed (and mastered) on a weekly quiz and on an exam, such as finding the derivative of a composition of functions. A topic at the Lifetime level might be assessed (and mastered) on quizzes, exams, and the final exam, such as using u-substitution to evaluate an integral. 

To keep my course rigorous, I make sure the course grade reflects the student’s learning. I do not assign points for attendance or for effort on homework. I do not give extra credit. The grade is based on evidence of the student’s mastery of calculus skills. Written assessments performed in front of me (exams and quizzes) comprise 70% of the course grade. I have also normed the difficulty of my exams with others in my department and with colleagues at other colleges. 

Tip: For effective test norming, I recommend two activities. First, each participant brings an anonymized exam to the session. Working alone or with a partner, we analyze these exams for commonalities and differences in question types and concepts, fostering a ‘notice and wonder’ approach that lets participants discuss problem types and levels of difficulty. Second, in a scoring exercise I adapted from my time as an adjunct at Grossmont College, faculty score a flawed student response on a 10-point scale. We then discuss the varied scores, displayed on a dot plot, to calibrate grading standards. This sparks insightful discussions that guide faculty toward more consistent grading practices.
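If you want to produce that dot plot quickly without any plotting software, a text version is enough to reveal the spread before the discussion. This is just a sketch; the scores below are hypothetical, not from an actual norming session:

```python
from collections import Counter

# Hypothetical scores (out of 10) that nine faculty assigned
# to the same flawed student response.
faculty_scores = [4, 5, 5, 6, 6, 6, 7, 7, 8]

counts = Counter(faculty_scores)

# Print a quick text dot plot: one row per possible score, one mark per vote.
for value in range(0, 11):
    print(f"{value:2d} | " + "*" * counts.get(value, 0))
```

The visible spread (here, scores from 4 to 8 for the same response) is exactly what prompts the calibration conversation.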

Now to the idea of grading for growth: 75% of the course grade is based on assessments that students can redo to varying extents. This is an important contrast to classes where a single poor exam score can significantly alter the final grade in the course – and students aren’t given the opportunity to show if they later learned the material. Redo opportunities are available in my class as follows: 

  • The online Knowledge Checks (20% of the grade) are DeltaMath problems that are automatically graded as correct or incorrect. This platform provides a variety of question types that challenge students and prevent rote memorization. When a student misses a problem, they are required to successfully solve three versions of the problem. The platform provides support with worked examples. 
  • Weekly quizzes (30% of the grade) are keyed to the set of core standards that were co-developed in the FIG. Each quiz covers two standards, one problem per standard. I grade these quickly on a scale of 0-2, with a 2 representing a perfect solution or a complete solution with only a transcription error or other minor mistake. Students may take each quiz up to three times to improve their score.
  • The portfolio (10% of the grade) is a collection of the student’s work on challenging problems that is completed in sections throughout the semester. I give formative feedback on these problems and students can correct their work before submitting their portfolio at the end of the semester. 
  • Finally, I give two midterms (each is 15% of the grade), one on techniques of differentiation and the other on integration. Students have the option of retaking one midterm on the day of the final exam. 
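The redo rules above can be sketched as follows. All scores are hypothetical, and I am assuming the higher score is kept on a midterm retake, which the policy does not spell out:

```python
# Quizzes: students get up to three attempts per quiz on the 0-2 scale;
# the best attempt counts.
quiz_attempts = [1, 1, 2]
quiz_score = max(quiz_attempts)

# Midterms: a student may retake one midterm on finals day.
# Assumption: the higher of the original and retake scores is kept.
midterms = {"midterm1": 68, "midterm2": 85}
retaken, retake_score = "midterm1", 88
midterms[retaken] = max(midterms[retaken], retake_score)

print(quiz_score, midterms)  # 2 {'midterm1': 88, 'midterm2': 85}
```

The key design choice is that an early stumble (a 1 on a quiz, a 68 on a midterm) can be fully replaced by later evidence of mastery rather than permanently dragging the grade down.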

You might wonder how I manage the logistics of multiple retakes, including class time, grading, and assessment creation. Or perhaps you’re concerned that allowing reassessments might lead students to simply mimic answers from previous attempts. My colleagues and I have developed effective strategies to address these challenges, ensuring that students demonstrate genuine mastery over time. I will discuss these strategies next. 

My students can retake each in-class quiz up to three times. This gives faculty pause because of the potential for increased workload. Here are some ways to mitigate this, along with lessons learned after three semesters.

Scheduling reassessments

Within the supported calculus class, I have eight hours of contact time each week. I dedicate the final 45 minutes of my Friday class meeting to weekly quizzes. Students start their weekly quiz at 11am and have until 11:45 to work on that quiz and any retakes that are available. I also have office hours directly after class on Fridays, so I can extend that time as needed.


Grading reassessments
I grade on a scale of 0-2, where 0 indicates no understanding, 1 shows partial understanding, and 2 reflects correct understanding. Since I do not spend time justifying scores or deciding how many points to deduct for errors or to award for correct steps, I can grade all quizzes within 30 minutes, including entering scores into Canvas.

I don’t get hung up on the “1” score, which can represent a wide range of errors. In my gradebook, a 1 is marked as ‘🟡 Almost’. Typically, the student has shown they understand the calculus idea of the standard, such as applying the product rule, but they may have struggled with notation (e.g., a dangling d/dx =), made an algebraic slip, or made a mistake in taking a derivative. It can feel unnatural to give all of these errors the same score, but the score signals to students that they are on the right track and encourages them to address minor errors while still holding them to a high standard.

Creating Reassessments

Creating reassessments efficiently is a common concern among faculty considering this approach. To address this, I use several tools that significantly reduce the time required. I choose tools that: 

  • Have questions categorized by standards similar to those I am assessing.
  • Have a variety of question types per standard, so students won’t be able to mimic their way through the reassessment. 

I always try to make reassessments more challenging (or significantly different from the previous version) to encourage students to prepare appropriately for the first attempt. Here is more information on some of my favorite tools for generating quizzes.


Developed by Steven Clontz, CheckIt is a free, open platform that offers randomized exercises. As part of the Team-Based Inquiry Learning (TBIL) initiative, an extensive question bank in CheckIt was built for Calculus and Linear Algebra. The questions in the Calculus TBIL Exercise Bank are keyed to clearly defined standards. Features like ‘LMS Export’ and ‘Assessment Builder’ allow for exporting multiple versions of questions (up to 999) as a QTI file or refining a question until it meets specific needs before exporting to Overleaf or your favorite LaTeX compiler for further editing. You can see an example learning objective and set of questions on continuity below.


DeltaMath is a favorite homework platform of many high school math teachers. I personally love the curve-sketching capabilities and that it is free for students and largely error-free. With the paid version ($170 per year at the individual level), I can print assignments to PDF and make reassessments with DeltaMath as well. The paid version also allows a bit more control, as you can regenerate particular outcomes or the entire quiz.


Lastly, giving students the opportunity to make quiz questions and answer keys is a great class activity (and a way to lessen your workload). Groups can also quiz each other.

Encouraging students to prepare for a reassessment

A common challenge my colleagues and I have encountered is students attempting reassessments without sufficient preparation, which increases our grading workload. To address this, we’ve implemented several strategies to ensure students are better prepared and to streamline our assessment process:

Limiting Attempts: We allow up to three reassessment attempts per quiz, with a firm closing date at the end of the unit. This structure encourages students to take initial attempts more seriously and to use reassessments as genuine opportunities for improvement rather than just trying again to see what happens.

Scheduling Reassessments: All reassessments are scheduled on Fridays. This regularity helps students plan their study time effectively and simplifies our weekly planning. Some practitioners limit reassessment days further, but I find a weekly opportunity strikes the right balance between accessibility and manageability.

Entry Requirements: To qualify for a reassessment, students must complete a preparatory task, such as a remediation worksheet or a redo of the original quiz problem along with a similar new problem. When students understand that they need to demonstrate readiness to retake a quiz, they tend to prepare more diligently, resulting in more substantive learning and less frequent need for multiple reassessments. 

Here are some resources that have motivated me to change my grading policies and practices. 

If you’re interested in how folks are implementing alternative grading methods, check out the resources above. I also strongly suggest attending the 2024 Grading Conference, which is happening this week! Registration closes June 10!

Do you have any questions that haven’t been answered in this series or during Thursday community hours? Fill out my survey to ask a question for the final installment of Adventures in Coreq STEM Calculus.
