Grading
Gradescope automatically grades bubble sheet assignments as soon as they have been uploaded. This may take a few minutes, depending on the number of submissions.
Once grading is complete for all submissions, you can publish the grades to your students from the Review Grades page.
Reviewing uncertain marks
Occasionally, Gradescope needs more information to confidently determine what a student selected as their answer. This can happen when a bubble was not fully shaded, or when the student changed their mind.
To review any uncertain marks:
- Access the Grade Submissions page of your assignment.
- Select the Uncertain Marks notifications at the top of the page.
- A modal listing all instances of uncertain marks will appear with an image of each attempted answer. Select the appropriate bubble option for each answer.
- Select Confirm All Marks.
Manual grading
After your bubble sheet has been auto-graded, you can make manual edits and adjustments if needed. You can still:
- award custom partial credit to certain answer options
- give more detailed feedback
- adjust the rubric in any way
To begin manual edits or adjustments:
- Access the Grade Submissions dashboard.
- Select the question where you would like to amend or edit the grading.
- Make any changes to the rubric for that question. Rubric changes apply to every student who has that version and rubric item applied.
Reviewing grades
Once grading is complete for all submissions, you can publish the grades to your students from the Review Grades page. If your assignment has multiple versions, you can view, publish, and download grades for a specific version by selecting that version's tab at the top of the Review Grades page. Select the All tab to act on all versions at once.
For individual versions, you will also have the option to Export Evaluations and Export Submissions. This cannot be done from the All tab for all versions at once.
If your bubble sheet assignment has multiple versions and you receive a Regrade Request, you can refer to the Versions column on the Regrade Requests page to see the version for each student.
Student Answer Report
A CSV report of every student's answer and score for each question can be exported by selecting the Download Responses button.
The student answer report can be useful for completing analysis, such as curving or comparing scores for corresponding questions across versions.
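As a sketch of that kind of analysis, the snippet below computes the fraction of students answering each question correctly from an exported report. The column names used here (Question, Answer, Correct Answer) are assumptions for illustration, not the export's documented headers; check the header row of your actual Download Responses file before reusing this.

```python
import csv
from collections import defaultdict

def correct_rates(path):
    """Fraction of students answering each question correctly.

    Assumes a long-format CSV with 'Question', 'Answer', and
    'Correct Answer' columns -- hypothetical names, so check the
    header of your actual Download Responses export first.
    """
    totals = defaultdict(int)
    correct = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            q = row["Question"]
            totals[q] += 1
            correct[q] += row["Answer"] == row["Correct Answer"]
    return {q: correct[q] / totals[q] for q in totals}
```

The same per-question rates can then feed curving decisions or cross-version comparisons.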
Posting grades to your LMS
If your institution has an LMS (Learning Management System) integration enabled, you can post grades from your Gradescope assignment to the LMS via the Post Grades to [LMS Name] button on the Review Grades page.
If your bubble sheet assignment has more than one version, you can either post all grades from every version to a single LMS assignment, or post the grades of each version to a separate LMS assignment. Before posting your grades, link your Gradescope assignment to a single LMS assignment, or link each version to a separate LMS assignment.
- To post all grades to a single assignment or gradebook column in your LMS, select the Post Grades to [LMS] button on the All tab of the assignment’s Review Grades page.
- To post the grades for each version to a separate assignment in the LMS, select each version’s tab on the Review Grades page and click Post Grades to [LMS] on each tab.
For more information on using Gradescope with your LMS and determining your LTI version, please refer to the LMS Workflow section of the help center.
Statistics
We gather and correlate data from your students’ submissions to provide you with informative statistics regarding your assignment.
Bubble sheet assignments have statistics and insights which are explained in the guidance below; learn more about Gradescope’s general statistics and tags.
Does your assignment have multiple versions? The reliability and standard error statistics are calculated at the version level; an overall assignment statistic is not provided under the All tab.
Reliability
Reliability (Cronbach’s Alpha) - Gradescope’s Reliability score is calculated using the Cronbach’s Alpha equation. Reliability measures how likely it is that an assignment’s achieved grades reflect your students’ subject knowledge. After evaluating all student answers to all questions within the assignment, a score between 0 and 1 is displayed.
A score above 0.5 typically indicates a reliable assignment.
Don’t worry if your assignment did not achieve a perfect score of 1; even research- and publication-level exams aim for a reliability score of around 0.8.
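For intuition, the standard Cronbach’s Alpha equation can be computed from per-question scores. The sketch below is a minimal illustration of that equation, not Gradescope’s implementation:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's Alpha from per-question score lists.

    item_scores: one list per question, each holding one
    score per student (same student order in every list).
    """
    k = len(item_scores)                          # number of questions
    totals = [sum(s) for s in zip(*item_scores)]  # per-student total scores
    item_var = sum(pvariance(q) for q in item_scores)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

Alpha rises when questions vary together across students (per-student totals spread out more than individual items), which is why adding questions that probe the same subject tends to improve the score.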
Want to improve your reliability score? There are a few things you can consider to improve assignment reliability.
- Cover a range of topics on the assignment subject
- Increase the number of questions
- Use a variety of question types
- Review the discriminatory score for each of your assignment’s questions on the Item Analysis page.
Standard Deviation
Standard deviation (STD DEV) illustrates how individual students' grades vary compared to the assignment's average.
- Lower STD DEV - Students have achieved very similar scores. If shown on a graph, the grades would appear in a cluster near the assignment’s recorded mean.
- Higher STD DEV - Student grades are widely varied. If shown on a graph, the grades would be spread between the lowest and highest achieved grade.
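As a quick illustration of the two cases above (the grades are made up), a tightly clustered set yields a lower standard deviation than a widely spread one with the same mean:

```python
from statistics import mean, pstdev

clustered = [68, 70, 71, 69, 72]  # grades near the mean -> lower STD DEV
spread = [40, 55, 70, 85, 100]    # grades across the range -> higher STD DEV

print(mean(clustered), round(pstdev(clustered), 2))  # 70 1.41
print(mean(spread), round(pstdev(spread), 2))        # 70 21.21
```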
Standard Error of Measurement
Standard error of measurement (SEM) assesses an assignment’s precision. Calculated from the standard deviation, the standard error determines a student’s highest and lowest hypothetical grades by adding its value to, and subtracting it from, their achieved grade.
A student’s hypothetical grade range estimates the results the student would achieve if they took the assignment repeatedly. A higher standard error, and therefore a wider hypothetical grade range, can mean the assignment is less accurate in portraying student knowledge.
For example, if the standard error were 3.0 and a student achieved a score of 60, the student’s hypothetical grades would range from 57 to 63.
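The worked example can be checked directly. The `sem` helper below uses one common textbook formulation, SEM = SD × √(1 − reliability); this formula is an assumption here, since the exact calculation Gradescope uses is not stated:

```python
from math import sqrt

def sem(sd, reliability):
    """Common formulation (assumed): SEM = SD * sqrt(1 - reliability)."""
    return sd * sqrt(1 - reliability)

def hypothetical_range(score, std_error):
    """Subtract and add the standard error from the achieved grade."""
    return score - std_error, score + std_error

# The example above: standard error 3.0, achieved score 60.
print(hypothetical_range(60, 3.0))  # (57.0, 63.0)
```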
Item Analysis Report
The item analysis report provides a breakdown of how your students answered each question as well as investigatory insights into each question’s discriminatory score.
Each question is listed along with every potential answer for that question. The percentage of students that selected each answer is provided. The percentages shown in green highlight that question’s correct answer; any percentages shown in red indicate that more students selected an incorrect answer than the correct one.
Discriminatory score
The discriminatory score uses the point-biserial correlation coefficient to calculate the correlation between students answering a question correctly and their overall assignment score. Any questions receiving a discriminatory score below 0.2 will be automatically flagged in the insights panel.
Scores below 0.20 indicate that a question may not accurately distinguish between your high-performing and low-performing students. This could be because:
- the question is too easy or too difficult.
- the wording of the question is difficult to understand or misleading.
If any questions receive either 100% or 0% correct responses, a discriminatory score cannot be calculated and a value of N/A will be shown.
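The point-biserial calculation can be sketched as follows; this is an illustration of the standard formula, not Gradescope’s code, and it returns None for the undefined 100%/0% case described above:

```python
from math import sqrt

def discriminatory_score(correct, totals):
    """Point-biserial correlation between a 0/1 'answered correctly'
    flag and overall assignment scores.

    Returns None when every student (or no student) answered
    correctly, mirroring the N/A case.
    """
    n = len(totals)
    p = sum(correct) / n                    # proportion answering correctly
    if p in (0, 1):
        return None                         # correlation is undefined
    mean1 = sum(t for c, t in zip(correct, totals) if c) / (p * n)
    mean0 = sum(t for c, t in zip(correct, totals) if not c) / ((1 - p) * n)
    mean_all = sum(totals) / n
    sd = sqrt(sum((t - mean_all) ** 2 for t in totals) / n)  # population SD
    return (mean1 - mean0) / sd * sqrt(p * (1 - p))
```

A positive score means students who answered the question correctly also tended to score higher overall; values near zero (below the 0.20 flag threshold) mean the question tells you little about overall performance.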
Insights panel
The insights panel automatically flags any questions that have received a discriminatory score below 0.20.
Select a flagged question to investigate its question difficulty. The question difficulty is determined by the percentage of students that answered it correctly.
- Hard - less than 50%
- Medium - 50%-84%
- Easy - 85%-100%
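The difficulty bands above map directly to percentage thresholds; a small helper makes the boundaries explicit (the function name is illustrative):

```python
def difficulty_band(pct_correct):
    """Map the percentage of correct answers to a difficulty band:
    Hard below 50%, Medium 50-84%, Easy 85-100%."""
    if pct_correct < 50:
        return "Hard"
    if pct_correct < 85:
        return "Medium"
    return "Easy"
```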
To provide further insight into the question’s difficulty, we also show how the top-scoring and bottom-scoring 27% of students on the assignment performed on that question.
After investigating the provided insights into the flagged question, you can access the question’s rubric or the assignment’s answer key to make any adjustments.