
Grading a Programming Assignment


For programming assignments, instructors can autograde the code, grade it manually with a rubric and in-line comments, or use a combination of autograding and manual grading.



Creating and using an Autograder file 

With our autograder platform, you have full flexibility in setting up whatever language, compilers, libraries, or other dependencies you need. You provide us with a setup script and an autograder script, along with whatever supporting code you need, and we manage accepting student submissions, running your autograder at scale, and distributing the results back to students and to you.
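Per the autograder specifications, the zip file you upload contains (at minimum) a `setup.sh` that installs dependencies and a `run_autograder` script that runs your tests. A minimal sketch of a `setup.sh`, assuming a Python-based autograder; the package names here are only examples:

```shell
#!/usr/bin/env bash
# setup.sh -- runs once while Gradescope builds your autograder image.
# Install whatever compilers, libraries, or other dependencies your
# tests need; the packages below are examples for a Python autograder.
apt-get update
apt-get install -y python3 python3-pip
pip3 install pytest
```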

As an instructor, you can create a new programming assignment on Gradescope and upload your autograder zip file, built to our autograder specifications. Your code produces output in the format that we request. Students submit to Gradescope and have their work evaluated on demand. They can submit as many times as they want and get results back as soon as the autograder finishes running.
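The output the harness reads back is a `results.json` file. A minimal sketch, assuming the field names from the Gradescope autograder specifications (`tests`, `name`, `score`, `max_score`, `output`); the `check_add` function is a hypothetical placeholder for your own checks of student code:

```python
import json

def check_add():
    """Hypothetical check standing in for a real test of student code."""
    return 1 + 1 == 2

def main():
    passed = check_add()
    results = {
        "tests": [
            {
                "name": "addition works",
                "score": 1.0 if passed else 0.0,
                "max_score": 1.0,
                "output": "Passed" if passed else "Expected 2",
            }
        ]
    }
    # On Gradescope the harness reads /autograder/results/results.json;
    # we write to the current directory here for illustration.
    with open("results.json", "w") as f:
        json.dump(results, f, indent=2)

if __name__ == "__main__":
    main()
```

Per-test scores are summed by the platform, so a top-level `"score"` is optional when every test reports its own.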

If you update your autograder after students have submitted, you can rerun it on all submissions at once from the assignment’s Manage Submissions page.

A student's autograder score is determined by their most recent submission.

[Image: Submission interface showing autograder test results and submitted code]

If you want to learn more about setting up programming assignments, check out our Gradescope Autograder Documentation and Autograder Best Practices.


Manual Grading

You can also grade students’ code manually using the Gradescope grading interface. To enable manual grading, check the “Enable Manual Grading” box when creating a programming assignment. You can also enable this on the settings page for an existing assignment. Your students can then upload their code, and you can grade it from the Grade Submissions tab, using a rubric and in-line comments. Note that students won’t be able to see any scores or feedback from manual grading until you Publish Grades from the assignment’s Review Grades page. 

If manual grading is enabled for a programming assignment, you can indicate one or more manually graded questions on the assignment’s Edit Outline page. You can have both autograded and manually graded components on one programming assignment, or you can skip the Configure Autograder step, set the Autograder question on the Edit Outline page to be worth 0 points, and only use manual grading to grade students’ code. Rerunning the autograder on any (or all) submissions will preserve any manual grading.

Regardless of whether you’re using autograding, manual grading, or a combination of the two, once submissions are in you can download all students’ code submissions in bulk: open the assignment, select Review Grades (left side panel), then Export Submissions (lower right corner).


Regrading Submissions

If the autograder score for programming submissions needs to be recomputed or rescored, you’ll find a Rerun Autograder button on the assignment’s Manage Submissions page. Rerunning the autograder rescores only each student’s most recent, active submission. Any manual grading will remain unaffected.


Using Code Similarity

Code Similarity is available for courses with an Institutional license.

Code Similarity is a tool to help determine how similar students’ code is. It does not automatically detect plagiarism but rather shows you how similar two programs are to one another.
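Gradescope doesn’t publish the matching algorithm here, but as a rough intuition for “how similar two programs are,” here is a toy illustration using Python’s standard-library `difflib`. This is not Gradescope’s actual method, which compares tokenized code rather than raw text:

```python
from difflib import SequenceMatcher

# Two near-identical functions that differ only in variable names.
a = "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"
b = "def total(ys):\n    t = 0\n    for y in ys:\n        t += y\n    return t\n"

# ratio() returns a similarity score between 0.0 and 1.0; renaming
# variables barely lowers it, which is why such edits rarely hide reuse.
ratio = SequenceMatcher(None, a, b).ratio()
print(f"similarity: {ratio:.2f}")
```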

Compatible programming languages 

Though programming assignments and the code autograder can process any programming language, Gradescope Code Similarity can currently review only the following languages for similarity:

C, C++, C#, F#, Go, Java, JavaScript, Matlab, MIPS, ML, Python, and R.


Generating a Code Similarity report

  1. To get started with Code Similarity, open the programming assignment and find the Review Similarity step in the side panel menu.
[Image: Review Similarity step in the left sidebar]
  2. On the Review Similarity page:
    1. Choose a programming language. The report will compare student submissions across that language’s file types and ignore the rest.
    2. (Optional) Upload starter code (or template files) so that similarity matches will ignore student code that matches those files.
    3. Select Generate Report.
[Image: Choosing a language and uploading starter code]

Viewing Code Similarity Reports

Once the report has been generated, you’ll see a list of students and their submitted files, each paired with the other student’s submission that was most similar to that file.

By default, the results are sorted by the length of the code that was similar across the pair of submissions, measured in tokens. A token is a single element of code, e.g. a keyword, identifier, operator, or literal.
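To make “token” concrete, here is Python’s own standard-library tokenizer splitting one line of code into identifiers, operators, and literals. Gradescope’s tokenizers are language-specific and internal; this only illustrates the idea:

```python
import io
import tokenize

# One line of code becomes five tokens: two identifiers, two operators,
# and one numeric literal. Whitespace is not a token.
src = "total = price * 2"
tokens = [
    tok.string
    for tok in tokenize.generate_tokens(io.StringIO(src).readline)
    if tok.type in (tokenize.NAME, tokenize.OP, tokenize.NUMBER)
]
print(tokens)  # ['total', '=', 'price', '*', '2']
```

Counting matched tokens rather than matched characters means renaming variables or reformatting whitespace doesn’t change a passage’s measured length.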

[Image: Student-submitted files for a sample submission, each listed along with the file that's most similar to it]

Click a student’s file to open the similarity page, which shows the selected file side by side with its most similar matching file from another student in the course.


In the right sidebar, you can jump to different matching blocks of code by using the numbered buttons. You can also see which students had the next most similar files.


Printing a student's similarity report

On a student's similarity report page, you can select Print Report to print a document containing the code and all highlighted areas where the two code files are similar.
