Creating and Using Rubrics
Rubrics are both assessment tools for faculty and learning tools for students, and they can ease anxiety about the grading process for both parties. Rubrics lay out specific criteria and performance expectations for an assignment. They help students and instructors stay focused on those expectations and be more confident in their work as a result. Creating rubrics does require a substantial time investment up front, but that investment pays off in reduced time spent grading or explaining assignment criteria down the road.
Reasons for Using Rubrics
Research indicates that rubrics:
- Increase transparency and consistency in grading.
  - Rubrics can help normalize the work of multiple graders, e.g., across different sections of a single course or in large lecture courses where TAs manage labs or discussion groups.
- Increase the efficiency of grading.
  - Well-crafted rubrics can reduce the time that faculty spend grading assignments.
  - Timely feedback has a positive impact on the learning process.
- Support formative assessment.
  - When coupled with other forms of feedback (e.g., brief, individualized comments), rubrics show students how to improve.
- Enhance the quality of self- and peer-assessment.
  - By giving students a clear sense of what constitutes different levels of performance, rubrics can make self- and peer-assessments more meaningful and effective.
- Encourage students to think critically by linking assignments with learning objectives.
  - Students who complete an assignment with a rubric as a guide are better equipped to think critically about their work and to improve it.
- Reduce student concerns about subjectivity or arbitrariness in grading.
  - Rubrics establish, in great detail, what different levels of student work look like. If students have seen an assignment rubric in advance and know that they will be held accountable to it, instructors can defend grade decisions much more easily.
Tips for Creating Effective Rubrics
- To create performance descriptions for a new rubric, first rank a set of student responses to an assignment from best to mediocre to worst. Read back through the assignments in that order and record the characteristics that define student work at each of the three levels. Use your notes to craft the performance descriptions for each criterion of your new rubric.
- Alternatively, start by drafting your high and low performance descriptions for each criterion, then fill in the mid-range descriptions.
- Use the language of your assignment prompt in your rubric.
- Consider rubric language carefully: how do you encapsulate the range of student responses that could realistically fall in a given cell?
- To create properly scaled language that anticipates and captures as many student responses as possible:
- Use “and/or” statements and “may”
- E.g., “Introduction and/or conclusion handled well but may leave some points unaddressed;” “Sources may be improperly cited or may be missing”
- Work your adverbs and adjectives, e.g.:
- Completely Effective, Reasonably Effective, Ineffective
- Superb, Strong, Acceptable, Weak
- Compelling, Reasonable, Basic
- Advanced, Intermediate, Novice
- Proficient, Not Yet Proficient, Beginning
- Outstanding, Very Good, Good, Basic, Unsatisfactory
- Exemplary, Proficient, Competent, Developing, Beginning
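Since a rubric is essentially a table mapping criteria and performance levels to points and descriptions, it can help to prototype one as a simple data structure before formatting the handout. Below is a minimal sketch; all criterion names, level labels, point values, and descriptions are hypothetical examples:

```python
# Sketch of a rubric as a dictionary: each criterion maps performance
# levels to (points, description). All names and point values here are
# hypothetical examples, not a recommended scale.
rubric = {
    "Thesis": {
        "Exemplary":  (4, "Clear, arguable thesis that frames the whole essay."),
        "Proficient": (3, "Clear thesis, but it may leave some points unaddressed."),
        "Developing": (2, "Thesis present but vague and/or hard to locate."),
        "Beginning":  (1, "No identifiable thesis."),
    },
    "Sources": {
        "Exemplary":  (4, "Sources well chosen and properly cited."),
        "Proficient": (3, "Sources adequate; citations may have minor errors."),
        "Developing": (2, "Sources may be improperly cited or may be missing."),
        "Beginning":  (1, "No sources."),
    },
}

def score(selections):
    """Total the points for one student's selected level per criterion."""
    return sum(rubric[criterion][level][0]
               for criterion, level in selections.items())

# Example: one student's marked rubric.
print(score({"Thesis": "Proficient", "Sources": "Exemplary"}))  # 7
```

Prototyping this way makes it easy to check that point totals and level labels stay consistent across criteria before the rubric reaches students.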
Tips for Testing and Revising Rubrics
- Score sample assignments without a rubric and then with one. Compare the results. Ask a colleague to use your rubric to do the same.
- Ask a colleague to use your rubric to score student work you've already scored with the rubric and then compare results.
- Get your colleagues' feedback on the alignment of your rubric's grading criteria with your assignment and course-level learning objectives.
- Discuss your rubrics with your students and determine what they do and do not like or understand about them.
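The grader-comparison tips above can be made concrete with a quick calculation of exact agreement between two graders' criterion-level scores; the score data below are hypothetical:

```python
# Compare two graders' rubric scores for the same set of papers.
# Each list holds one criterion-level score per paper; the data are
# hypothetical examples.
grader_a = [4, 3, 3, 2, 4, 1, 3, 2]
grader_b = [4, 3, 2, 2, 4, 2, 3, 2]

matches = sum(a == b for a, b in zip(grader_a, grader_b))
agreement = matches / len(grader_a)
print(f"Exact agreement: {agreement:.0%}")  # Exact agreement: 75%
```

Exact agreement is the simplest possible measure; more formal statistics such as Cohen's kappa also account for agreement expected by chance.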
Tips for Using Rubrics
- Create a generic rubric template that you can modify for specific assignments.
- Keep the rubric to one page if at all possible. Give the rubric a descriptive title that clearly links it to the assignment prompt and/or digital grade book.
- Give the rubric to students in advance (i.e., with the related assignment prompt) and discuss it with them. Explain the purpose of the rubric, and require students to use it for self-assessment and to reflect on their process.
- Allow students to score example work with the rubric before attempting actual peer- or self-review. Discuss with the students how the example work correlates to the competency levels on the rubric.
- Consider engaging your students in active-learning, rubric-development exercises. Have them help you identify relevant assignment components or develop drafts of your performance descriptions.
- When returning work to students, highlight only those portions of the rubric text that are relevant.
- Couple rubrics with other measures or forms of feedback. Giving brief additional feedback that responds holistically and/or subjectively to student work is a good way to support formative assessment.
- Include relevant learning objectives on your rubrics and/or related assignment prompts.
- To document trends in your teaching, keep copies of rubrics that you return to students and review them later on. Analyzing groups of graded rubrics over time can give you a sense of what might be weak in your teaching and what you need to focus on in the future.
- Be aware that Canvas has a built-in rubric tool.
- iRubric can create rubrics to support classes in OnCourse.
- Rubrics resource page from the Eberly Center at Carnegie Mellon University (includes several discipline-specific examples):
- Sample Rubrics from the Association for the Assessment of Learning in Higher Education:
- Association of American Colleges and Universities VALUE (Valid Assessment of Learning in Undergraduate Education) Rubrics:
- Holistic Essay-Grading Rubrics at the University of Georgia, Athens:
- Sample Essay Grading Rubric:
- Rubric Generators:
- Quality Matters Rubric for Assessing University-Level Online and Blended Courses:
- iRubric Tool:
- Canvas Instructors' Guide for Rubrics:
Barkley, E.F., Cross, K.P., and Major, C.H. Collaborative Learning Techniques: A Handbook for College Faculty. San Francisco, CA: Jossey-Bass, 2005.
Barney, Sebastian, et al. “Improving Students with Rubric-Based Self-Assessment and Oral Feedback.” IEEE Transactions on Education 55, no. 3 (August 2012): 319-25.
Besterfield-Sacre, Mary, et al. “Scoring Concept Maps: An Integrated Rubric for Assessing Engineering Education.” Journal of Engineering Education 93, no. 2 (2004): 105-15.
Broad, Bob. What We Really Value: Beyond Rubrics in Teaching and Assessing Writing. Logan, UT: Utah State University Press, 2003.
Huot, Brian. (Re)Articulating Writing Assessment for Teaching and Learning. Logan, UT: Utah State University Press, 2002.
Howell, Rebecca J. “Exploring the Impact of Grading Rubrics on Academic Performance: Findings from a Quasi-Experimental, Pre-Post Evaluation.” Journal on Excellence in College Teaching 22, no. 2 (2011): 31-49.
Jonsson, Anders and Gunilla Svingby. “The Use of Scoring Rubrics: Reliability, Validity, and Educational Consequences.” Educational Research Review 2 (2007): 130-44.
Kishbaugh, Tara L.S., et al. “Measuring Beyond Content: A Rubric Bank for Assessing Skills in Authentic Research Assignments in the Sciences.” Chemistry Education Research and Practice 13 (2012): 268-76.
Leist, Cathy, et al. “The Effects of Using a Critical Thinking Scoring Rubric to Assess Undergraduate Students’ Reading Skills.” Journal of College Reading and Learning 43, no. 1 (Fall 2012): 31-58.
Livingston, Michael and Lisa Storm Fink. “The Infamy of Grading Rubrics.” English Journal, High School Edition 102, no. 2 (Nov. 2012): 108-13.
Stevens, Dannelle D. and Antonia J. Levi. Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback, and Promote Student Learning. Sterling, VA: Stylus Publishing, 2005.
Wilson, Maja. Rethinking Rubrics in Writing Assessment. Portsmouth, NH: Heinemann, 2006.
Authored by James Gregory (September, 2014)
Updated by James Gregory (September, 2015)
Updated by James Gregory (February, 2016)