Classroom Assessment Techniques
Scoring Rubrics

(Screen 3 of 6)

Teaching and Learning Goals
Students learn to communicate about science in a variety of ways and, especially, improve their writing skills. The quality of students' reasoning and logic increases. Instructors gather a variety of data about students' understanding and performance.

Suggestions for Use
I design rubrics for the multiple forms of assessment I use in my courses: short writing samples, essays, poster displays, research papers, public hearing papers, oral presentations, weekly homework assignments, and concept maps. Each rubric stands on its own, but the general criteria in many rubrics are similar. For example, rubrics for written assignments share the same criteria for acceptable style and grammar: responses must address the question, and arguments must be presented in a logical order [compare Figures 1 and 3]. In contrast, a rubric designed to evaluate the mechanics of a poster display may include a checklist to guide the student in developing all of the sections of the poster. Each component of the poster should then have additional criteria for evaluation. For example: What are the criteria for the title of a poster? Is the title informative? Are specific key words used?

Example 3. Scoring Rubric for Essay Questions

Exemplary (10 pts)
  General Presentation:
  • Provides a clear and thorough introduction and background
  • Addresses the question
  • Presents arguments in a logical order
  • Uses acceptable style and grammar (no errors)
  Reasoning, Argumentation:
  • Demonstrates an accurate and complete understanding of the question
  • Uses several arguments and backs arguments with examples and data that support the conclusion

Quality (8 pts)
  General Presentation:
  • Combination of the above traits, but less consistently represented (1-2 errors)
  Reasoning, Argumentation:
  • Same as above but less thorough, still accurate
  • Uses only one argument and example that supports the conclusion

Adequate (6 pts)
  General Presentation:
  • Does not address the question explicitly, though does so tangentially
  • States a somewhat relevant argument
  • Presents some arguments in a logical order
  • Uses adequate style and grammar (more than 2 errors)
  Reasoning, Argumentation:
  • Demonstrates minimal understanding of the question, still accurate
  • Uses a small subset of possible ideas for support of the argument

Needs Improvement (4 pts)
  General Presentation:
  • Does not address the question
  • States no relevant arguments
  • Is not clearly or logically organized
  • Fails to use acceptable style and grammar
  Reasoning, Argumentation:
  • Does not demonstrate understanding of the question, inaccurate
  • Does not provide evidence to support response to the question

No Answer (0 pts)
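The levels and point values in Example 3 amount to a simple lookup table. As a minimal sketch (assuming Python; the names `ESSAY_RUBRIC`, `score_essay`, and `score_exam` are illustrative, not part of the original rubric):

```python
# Point values taken from Example 3; everything else here is a hypothetical
# encoding for illustration only.
ESSAY_RUBRIC = {
    "Exemplary": 10,
    "Quality": 8,
    "Adequate": 6,
    "Needs Improvement": 4,
    "No Answer": 0,
}

def score_essay(level: str) -> int:
    """Return the point value for one essay question's level of achievement."""
    if level not in ESSAY_RUBRIC:
        raise ValueError(f"Unknown level of achievement: {level}")
    return ESSAY_RUBRIC[level]

def score_exam(levels: list[str]) -> int:
    """Total score for an exam made up of several essay questions."""
    return sum(score_essay(level) for level in levels)
```

For example, an exam with one Exemplary and one Adequate essay would total `score_exam(["Exemplary", "Adequate"])`, or 16 points.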

    Step-by-Step Instructions
    There are many routes to developing a useful scoring rubric; however, all of them involve the following five steps.

    To provide a useful example of how these steps "play out" in a real-world context, I will describe how I developed rubrics for my own introductory biology and ecology courses.

    1. I developed the goals for my course and daily class meetings.
    Keep in mind that assessment tasks must be linked to student learning goals and outcomes, so writing goals is the first step. Here are examples of stems and sample goals from introductory ecology or biology courses:

    Students will be able to demonstrate their ability to:
    • utilize science as a process
    • communicate an understanding of and links among biological principles
    • write about, criticize and analyze concepts in biology
    • use the process of scientific inquiry to think creatively and formulate questions about real-world problems
    • apply content knowledge in the resolution of real-world problems
    • reason logically and critically to evaluate information
    • argue science persuasively (in both written and oral format)
    • illustrate the relevance of ecology to their lives by applying ecological knowledge in the resolution of real-world problems

    2. I selected the assessment tasks:

    What type of assessment will provide me with data about students' achievement of each of these goals?

    Based on the goals for my courses, I selected different forms of extended response, both written and oral, and concept maps to gather the data that would convince me that my students had achieved the goals. The kinds of questions I asked students and the types of projects I assigned were designed to promote students' reasoning. For example, here are the first three goals listed above, each paired with types of assessment that could be used to gather the desired data:

    • Utilize science as a process -- performance assessment, e.g., students conduct a scientific investigation.
    • Communicate an understanding of and links among biological principles -- e.g., concept maps, Vee diagrams, extended written responses (Novak and Gowin 1984, Novak 1998).
    • Write about, criticize and analyze concepts in biology -- written critical analysis of articles and papers.

    3. I developed a set of performance standards:

    The performance standards I used in my introductory biology course on "logical reasoning" and "critically evaluating information" were different from the performance standards I developed for my upper-division biology majors. The difference was based on the developmental stages of the students and their experience in college-level science courses (Magolda 1992, King and Kitchener 1994).

    4. I differentiated performances based on criteria:

    Examine the rubric for Quizzes and Homework. The criteria for responses fall into two major categories: general approach and comprehension. Although these two categories are not discrete, as indicated by the dotted line between them, students can see all of the itemized components of an exemplary answer. These categories can be divided further. For example, comprehension could be divided into content knowledge, conceptual understanding, and reasoning and critical thinking skills (Freeman 1994). Freeman (1994) includes communication skills as a category in rubrics. Essentially, my rubrics cover the same categories; the difference is in the number of columns used.

    Notice that, when it was possible to quantify the categories, I did so. For example, the criterion for acceptable style and grammar in an exemplary answer is based on no errors.

    Our ability to differentiate among criteria is critical to the effectiveness of the scoring rubric; words like "good" are too subjective. The criteria must be designed so that you and your students can discriminate among the qualities you consider important.

    When we evaluate students' extended responses, we tend not to score them point by point; however, by elaborating on the criteria that comprise the different levels of performance, we provide students substantive guidance about what should be included in their extended responses.

    5. I assigned ratings (or weights) to the categories.

  • Exemplary (5 pts) - highest category of college-level work
  • Adequate (4 or 3 pts) - acceptable college-level work
  • Needs Improvement (3 or 1 pts) - not yet college-level work
  • No Answer (0 pts)

    Point values: Do you assign points on a 5, 3, 1 scale or a 5, 4, 3 scale? I have tried both. I chose 3 as the middle, or adequate, score. Most student responses in this category can readily be improved through group work, practice, effort, and instruction. Therefore, in an effort to develop students' self-efficacy and to promote their achievement of higher standards, I chose the 5, 4, 3 point scheme.

    On a five-point scale, the data did not enable me to discriminate between two consecutive points, such as 3 and 4, when evaluating a response. Three categories, however, were readily distinguishable by my students and me; therefore, little if any time was spent "arguing" for points. The criteria for evaluation were clear and understood.
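The difference between the two point schemes discussed above can be made concrete with a small sketch (assuming Python; the names `SCALE_5_4_3`, `SCALE_5_3_1`, and `total` are hypothetical):

```python
# The two candidate point schemes from step 5; only the point values
# come from the text, the encoding itself is illustrative.
SCALE_5_4_3 = {"Exemplary": 5, "Adequate": 4, "Needs Improvement": 3, "No Answer": 0}
SCALE_5_3_1 = {"Exemplary": 5, "Adequate": 3, "Needs Improvement": 1, "No Answer": 0}

def total(ratings: list[str], scale: dict[str, int]) -> int:
    """Total score for a set of ratings under a given point scheme."""
    return sum(scale[r] for r in ratings)

ratings = ["Exemplary", "Adequate", "Needs Improvement"]
total(ratings, SCALE_5_4_3)  # 12: mid-range work loses little ground
total(ratings, SCALE_5_3_1)  # 9: the same work is penalized more steeply
```

The 5, 4, 3 scheme keeps adequate responses close to exemplary ones, which is consistent with the goal stated above of building students' self-efficacy rather than penalizing work that can readily be improved.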
