Facilitating a Rubric Development Conversation

Why a facilitator?

Designating a facilitator for the rubric development discussion improves the quality of the conversation by ensuring balanced participation, creating a safe space, and managing time and progress. A facilitator also fosters effective communication, promotes collaboration, and helps the team reach consensus when identifying rubric criteria and rubric-level descriptors.

Finding Common Ground for the Academic Focus and Creating Criteria for the Rubric

Scenario 1: All members of the group teach the same content area and grade band

This scenario offers the easiest path to common ground. Narrow the choices of standards to the content area everyone on the team has in common. If you are an elementary group, looking through the ELA anchor standards or the math practice standards is a good place to start.

Some questions that may help the team find common ground:


If the team chooses a focus area based on a standard, read the descriptors for that standard. Look for phrases that describe skills that are observable, measurable, and likely to change in desirable ways. Think through several units to be certain that the activities and practice students are likely to do will give ample opportunity to gather observational or student work evidence, sufficient to assign rubric scores across the criteria. It is helpful for each teacher to share some ideas of observable or assessable examples for each rubric criterion.

Some examples of academic focuses and criteria are located below.

Scenario 2: All members of the group teach the same content area but different grade bands

In this scenario, members need to find commonalities across grade levels. Narrow the choices of standards to the content area that all team members have in common. If you are an elementary group, looking through the ELA anchor standards or the math practice standards is a good place to start.

To help identify commonalities, first read the math practice and/or ELA anchor standards.

Some questions that may help the team find common ground:


If the team chooses a focus area based on a standard, read the descriptors for that standard. Look for phrases that describe skills that are observable, measurable, and likely to change in desirable ways. Think through several units to be certain that the activities and practice students are likely to do will give ample opportunity to gather observational or student work evidence, sufficient to assign rubric scores across the criteria. It is helpful for each teacher to share some ideas of observable or assessable examples for each rubric criterion.

Some examples of academic focuses and criteria are located below.

Scenario 3: Members of the group teach diverse content areas and either similar or different grade bands

This scenario requires more discussion than the other two, since team members must find commonalities across content areas and potentially across grade levels as well. A good place to start is to look through the ELA anchor standards or math practice standards to find one or two examples that might work for everyone present.

Some questions that may help the team find common ground:


If the team chooses a focus area based on a standard, read the descriptors for that standard. Look for phrases that describe skills that are observable, measurable, and likely to change in desirable ways. Think through several units to be certain that the activities and practice students are likely to do will give ample opportunity to gather observational or student work evidence, sufficient to assign rubric scores across the criteria. It is helpful for each teacher to share some ideas of observable or assessable examples for each rubric criterion.

For example, if an art teacher is using the same data tool as a math teacher, an academic focus of Math Practice 8: Look for and express regularity in repeated reasoning (Wisconsin Standards for Mathematics, p. 23) may be a stretch to incorporate in a meaningful way. On the other hand, the art teacher may find Math Practice 7: Look for and make use of structure (Wisconsin Standards for Mathematics, p. 23) applicable, because many of the math practice descriptors could be observed as rubric criteria in the art classroom, e.g., students look closely to discern a pattern or structure, recognize the significance of an existing line in a geometric figure, or shift perspective.

Some examples of academic focuses and criteria are located below.

Examples of Academic Focuses and Criteria

ELA Example (from CCSS ELA Anchor Standards for Reading)

Focus: Key ideas and details

Criteria:

Makes logical inferences

Cites specific textual evidence

Determines central idea

Analyzes how individuals interact over the course of a text


Mathematics Example (from CCSS Math Standards for Mathematical Practice)

Focus: Make sense of problems and persevere in solving them.

Criteria:

Analyzes givens in a problem

Plans a solution pathway

Checks answers

Represents the problem (drawing, objects, or other ways)


Science Example (from NGSS Science and Engineering Practices)

Focus: Developing and using models

Criteria:

Develops a model to represent something in the natural or designed world

Compares models

Revises model during learning

Uses model to test cause and effect relationships

Facilitating Group Discussion: What Do the Engagement and Academic Criteria Look Like in Your Classroom?

Once the team members have decided on the academic criteria, each person should share some concrete examples of what the engagement and academic criteria look like in their own classroom. Use the following prompts to stimulate thinking and sharing.


Continue around the group until everyone has shared several examples.

You may have questions about the validity of the data and the inter-rater reliability of the scores. Use these answers to come to a common understanding about this issue. In addition, the video Indicator vs. Measure can help differentiate between a measure and an indicator. The data collected for data tool trials are indicators, not measures.