Ongoing Monitoring

Decision-making is a ubiquitous part of the day-to-day operations of a school. Educators constantly make decisions regarding content, instructional strategies, school improvement goals, and action steps, to name just a few areas. When these decisions are made by a leadership team using a standardized decision-making process and informed by data, they are more likely to lead to effective action steps targeted at solving specific problems (Newton, Horner, Algozzine, Todd, & Algozzine, 2009). This chapter explores how PBIS leadership teams use data to monitor progress, inform decisions, and establish cycles of continuous improvement.

Many of the concepts described here also apply to the use of academic data to make decisions aimed at improving academic outcomes. Academic success directly affects behavior, and vice versa. As such, it is recommended that teams consider integrating academic and behavior data when problem solving, whether the problem is academic or behavioral (McIntosh & Goodman, 2016).

What Data Are Most Important?

As discussed in the previous section, school leadership teams need a systematic method for determining which data lead to the most effective decision-making and are therefore most worth collecting. Teams should consider strategically eliminating the collection of data that do not lead to improved outcomes for students. They should also bear in mind that even data that could lead to improved outcomes are worthless if not used to inform decisions. Therefore, teams should regularly review the data the school collects.

As mentioned above, the data that are collected, analyzed, and used for decision-making should be directly related to achieving important school goals. Data can be considered within two different frames: 1) outcome data (effect or impact) vs. input data (cause or fidelity); and 2) formative vs. summative data. Outcome and input data are discussed first.

Which data to measure depends upon the desired outcomes for students. The outcome (effect) data selected should directly measure what the team is hoping to achieve. In addition, desired outcomes should be observable and measurable. For example, if a team desires to improve student attendance, it needs to collect, track, and analyze student attendance data.

Similarly, input data are measures of the fidelity with which the systems and practices intended to achieve the desired outcomes are implemented. In other words, input data measure the adult behaviors that are expected to produce the desired outcomes. As with desired outcomes, the adult behaviors selected as input measures should be directly observable and measurable. Returning to the example of a school attempting to improve student attendance, suppose the team decides to encourage attendance by recognizing students who attend school on a daily basis. To do this, teachers give specific positive feedback, along with a special attendance ticket, to every student who is in attendance on a particular day. Students write their names on the tickets, and on Fridays all of the tickets are collected, counted, and placed in a drawing. The weekly count of tickets is used as a measure of the degree to which the staff is implementing the plan.
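Where a team wants to turn this kind of input measure into a number it can track from week to week, the ticket count can be compared to recorded attendance. The sketch below is a minimal illustration using entirely hypothetical counts; it simply divides the tickets collected by the number of student attendances recorded for the same week to estimate how consistently staff handed out tickets.

```python
# Minimal sketch: estimating implementation fidelity from weekly ticket counts.
# The numbers below are hypothetical; a real team would pull them from its own
# attendance records and the Friday ticket drawing.

weekly_tickets_collected = 412     # tickets turned in for the Friday drawing
weekly_student_attendance = 480    # total daily attendances recorded that week

# If every student present each day received a ticket, the two counts would match,
# so the ratio serves as a rough measure of how consistently staff gave tickets.
fidelity_rate = weekly_tickets_collected / weekly_student_attendance
print(f"Estimated implementation fidelity: {fidelity_rate:.0%}")  # -> 86%
```

A team could chart this percentage alongside its attendance outcome data to see whether dips in attendance correspond to dips in implementation.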

Data Analysis Cycles

Highly effective PBIS leadership teams use cycles of data collection and analysis that align with their team meeting schedule (Hamilton et al., 2009; Means, Chen, DeBarger, & Padilla, 2011; Newton, Horner, Algozzine, Todd, & Algozzine, 2009), with the times when the data become available, and with the intended use of the data. These regular cycles use specific data sets to inform decision-making (Horner, Sugai, & Todd, 2001). Cycles typically fall into two categories: 1) monthly or semi-monthly, and 2) annual or semi-annual.

Monthly or Semi-Monthly Cycles

PBIS Leadership Teams often meet on a monthly basis throughout the school year. This is the optimal time to monitor progress toward the desired outcomes and the implementation of the action plan. The team should include a review of the monthly Big 5 Data report as part of the standing agenda for monthly PBIS leadership team meetings (see below). In addition, the following information should be available for review, as needed:

Other Outcomes Data, as Appropriate

  • Staff-managed (minor) behaviors;
  • In-school suspensions (ISS);
  • Out-of-school suspensions (OSS);
  • Attendance;
  • Tardies;
  • Academic data:
    • Common formative assessments;
    • Benchmark assessments.

Input (Implementation Fidelity)

  • Evidence of lessons taught (e.g., staff lesson sign-off forms; walkthrough data);
  • Evidence of reinforcement of appropriate behavior (e.g., count of tangibles given; walkthrough data);
  • Evidence of consistent correction of inappropriate behaviors (e.g., walkthrough data; staff implementation fidelity ratings);
  • SW-PBS Tiered Fidelity Inventory (TFI);
  • Artifacts identified in the action plan as evidence of completion of action steps;
  • School-generated surveys.
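Because the Big 5 report reviewed at monthly meetings is essentially a set of counts over the month's office discipline referrals (average referrals per day per month, plus referrals by problem behavior, by location, by time of day, and by student), a team's data coach can rebuild it from a raw referral export if the usual report is unavailable. The sketch below is a minimal, hypothetical illustration; the field names and sample records are invented and do not reflect the export format of any particular data system.

```python
# Minimal sketch of a monthly "Big 5" style summary built from an office
# discipline referral (ODR) log. Sample records and field names are hypothetical.
from collections import Counter

referrals = [
    {"date": "2023-10-03", "behavior": "disruption", "location": "classroom", "time": "10:15", "student": "S01"},
    {"date": "2023-10-03", "behavior": "defiance",   "location": "hallway",   "time": "13:40", "student": "S02"},
    {"date": "2023-10-12", "behavior": "disruption", "location": "cafeteria", "time": "12:05", "student": "S01"},
]
school_days_in_month = 21  # hypothetical number of instructional days this month

# 1) Average referrals per day for the month
print(f"Referrals per day: {len(referrals) / school_days_in_month:.2f}")

# 2-5) Counts by problem behavior, location, time of day, and student
for field in ("behavior", "location", "time", "student"):
    print(field, Counter(r[field] for r in referrals).most_common())
```

In practice, most teams will rely on the reports generated by their data system; the point of the sketch is simply that each Big 5 graph is a straightforward count over the same referral log.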

Annual or Semi-Annual Cycles

At a minimum, the team should conduct an annual review of all data that can illustrate current status and trends, as well as provide cause for reflection, celebration, and re-commitment. In addition, many teams take a quick “state of the school” assessment at midyear (the semester or trimester break). Note that some monthly data sources reappear in the mid-year and year-end reviews; these reports are typically cumulative rather than monthly.

In addition, some data is typically only available once or twice per school year. This data provides “big picture” information regarding the state of the school. To maximize the accuracy and usefulness of this data, it should be reviewed as it becomes available.

Data available for periodic review includes the results from the following PBIS Assessments:

  • School Safety Survey (SSS) – taken in the fall of each year by all staff, students, and parents;
  • Self-Assessment Survey (SAS) – taken in the spring of each year by all staff;
  • School-wide Evaluation Tool (SET) – an external observation, typically conducted in late winter or early spring;
  • SW-PBS Tiered Fidelity Inventory (TFI) – taken in the spring by PBIS teams implementing and/or training at the Tier 2 and Tier 3 levels; teams new to Tier 2 or Tier 3 training also take the TFI in the fall for a baseline score.

Effective Systems to Collect, Monitor, Analyze, and Share Data

The PBIS/MTSS Leadership Team will need to ensure that the data are collected accurately and in a timely manner and that graphic reports are available when meetings are held (Horner, Sugai, & Todd, 2001). This requires the development of clear and efficient procedures and the assignment of roles and responsibilities. Additionally, professional development may be needed for some or all staff members who participate in survey completion, data collection, data entry, report generation, and data analysis. Time spent establishing efficient and effective systems to collect, enter, report, and analyze data will yield accurate data reports that facilitate decision-making.

In creating effective systems for data collection, entry, reporting, and analysis, the SW-PBS Leadership Team will need to consider the following questions for each data source that will be used in decision-making:

  • Who enters data/completes the survey/tool?
  • When is the survey/tool completed?
  • Who prepares graphic summaries/reports and when?
  • Who analyzes the data from the survey/tool?
  • Who suggests possible action steps?
  • Who has authority to decide on which action steps to take?
  • How are data summaries and resulting action steps shared with stakeholders?

When developing systems to collect, monitor, analyze, and communicate data, particular attention must be paid to clarifying who informs the decision-making process and who makes the final decision (Garmston & Wellman, 1999; Newton, Horner, Algozzine, Todd, & Algozzine, 2009).

Resource: Monthly Individual Student Progress

Resource: Run Query Data from Aeries