Monitoring Implementation
Ed Direction Data Fellows: Asynchronous Module
June 2022
Welcome to the Monitoring Implementation Optional Asynchronous Module.
You will work through this module by scrolling through this learning space. To expand documents and slide decks that are included, you can click on the gray arrow at the top right corner of each item.
Feel free to focus on the pieces of this module that are most relevant to your topics of interest.
Please complete the Exit Ticket and Module Completion Form at the end of the module. We will use your submission to track completion.
Please contact datafellows@eddirection.org if you need assistance.
Click on the button to the left to open a Note Catcher, which is mirrored to follow the content as it is presented on the Learning Space. As you navigate through this module, you are welcome to use this optional tool to capture your notes.
Refer to your note catcher each time you see this icon.
Session Outcome: This optional asynchronous module provides additional guidance around best practices for monitoring implementation of your LEA’s RSSP plan.
Success Criteria: Participants will be able to apply their understanding of the following as they serve as Data Fellows:
The rationale for measuring implementation alongside student performance
Methods for measuring implementation
Methods for compiling implementation data
Where We’ve Been, Where We’re Headed
This module builds on key concepts introduced in Webinar 7: Measurement, Analysis, and Implementation Planning. Specifically, it provides deeper support for Measurement Plans and Implementation Plans. It will also make explicit connections to the Switch framework, covered in the Change Management asynchronous module, as a tool for investing people in collecting implementation data. This module will also provide many examples and templates of tools that will help you gather the data needed to monitor your implementation goals.
Data informs our understanding of student learning. It can also show us our progress toward implementing best practices within schools. By assessing the current state of implementation through data collected as part of the RSSP work, teams can analyze progress and identify appropriate next steps.
It’s important to collect information on both impact (student performance measures) and adult actions (implementation measures) at regular points along the way so that you’re always aware of which steps are occurring as envisioned, which steps may not be occurring, whether these steps seem to be working, and where changes need to be made.
Because they are more readily available, implementation data are sometimes called leading indicators. By contrast, impact data are considered lagging indicators. These terms mean exactly what they sound like. Leading indicators are available early and often as evidence of whether we are on track to achieve goals. Lagging indicators lag behind.
For example, a common source of impact data is a formal interim assessment given in the middle of the year. While this is important data, you likely don’t want to wait until half the year is gone to gauge progress. All along the way, you could be looking at implementation measures (such as whether teachers are consistently providing the selected interventions). This provides immediate information that allows for ongoing reflection and course correction related to your plan.
In addition to its timeliness, implementation data is critical for assessing the correlation between your strategy and changes in student performance data. Say math scores do improve next year. Many factors may have contributed to this, including some that are not explicit in the RSSP plan. Having detailed data about implementation of the plan is vital to understanding whether the improvement can be connected to it.
Pause and Reflect: Use the following questions to reflect on the current state of your LEA’s RSSP plan. If it’s helpful to you, use the space in your note catcher to jot your thoughts.
What is the compelling rationale for including implementation monitoring in your plan? Draw upon the reasons above as well as your specific context.
To what extent does your plan already include implementation monitoring?
Will you need to make a case for why this needs to be included/increased with other team members/stakeholders?
What resistance and barriers do you anticipate?
Inevitably, your district’s current data culture will play a significant role in how you respond to these questions. If your district is unaccustomed to gathering and using data to drive decisions, you will need to do more work around rationale and investment (revisit the Data Culture webinar if needed). If your district already has a strong data culture, implementation monitoring may feel like a natural extension. Either way, how you go about setting up implementation monitoring is a great opportunity to exercise leadership and enhance your district’s data culture.
The asynchronous module on the Switch framework is a helpful tool for the change management aspects of implementation monitoring. Remember, the Switch framework is based on this metaphor:
The rider is the rational, intellectual part of us. When facing a change, our brains demand specific, compelling reasons for where we are heading and why we should go there.
The elephant is the emotional part of us. Change can bring up many negative feelings, including anxiety and frustration. These need to be accounted for so that we feel optimistic about, and invested in, where we are going.
The path is the road we’ll take to get from current reality to where we want to be. As leaders, we want to make the route as smooth, clear, and easy as possible for the rider and elephant.
Pause and Reflect: When it comes to implementation monitoring in your district, what may come up for the rider, the elephant, and the path? In the table below, we’ve given you an example of each. Add more examples from your own district to the final column on your note catcher.
As we move through the module, you’ll have additional chances to anticipate barriers and consider how to overcome them.
Part 1 of this module dealt with the adaptive aspects of implementation monitoring. Just as crucial are the technical and logistical considerations, which Part 2 addresses. As with all efforts to drive improvement, it’s vital that you are intentional about all aspects of implementation progress monitoring. You’ll need to think through in great detail your strategy for what you will measure and how you will measure it.
What to Measure: Common Implementation Sources
Typically, implementation can be broken down into two substantive categories: CAPACITY and FIDELITY.
CAPACITY measures help your team understand whether teachers have the ability (or capacity) to implement the districtwide priority for learning acceleration identified by the RSSP team.
FIDELITY measures should help your team understand whether teachers are implementing the priority tools/strategies for learning acceleration identified by the RSSP team in the way intended (with fidelity).
How to Measure:
As with all data sets, both QUANTITATIVE and QUALITATIVE measures come with benefits and drawbacks.
QUANTITATIVE data is numerical. It can be collected objectively, allows for easy comparisons, and can be translated to visual displays such as tables, charts, and graphs.
QUALITATIVE data is descriptive. It affords deeper nuance and captures trends that don’t lend themselves to numerical evidence.
Both quantitative and qualitative measures of capacity and fidelity can be helpful. However, meaningful quantitative measures will be the easiest to track, compare, and visualize. This is particularly true as datasets become large, include multiple sites, extend over time, etc. (as will be the case with nearly all RSSP plans).
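As a hypothetical illustration of why quantitative measures are easy to track and compare, the short Python sketch below tallies invented snapshot-observation fidelity scores by campus. The campus names, scores, and the function name are all assumptions made for this example, not part of any Ed Direction tool.

```python
from collections import defaultdict

# Hypothetical snapshot-observation records: (campus, fidelity score 0-2).
# All campuses and values below are invented for illustration only.
observations = [
    ("North Elementary", 2), ("North Elementary", 1),
    ("South Elementary", 0), ("South Elementary", 1),
    ("South Elementary", 2),
]

def average_fidelity_by_campus(records):
    """Average fidelity score per campus, ready for comparison or charting."""
    by_campus = defaultdict(list)
    for campus, score in records:
        by_campus[campus].append(score)
    return {campus: sum(s) / len(s) for campus, s in by_campus.items()}

print(average_fidelity_by_campus(observations))
# {'North Elementary': 1.5, 'South Elementary': 1.0}
```

Because the result is a simple table of numbers, it can feed directly into charts or site-to-site comparisons, which is harder to do with qualitative notes.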
As you consider how to most effectively monitor implementation, it’s important to consider multiple factors related to feasibility and utility:
1. What data already exists? → If you haven’t already, do an inventory of what fidelity and capacity data your district already collects. What surveys are already administered? How is PD/coaching data collected? As much as possible, align your implementation metrics to these tools.
Pause and Reflect: Do you have a comprehensive knowledge of the data already collected in your district? If not, how will you get this information? How will relying on existing data positively impact the rider, elephant, and path?
2. What new work would you and others have to take on to get this data? → Once you know what data exists, you must next consider what new processes would be required for particular implementation metrics you would like to pursue. Can it be done? Is it worth what it would take in terms of design, training, and maintenance of new tools? Generally, some amount of new work is appropriate, but you need to carefully weigh the overall scope of what the district takes on and prioritize strategically.
Pause and Reflect: What capacity for new data collection does your district realistically possess? Where is this capacity best deployed?
Pause and Reflect: As you weigh the pros and cons of various metrics, what seems most strategic for your RSSP plan?
Now that you have an understanding of the adaptive and technical aspects of implementation planning, Part 3 provides you with many examples of implementation measurement tools. In their current state, these may not meet the exact needs of your district. However, they can be modified to reflect your strategy, goals, and district context.
Surveys can provide the School Transformation Team with both qualitative and quantitative data about the implementation of evidence-based instructional strategies. In order to create a meaningful survey, teams must take key steps to ensure that the survey will provide the data needed to identify possible supports for staff.
To modify this resource, align the questions to your capacity and/or fidelity goals. Consider a mix of quantitative and qualitative questions. Add demographic data such as grade level, campus, and years of experience if this will help you monitor your goals.
This tool identifies overarching components of effective meetings regardless of the specific purpose/content of the meeting. For each component, it provides a continuum of effectiveness with a brief description. Observers can indicate how well developed each component is by matching it to the continuum descriptors.
To modify this resource, supplement or replace provided rows with meeting components more aligned to your district’s culture and priorities.
The tool allows the observer to score various meeting look-fors as 0, 1, or 2 based on the degree to which they are present. It provides an overall score, which can be useful for comparisons.
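A minimal sketch of how such 0-1-2 scoring might be compiled into an overall score. The look-for names below are invented placeholders; the actual tool’s categories and scoring sheet will differ.

```python
def overall_meeting_score(lookfor_scores):
    """Sum 0/1/2 look-for scores into an overall meeting score.

    Rejects any value outside the 0-2 rubric so data-entry errors
    surface immediately instead of skewing comparisons.
    """
    for name, score in lookfor_scores.items():
        if score not in (0, 1, 2):
            raise ValueError(f"{name}: score must be 0, 1, or 2, got {score}")
    return sum(lookfor_scores.values())

# Hypothetical look-fors for one observed meeting (illustrative only).
scores = {"clear agenda": 2, "norms followed": 1, "action steps recorded": 2}
print(overall_meeting_score(scores))  # 5
```

Summing to a single number is what makes the tool useful for comparing meetings across teams or over time.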
To modify this resource, edit rows to reflect the priorities for your district’s meetings. Use the format to build a Look-For document for other settings.
Using a predetermined list of “Look Fors,” School Transformation Team members can collect data through a series of short (15-20 min) observations of teachers. Data collected through these snapshot observations can be used to identify trends in how evidence-based instructional strategies are being implemented schoolwide and can reveal “bright spots” in the school where the strategy is being implemented effectively.
To modify this resource, switch categories and look-fors to match the instructional strategy being observed.
This tool will auto populate a variety of charts and graphs based on the data entered.
To modify this resource, align this tool to the Snapshot Observation Tool you create for your district.
This tool provides a place to capture ongoing information related to individual student participation in tutoring sessions.
To modify this resource, use for attendance in whatever intervention blocks you are implementing. Adjust columns to reflect the demographic information and data you will need.
This tool provides a place to capture ongoing information related to class/group participation in tutoring sessions.
To modify this resource, use for attendance in whatever intervention blocks you are implementing. Adjust columns to reflect the demographic information and data you will need.
Effective school leaders plan for the Stages of Implementation (see Column 1) that any program or initiative must go through in order to be successful. As they move through these stages, leaders also use the Levels of Professional Learning Evaluation (Column 2) and the related questions (Column 3) and tools (Column 4) to measure progress and make meaningful decisions about school transformation efforts.
A template is not available for this tool, as it is based on specific research. Please use it “as is” if it is useful to you.
It’s time to put together your learning from this asynchronous session by applying it to your own RSSP plan.
Step 1: Read over the potential applications below.
Step 2: Review your RSSP’s goal and measurement plan.
Step 3: Depending on the current state of your RSSP’s plan, complete any of the potential applications below not already well-developed in your plan.
Based on your district’s strategies and goals, identify the most critical aspects of implementation to monitor. Turn these into specific goals that can be measured.
Consider the data already being collected in your district. What could you utilize to help you monitor implementation?
If you do see a need to develop a new tool/process to monitor implementation, which resource from Part 3 is the best aligned? Make a copy and modify it for your own district.
Identify stakeholders you will need to engage with in order to actually collect this data. How will you appeal to their rider, elephant, and/or path in order to invest them in this data collection?
Congratulations on completing the Progress Monitoring module. Please complete the Exit Ticket and Module Completion forms by clicking on the links below. We will use the information you submit to track your completion.