ARE WE USING EVIDENCE TO EFFECTIVELY ASSESS & ADAPT OUR PLANS?
Teams that use theories of change (which describe the expected elements of success) build coherence and a shared understanding amongst the team and other stakeholders or actors.
Teams that appropriately use evidence to adaptively manage their projects achieve greater conservation outcomes, as they are able to move forward iteratively in the face of uncertainty.
The Conservation Standards work in both data-rich and data-poor environments.
Measure the proportion of indicators with data that have been standardized, analyzed, and made accessible to relevant team members to inform decision making.
Proportion of project indicators with data being collected according to the Monitoring, Evaluation & Learning (MEL) plan
To assess both the quality and quantity of data being collected, evidence could be ranked or disaggregated according to type (see Salafsky et al. 2019).
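As a purely illustrative sketch of the calculation, the proportion could be tallied from a simple indicator register and disaggregated by evidence type. The record structure, field names, and example values below are assumptions for illustration, not a prescribed format.

    # Minimal sketch (Python): proportion of project indicators with data
    # being collected according to the MEL plan, disaggregated by type of
    # evidence (after Salafsky et al. 2019). Field names and values are
    # hypothetical examples only.
    from collections import defaultdict

    indicators = [
        {"name": "Forest cover", "data_collected": True, "evidence_type": "local data"},
        {"name": "Patrol effort", "data_collected": True, "evidence_type": "local data"},
        {"name": "Community attitudes", "data_collected": False, "evidence_type": "expert knowledge"},
    ]

    collected = sum(1 for i in indicators if i["data_collected"])
    proportion = collected / len(indicators) if indicators else 0.0
    print(f"Proportion of indicators with data collected: {proportion:.0%}")

    # Optional disaggregation by type of evidence
    by_type = defaultdict(lambda: [0, 0])  # evidence type -> [collected, total]
    for i in indicators:
        by_type[i["evidence_type"]][1] += 1
        if i["data_collected"]:
            by_type[i["evidence_type"]][0] += 1
    for etype, (n_collected, n_total) in by_type.items():
        print(f"  {etype}: {n_collected}/{n_total} indicators with data")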
Proportion of project indicators with data that are analyzed and accessible to the team
This indicator is a measure of whether relevant team members are able to access and review analyzed monitoring data to inform decision making.
Annual change in implementation progress
Annual change in implementation progress (see indicator in the Implementation tab). This indicator assesses whether the proportion of activities that are on-track or completed is increasing over time, compared to those that are delayed or obstructed.
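A minimal sketch of how this could be computed from annual snapshots of activity statuses follows; the status labels, snapshot format, and years are illustrative assumptions rather than a required structure.

    # Minimal sketch (Python): annual change in implementation progress,
    # i.e. the year-on-year change in the share of activities that are
    # on-track or completed (versus delayed or obstructed).
    ON_TRACK = {"on-track", "completed"}

    # Hypothetical yearly snapshots of activity statuses
    snapshots = {
        2022: ["on-track", "delayed", "completed", "obstructed"],
        2023: ["on-track", "on-track", "completed", "delayed"],
    }

    def progress(statuses):
        # Share of activities that are on-track or completed
        return sum(s in ON_TRACK for s in statuses) / len(statuses)

    years = sorted(snapshots)
    for prev, curr in zip(years, years[1:]):
        change = progress(snapshots[curr]) - progress(snapshots[prev])
        print(f"{prev} -> {curr}: {change:+.0%} change in activities on-track or completed")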
Degree to which documented evidence is used to inform decisions
Data for this indicator can be collected as binary data (e.g. analyzed data are referred to in project management meetings, OR they are not) or as categorical data using a tool such as the Conservation Audit Tool or the Conservation Standards Scorecard. Data can be disaggregated by the type of evidence (as defined in Salafsky et al. 2019; see below). Evidence is defined as the "relevant information used to assess one or more hypotheses related to a question of interest."
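One illustrative way to operationalize this is to score a decision log in both the binary and the disaggregated form; the log structure, decisions, and evidence types below are hypothetical assumptions.

    # Minimal sketch (Python): degree to which documented evidence is used
    # to inform decisions, in binary form and disaggregated by type of
    # evidence. The decision log is a hypothetical example only.
    decisions = [
        {"decision": "Shift patrols to dry season", "evidence_cited": True, "evidence_type": "local data"},
        {"decision": "Expand outreach to new village", "evidence_cited": False, "evidence_type": None},
        {"decision": "Drop ineffective signage strategy", "evidence_cited": True, "evidence_type": "systematic review"},
    ]

    # Binary form: is documented evidence referred to in management decisions at all?
    print("Evidence referred to in decisions:", any(d["evidence_cited"] for d in decisions))

    # Disaggregated form: share of decisions citing each type of evidence
    total = len(decisions)
    for etype in sorted({d["evidence_type"] for d in decisions if d["evidence_cited"]}):
        n = sum(1 for d in decisions if d["evidence_cited"] and d["evidence_type"] == etype)
        print(f"  {etype}: {n}/{total} decisions")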
Number of reflect and adapt sessions within the project's lifespan
This indicator is a measure of whether reflect & adapt sessions are taking place with all required stakeholders to review and adapt the theory of change, assess project progress, and address issues. It can be assessed as a count of the number of formal reflect and adapt sessions that have taken place in the project's lifespan (at a realistic frequency), as binary data (reflect and adapt sessions are taking place OR they are not), or as categorical data collected using a tool such as the Conservation Standards Scorecard.
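For illustration, the count could also be compared against an assumed "realistic frequency" for the project; the dates, reporting date, and expected annual frequency below are hypothetical.

    # Minimal sketch (Python): number of reflect & adapt sessions held over
    # the project's lifespan, compared with an assumed planned frequency.
    from datetime import date

    project_start = date(2021, 1, 1)
    reporting_date = date(2023, 12, 31)
    sessions = [date(2021, 6, 15), date(2022, 6, 20), date(2023, 7, 1)]
    expected_per_year = 1  # assumed "realistic frequency" for this project

    years_elapsed = max((reporting_date - project_start).days / 365.25, 1)
    expected_sessions = expected_per_year * years_elapsed

    print(f"Sessions held: {len(sessions)} (roughly {expected_sessions:.1f} expected)")
    print("Reflect & adapt sessions taking place (binary form):", len(sessions) > 0)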
Degree to which adaptations to the plan have been documented and described
Evidence, discussions, decisions, and changes to the strategic plan should be documented, and the strategic plan should be updated to ensure it remains relevant (including the action plan, work plan, operational plan, budget, and monitoring plan). Data pertaining to the documentation of results can be collected as binary data (e.g. results are being documented in an appropriate format, OR they are not) or as categorical data collected using a tool such as the Conservation Standards Scorecard.
Number of research outputs shared externally
Degree to which institutional knowledge is documented and accessible internally
Including lessons learnt, decisions, strategy effectiveness, Indigenous knowledge, scientific evidence, etc.
The purpose of this tool is to allow teams to easily track not only whether they have all the elements of a good strategic plan, but also elements of implementation efficiency, project support, and the delivery of outcomes and impact.
In this paper, Salafsky et al. (2019) construct a typology of different kinds of information, hypotheses, and evidence and show how these different types can be used in different steps of conservation practice.
This guide describes how program design teams can use results chains to clearly articulate outcome statements and develop indicators for managing biodiversity programs. It also clarifies how teams can use the same indicators, derived from the same results chains, for multiple purposes, including monitoring, evaluation, and learning across programmatic scales.