According to Peersman (2014), well-chosen and well-implemented methods for data collection and analysis are essential for all types of evaluations. This evaluation of Scouts Canada is an impact evaluation, that is, an evaluation that provides “information about the intended and unintended long-term effects produced by programmes or policies” (Peersman, 2014), so data collection and analysis methods are necessary for responding to my evaluation questions (UNICEF Innocenti, 2014). Data collection and data analysis will be conducted by Scouts leaders, coordinators, and parent volunteers. I believe a democratic evaluation approach would be most appropriate for Scouts Canada when using impact evaluation. The purpose of an impact evaluation is to measure impacts and understand the extent to which the programme or policy contributed to them, so that a judgement can be made about the programme’s or policy’s value. Democratic evaluation is an inclusive approach to “evaluation emphasizing participation and collaboration can enhance the efficiency of data collection, improve learning, and strengthen commitment to act on results and also reflect the highest aspirations and ideals of a democratic society” (BetterEvaluation, 2016).
Data collection examples include pre-/post-programme surveys, key informant interviews, observation of programme implementation, and review/discussion.
Data analysis examples include content analysis, thematic coding, summary statistics, parametric tests, and time-series analysis.
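As an illustration of summary statistics on a pre-/post-programme survey, the short sketch below compares hypothetical confidence ratings (1–5 scale) before and after the programme. The scores are invented purely to show the calculation and do not come from any real Scouts data:

```python
import statistics

# Hypothetical pre-/post-programme confidence scores (1-5 scale)
# for the same seven participants; illustrative data only.
pre = [2, 3, 2, 4, 3, 2, 3]
post = [4, 4, 3, 5, 4, 3, 4]

# Per-participant change from pre to post
changes = [after - before for before, after in zip(pre, post)]

print("Pre-programme mean:", round(statistics.mean(pre), 2))
print("Post-programme mean:", round(statistics.mean(post), 2))
print("Mean change:", round(statistics.mean(changes), 2))
print("Std dev of change:", round(statistics.stdev(changes), 2))
```

A positive mean change with a small spread suggests a consistent shift across participants; a parametric test (such as a paired t-test) could then assess whether the change is statistically significant.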
There are four items to keep in mind while a monitoring and evaluation specialist is conducting data collection and data analysis (UNICEF Innocenti, 2014):
1. Collect both qualitative and quantitative data to increase the credibility of answers to the key evaluation questions.
2. Examine what data are already available, such as programme performance and monitoring data, records and communication, statistics, or surveys. Then decide what additional data need to be collected as part of the evaluation.
3. Check whether anything has been overlooked by using a data collection matrix to answer each key evaluation question (see below for a Data Collection and Data Analysis Matrix example).
4. Check whether the selected methods are feasible in terms of time, capacity, and financial resources.
Key informant interviews involve “interviewing people who have particularly informed perspectives on an aspect of the program being evaluated” (BetterEvaluation, 2016). Monitoring and evaluation specialists, Scouts participants, parent volunteers, and Scouts leaders/coordinators are the people I would interview.
Sample key informant interview questions:
1. Describe your personal experience with the Scouts program.
2. What did/didn’t you like about the Scouts program and why?
3. Which outdoor activity do you enjoy the most/least and why?
4. Can you describe a situation that demonstrates your confidence/leadership skills during these activities in the Scouts program?
5. Can you tell me about the new friends you made in the Scouts program?
Data Analysis
Data analysis will be conducted by the Scouts leaders/coordinators. Data analysis involves reviewing the data to answer the research question and understanding its significance (Speak Up). The following steps will help me get organized:
Step 1: Organize and prepare the data for analysis.
Step 2: Ensure data confidentiality.
Step 3: Begin to make sense of the data.
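As a sketch of how open-ended interview responses might begin to be made sense of through thematic coding, the script below counts how many responses touch each theme. Both the responses and the keyword-to-theme coding scheme are hypothetical examples; real thematic coding would involve careful human judgement rather than simple keyword matching:

```python
from collections import Counter

# Hypothetical open-ended interview responses (illustrative only)
responses = [
    "I made new friends and gained confidence on the camping trip",
    "The leadership tasks were hard but I enjoyed working as a team",
    "Camping taught me confidence and outdoor skills",
]

# Hypothetical coding scheme mapping each theme to its keywords
themes = {
    "confidence": ["confidence", "confident"],
    "friendship": ["friends", "team"],
    "outdoor skills": ["camping", "outdoor"],
}

# Count a response toward a theme if any of its keywords appear
counts = Counter()
for text in responses:
    lower = text.lower()
    for theme, keywords in themes.items():
        if any(word in lower for word in keywords):
            counts[theme] += 1

print(counts)  # how many responses mention each theme
```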
An impact evaluation needs to draw conclusions about a large population. In the case of Scouts Canada, there is a large number of participants, so using a sampling method helps draw a representative set of participants with which to assess the program’s impact. Probability and purposive techniques are the two main sampling techniques able to cover all participants. Probability techniques draw the sample at random from all participants, which meets the aim of the democratic evaluation approach to serve the whole community. This allows people to be informed of what others are doing and sees the evaluator as someone who brokers the process (BetterEvaluation, 2016). Purposive techniques focus on specific cases chosen through a transparent selection process. Democratic evaluation focuses on inclusive practices that foster participation and collaboration, ensuring public accountability and transparency. The quality of the data needs to be managed throughout all stages of the evaluation process. The fundamental elements of data quality are validity, reliability, completeness, precision, integrity, and timeliness (UNICEF Innocenti, 2014).
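A probability (simple random) sample like the one described above can be drawn with a few lines of code. The participant roster below is hypothetical, and the 10% sample size is an assumption chosen for illustration:

```python
import random

# Hypothetical roster of 200 Scouts participants (illustrative names)
participants = [f"scout_{i:03d}" for i in range(1, 201)]

random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(participants, 20)  # 10% simple random sample

print(len(sample))                        # 20 participants selected
print(len(set(sample)) == len(sample))    # sampled without replacement
```

Because `random.sample` gives every participant an equal chance of selection, the resulting sample supports the democratic aim of representing the whole community rather than a hand-picked subset.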
Evaluation findings should follow the Ethical Guidelines for Evaluation. Scouts Canada should also consult Ethical Research Involving Children for advice on how to conduct evaluations involving children. The Ethical Guidelines for Evaluation comprise a set of guidelines for participants in the evaluation: respect for dignity and diversity, rights, confidentiality, and avoidance of harm. When reporting the evaluation findings, we should engage a variety of stakeholders. All evaluation findings should be reported to key stakeholders in a formal evaluation report at the end of the evaluation, available in both electronic and print copies. The content of the report should be written in a comprehensive and concise manner that is understandable to all stakeholders. It will include a brief background of the Scouts program and the impact of the program. The data should be kept simple, for example by using graphs and tables. I will make sure the report focuses on the key evaluation questions, with clear visualizations to strengthen the message.
Utility Standards are “intended to ensure that an evaluation will serve the information needs of intended users” (Sanders, n.d.). By using a democratic evaluation approach, all Scouts Canada program participants are involved in the evaluation process. Evaluators such as the leaders/volunteers/coordinators have years of experience to support their credibility. The information collected through the key evaluation questions is broadly selected to address questions about the program and is responsive to the needs and interests of Scouts. The evaluation report clearly states Scouts Canada’s context, purposes, procedures, and findings. The evaluation report is produced at least once a year to ensure the program remains effective and meaningful for Scouts.
Feasibility Standards are “intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal” (Sanders, n.d.). When conducting an evaluation, evaluators go to the sites where Scouts Canada programs are running. The evaluation is practical, and disruption is kept to a minimum while information is obtained. Because the target audiences of democratic evaluation include multiple stakeholders, political viability is ensured, and the evaluation engages various interest groups. The evaluation is cost-effective because the stakeholders are volunteer-based.
Propriety Standards are “intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results” (Sanders, n.d.). The evaluations are designed to assist Scouts Canada in addressing and effectively serving the needs of Scouts. Formal agreements are necessary and need to be signed by the parents/guardians of all Scouts. The impact evaluation using a democratic approach is designed and conducted to respect and protect the rights and welfare of human subjects. Interaction between Scouts and leaders/volunteers/coordinators is respectful and positive. They spend time together to accomplish tasks, which helps them establish relationships throughout the program. It is important for coordinators/volunteers/leaders to develop connections with Scouts, as this maintains open conversation when there is a conflict of interest. Open-ended questions in the assessment promote improvement of the Scouts program. The purposes and procedures of the evaluations are monitored and described in detail.
Accuracy Standards are “intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated” (Sanders, n.d.). The evaluations aim to improve the Scouts program; therefore, documentation will be clear and accurate. The context of the Scouts program is presented in detail, and the evaluation explains its purposes and procedures. The information about Scouts programs is valid, reliable, and systematic. Both quantitative and qualitative information are appropriately and systematically analyzed, using open-ended evaluation questions, surveys, interviews, and observation. The conclusions of the evaluation are explicitly justified. Collaboration among all people involved will help keep both formative and summative evaluation free of bias and personal feelings, so that the evaluation fairly reflects the findings.