In 2011, the research firm Rockman et al. conducted a national program study that used performance assessments, surveys, and standardized test scores to evaluate students’ research and writing skills, ability to interpret historical information, academic performance, and interest in past and current events. The researchers then compared students who participated in National History Day (NHD) with peers who did not participate in the program.
The study, conducted at four sites around the country, found that on nearly every measure, NHD students’ scores or ratings were higher than those of their peers who did not participate in the program.
Affiliate-level evaluations and reports can supplement the national study and showcase the work you have done in your program. Among other things, Affiliate-level evaluation can help you to:
Fundraise
Report on your work to donors or your sponsoring institution
Build a case for teacher recruitment
What type of evaluation do you plan to do? For the most part, surveys are the easiest tools to use and will reach the broadest audience. Focus groups can provide more detailed feedback as you work through specific problems, but they are often more time-consuming and expensive.
How do you gain consent? When surveying minors, you must include language to gain parental consent in your contest waivers.
How will you collect and analyze the data? Paper surveys distributed at workshops or contests may be easier to administer and may yield higher response rates, but the responses must be entered manually into a tool to calculate results. When you have email addresses for participants, online surveys are likely easier to administer. A Google Form may work for simple surveys. For more advanced surveys and data analysis, you may want to consider a paid platform such as Alchemer (formerly known as SurveyGizmo) or SurveyMonkey.
Will your survey be anonymous? The most honest feedback comes from anonymous surveys. What sorts of personally identifiable information do you actually need?
What will you do with it? If you are going to the work of collecting feedback, plan to do something with it. Create a report that shares the success of your program. Though it can occasionally be uncomfortable, look through the responses and identify the changes you can make based on user feedback.
Minnesota: At a Glance, Classroom Outreach, Educators, Participation Snapshot, Student Outcomes
Crafting an evaluation can be tricky. Here are some tips for any evaluation:
Begin by turning your learning outcomes into questions that will help determine whether these have been achieved.
Limit the number of open-ended questions, which are time-consuming to complete and compile.
Strive for multiple-choice questions with an even number of potential responses – when you use an odd number, many people will gravitate toward the middle option.
Allow anonymous responses, but include an area where responders can self-identify.
Try to keep it to two pages at most.
An online survey tool enables you to download the results into a spreadsheet and will save a lot of time, particularly if you have a large number of responders. SurveyMonkey and Alchemer (formerly SurveyGizmo) are two online platforms that allow you to create limited, free surveys that automatically tabulate the results. Google Forms is free and will allow you to create longer surveys, but may be limited in its ability to "crunch the data." If you choose to use an online survey, send it within a few days after the event so that it is still fresh in the minds of your participants, and consider providing an incentive for them to complete it, such as automatic entry into a drawing for a museum membership or a discount coupon to a bookstore.
Sometimes a paper survey is best, particularly if you have a small group assembled (e.g., a workshop) and can hand out and collect the survey the same day. Even if you distribute the survey on paper, you can still use an online tool to tabulate the results: build the same questions from your paper survey into the online tool and manually enter each response.
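If your survey tool can export responses as a spreadsheet or CSV file, even a short script can handle the tabulation when the built-in reporting falls short. Below is a minimal sketch in Python, assuming a hypothetical export file named responses.csv and a question column labeled "Overall rating"; it simply counts how many respondents chose each answer.

```python
import csv
from collections import Counter

# Minimal sketch: tally the answers to one multiple-choice question
# from a survey export. "responses.csv" and the column name
# "Overall rating" are placeholders -- substitute your tool's actual
# export file and question header.
def tally_question(csv_path, question):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            answer = (row.get(question) or "").strip()
            if answer:  # skip blank responses
                counts[answer] += 1
    return counts

if __name__ == "__main__":
    results = tally_question("responses.csv", "Overall rating")
    total = sum(results.values())
    for answer, count in results.most_common():
        print(f"{answer}: {count} ({count / total:.0%})")
```

A similar approach works for paper surveys: enter the responses into a spreadsheet first, export it as CSV, and run the same tally.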
While not based on user feedback, another useful tool is an assessment or report of how the National History Day program meets your state standards. This can be a great tool for teacher recruitment, for teachers to justify their participation in the program, or for demonstrating the program's relevance to potential donors.
So, you've pulled off this fabulous workshop for teachers in your Affiliate, and it seems like everything went smoothly. How do you know if the teachers liked it, learned anything, and will implement what they’ve learned in their classrooms? Evaluations are an important tool to get specific feedback on all areas of the program and the contest. They also send a message to your audience that you value them and care about their experience.
Surveying students is a great way to honor their participation and demonstrate that you value their perspective. You can also ask questions that get at changes in behavior (did they visit a library? read more difficult sources?) and attitude (are they more interested in attending college?) that demonstrate the value of the National History Day program beyond the competition.
These sorts of informal findings can provide a great supplement to NHD's formal evaluation by Rockman et al. Your institution and program donors may value this snapshot of how the program works in your Affiliate.
If you plan to survey students, include consent language in the contest waivers that parents/guardians sign. Let them know how the survey will be distributed, that responses have no impact on a student's success in the competition, and that responses will be collected anonymously (ideally) and used to improve the program.
Minnesota History Day Student Regional and Student State Surveys
Judges are a unique and important subset of your audience who can provide feedback on their experience with your contest.
Ask basic comfort questions about how easily they found their way, whether they enjoyed their lunch, etc.
Also ask about the helpfulness of your materials in preparing them to judge, the effectiveness of your orientation, and whether they found the judging process fair and not overly cumbersome.
You may also want to know what they thought of the quality of the entries overall and their most memorable experience of the day. This can be interesting from the perspective of both novice and veteran judges.
As with your other evaluations, try to keep this one to two pages at most and avoid open-ended questions. While an online survey may seem like a better way to save time compiling responses, asking judges to complete a paper survey before they leave for the day will likely yield a higher response rate and sharper answers. Plan ahead to have an intern or a volunteer compile the responses during the week after the contest.
Given the wide variety in parental involvement, it will be a challenge to develop a survey that means the same thing to more than a few parents. For example, simply asking parents to rank the usefulness of the Rule Book will not uncover the fact that some parents will never have heard of it while others will have read it cover to cover.
Should you want to survey parents, though, there is a lot you can learn if you give some thought to the structure of the questions. As for format, consider using an online survey program and emailing the survey to parents.
In addition, some teachers may be interested in learning more about the parent experience. Consider summarizing and sharing some of this feedback with teachers to help them improve their school-level programs.
Minnesota History Regional Parent and State Parent Surveys