A Formative Evaluation of the Recruiter Training Program

Background

Organization

Magic Lamp, Inc. (a pseudonym) is a hospitality, media, and entertainment company in the United States. Founded in 1928, the company has diversified its portfolio to include film, television, broadcasting, theme parks, consumer products, publishing, and international operations. The organization has experienced rapid growth in the number of hourly employees at its theme parks, and in response, the recruitment team increased the number of recruiters on staff.

Program

The Recruiter Training Program was launched in 2016 to support new recruiters joining the company. In May 2021, the Training department redesigned the program into a fully online format condensed to one week. The current onboarding process is a five-day blended-learning program delivered through live group sessions and self-guided video instruction. Additional training days are built into the overall plan to provide extra learning support at the learner's request and/or at the training manager's recommendation. The focus of this project is to evaluate the strengths and areas of opportunity of the Recruiter Training Program. The intended impact of the program is illustrated in the Recruiter Training Impact Model (Appendix A).

Table 1: Training Impact Model (Appendix A)

Stakeholders

Magic Lamp, Inc. has three types of stakeholders/impactees:


Upstream Stakeholders

Several stakeholders played a role in the design, development, and delivery of the courses included in the Recruiter Training Program. The key upstream stakeholders are the following:

  • Manager of the training department (Client)

  • Two instructional designers

  • Six trainers


Direct Impactees

New recruiters are the primary users or direct impactees of the Recruiter Training Program. At the time this proposal was written, 13 new recruiters had recently completed the training program. On average, 15 new recruiters are expected to complete the program each quarter. The direct impactees include:

  • New recruiters (n=60/year)


Indirect Impactees

Success or failure of the Recruiter Training Program would have an indirect impact on the following groups of people:

  • Currently staffed team of recruiters and coordinators (n ≈ 159)

  • Up to 3,000 external candidates who interact with an individual recruiter each year

Evaluation Request

While the Recruiter Training Program received strong support from stakeholders and informal positive feedback from participants, it had not yet undergone a formal analysis and evaluation. In January 2022, the Manager of the Training department requested an evaluation of the program to identify its strengths and weaknesses.

Evaluation Methodology

Evaluation Purpose and Type

Based on discussions with the client and instructional designers, we determined that the evaluation findings would be used (1) by the training department leadership team, including the instructional designers and trainers, to revise and improve the program's course objectives and assessments, and (2) to provide new employees with adequate training support to help them become productive as soon as possible.

Because the leadership team at Magic Lamp, Inc. intended to use the evaluation findings to identify areas for improvement, the evaluation team conducted a formative evaluation of the Recruiter Training Program. We incorporated a goal-free evaluation approach to capture both positive and negative outcomes of the training program. We followed Chyung's (2019) 10-step evaluation process, which comprises identification, planning, and implementation phases (Figure 1). This systematic process guided us in determining what data were needed for each identified dimension and how we would collect and triangulate those data. The evaluation was also systemic in nature: we moved back and forth between steps and phases as modifications were made.

Figure 1. Chyung’s (2019) 10-step evaluation process

Table 2: Data Collection Methods

Dimensions and Data Collection Methods

Based on input from the client and additional stakeholders, we identified two dimensions and their relative importance weighting (IW):

  1. Instructional Design in Lessons (Very Important): How well are instructional objectives and assessments aligned?

  2. Performance Support (Extremely Important): How much support is provided at the performer level, process level, and organizational level?

We used multiple sources of data, including interviews with instructional designers, trainers, and new recruiters (program participants). The evaluation team also used multiple data collection methods, each grounded in an established framework:

  • Survey: web-based survey based on Rummler's Nine Boxes Model

  • Interview: semi-structured Zoom interviews aligned with Kirkpatrick's Level 1 and Level 2

  • Extant data review: existing instructional plans documenting course objectives and assessments, reviewed against the first three levels of Bloom's Taxonomy (Knowledge, Comprehension, Application)

Results

Dimension 1: Instructional design in the lessons

The dimension question was, “How well are instructional objectives and assessments aligned?” (per Bloom's Taxonomy)

Extant Data

The evaluation team reviewed the available training resources for the Recruiter Training Program, including instructional plans, video tutorials, and job aids (n=15). The team analyzed each item using a rubric based on Bloom's Taxonomy, which rated the intended level of the learning objectives, the level supported by the instructional activities, and the assessment methods. This framework supported a systematic review of the extant data to determine whether the objectives and assessments were aligned with the resources and activities.

  • 12 of the 15 data sources reviewed did not support all three levels (Knowledge, Comprehension, and Application)

  • 15 out of 15 data sources contained resources and activities that reached only the first level (Knowledge)

  • No formal assessment tool was identified in the training program (0 out of 15)


The results from our rubric fall into the "Unacceptable" category.
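To make the tallies above easier to reproduce, the following is a minimal Python sketch of how the rubric counts could be computed. The sample records and field names are hypothetical illustrations, not the evaluation team's actual instrument or data.

    # Hypothetical rubric records: for each reviewed data source, the Bloom's
    # Taxonomy levels (Knowledge, Comprehension, Application) that its
    # objectives, activities, and assessments were judged to support.
    BLOOM_LEVELS = {"Knowledge", "Comprehension", "Application"}

    sources = [
        {"name": "Instructional Plan 01", "levels_supported": {"Knowledge"}},
        {"name": "Video Tutorial 02", "levels_supported": {"Knowledge"}},
        # ... the remaining reviewed sources would be recorded the same way ...
    ]

    total = len(sources)
    # Sources that do not support all three rubric levels
    not_all_levels = sum(1 for s in sources if not BLOOM_LEVELS <= s["levels_supported"])
    # Sources whose resources/activities reach only the Knowledge level
    knowledge_only = sum(1 for s in sources if s["levels_supported"] == {"Knowledge"})

    print(f"{not_all_levels} of {total} sources do not support all three levels")
    print(f"{knowledge_only} of {total} sources reach only the Knowledge level")

Running the same tally over the full set of 15 reviewed sources is what produced the counts reported above.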

Survey

The evaluation team conducted an anonymous, web-based survey to assess how new hires felt about the Recruiter Training Program (N=13, n=11, 85% participation). We developed questions focusing on the design and structure of the course and on whether trainees could apply the tools when building their skills. We used a 5-point Likert scale to assess whether the course's learning objectives and assessments were aligned.

  • 90% of respondents agreed that the training resources and tools were effective in helping new hires meet the learning objectives

  • 86% of respondents agreed that they had adequate opportunities to apply the skills they acquired in the program


The results from our rubric fall into the "Excellent" category.
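As an illustration of how the participation rate and agreement percentages above can be derived from 5-point Likert responses, here is a minimal Python sketch. The response values and the rule that "agreed" means a rating of 4 or 5 are assumptions for the example, not the actual survey data.

    # Participation rate reported above: N = 13 eligible new hires, n = 11 respondents.
    N, n = 13, 11
    print(f"Participation: {100 * n / N:.0f}%")  # ~85%

    # Hypothetical 5-point Likert responses for one survey item
    # (1 = Strongly Disagree ... 5 = Strongly Agree), one value per respondent.
    responses = [5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4]

    # Assumption: "agreed" counts ratings of 4 (Agree) or 5 (Strongly Agree).
    agreed = sum(1 for r in responses if r >= 4)
    print(f"{agreed} of {len(responses)} respondents agreed "
          f"({100 * agreed / len(responses):.0f}%)")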

Interviews

The team conducted confidential, semi-structured Zoom interviews with two instructional designers, two trainers, and three new hires (trainees). Interview questions were designed to align with the survey, asking participants about learning objectives, training resources and activities, and assessment methods. Responses were rated on a rubric of Excellent (overwhelmingly positive comments), Acceptable (mostly positive, few negative comments), and Not Acceptable (overwhelmingly negative comments).

  • Trainers - Not Acceptable

  • Instructional Designers - Acceptable

  • New-Hires (trainees) - Acceptable


Dimension 2: Performance support

The dimension question was, “How much support is provided at the performer level, process level, and organizational level?” (per Rummler's Nine Boxes)


Survey

The evaluation team conducted an anonymous, web-based survey to assess how new hires felt about the Recruiter Training Program (N=13, n=11, 85% participation). We developed questions focusing on the individual learning community, the teaching and learning, and how trainees could apply what they learned in their role. Responses were collected on the same 5-point Likert scale.

  • 97% of respondents agreed that they felt a sense of belonging and connection with the organization's values

  • 89% of respondents agreed that they had adequate support throughout the training program


The results from our rubric fall into the "Excellent" category.

Interviews

The same populations were interviewed as in Dimension 1. The following are the results:

  • There were overwhelmingly positive remarks from the new-hire trainees.

  • Respondents enjoyed having daily check-ins with the trainers to get the support they needed.


The results from our rubric fall into the "Excellent" category.



Dimensional results are summarized in Table 3 below.

Table 3: Summary of dimensions and results

  • Dimension 1: Instructional design in the lessons (Very Important). Extant data review: Unacceptable; Survey: Excellent; Interviews: Trainers Not Acceptable, Instructional Designers Acceptable, New-Hires (trainees) Acceptable.

  • Dimension 2: Performance support (Extremely Important). Survey: Excellent; Interviews: Excellent.

Conclusions

The evaluation team concluded that the program provided excellent training support and a welcoming environment, as indicated by the survey results and by the trainees during the interviews. However, although resources and videos were available, the program lacked clear objectives, learning plans, and a stated purpose for each lesson, and the extant data review revealed a misalignment between assessments and their respective learning objectives. These findings were triangulated across the extant data review, survey results, and feedback from both trainers and trainees.

Recommendations

After a full analysis of both dimensions, the evaluation team made several recommendations for the Recruiter Training Program. These recommendations were based on the organization's goals and on the feasibility and likelihood of implementation.

Dimension 1 Recommendations:

  1. Create daily instructional plans for trainers to follow (e.g., a lesson plan based on Gagne's Nine Events of Instruction). This will benefit the trainers, who expressed great frustration about not having proper support before leading each training session.

  2. Develop curriculum activities that address the learning objectives and allow learners to apply acquired skills (e.g., mock interviews or practicing the steps of processing an offer). This will benefit the learners, who expressed a desire for more opportunities for experiential learning activities.

  3. Design and develop a training assessment or performance assessment instrument (PAI), such as a feedback survey, knowledge-check quiz, or trainee observations. This will benefit both the learner and the trainer by helping customize learning support throughout the training; it will also measure the learner's progress and better assess their readiness at the conclusion of training.

Dimension 2 Recommendations:

  1. Performer Level: Continue to host the daily “Office Hour” group check-ins. Trainees described these meetings as critical to their learning.

  2. Process Level: Implement a peer-to-peer feedback system and/or a trainee-to-trainer coaching system. This could allow learners to give and receive feedback in a safe environment during mock interviews.

  3. Organizational Level: Recognize completion of the Recruiter Training Program in a formal setting. By showing support up and across the organization, this would signal to new hires their value to the organization.

References

Bloom, B., Krathwohl, D., & Masia, B. (1956). Taxonomy of educational objectives: The classification of educational goals (Vol. 1). Longmans, Green.

Chyung, S. Y. (2019). 10-step evaluation for training and performance improvement. Sage.

Kirkpatrick, D. (1996). Evaluating training programs: The four levels. Berrett-Koehler.

Rummler, G. A., & Brache, A. P. (1990). Improving performance: How to manage the white space on the organization chart. Wiley.

RummlerBrache Group. (2022). Rummler-Brache Performance™ Matrix. https://www.rummlerbrache.com/


Appendices

Appendix A. Recruiter Training Impact Model

Appendix B. Full Report - Evaluation Final Report