Challenge 1: Implement formative evaluation plans
Criteria for successful completion of this challenge: Evidence of implementing a formative evaluation plan to provide information that can be used to make adjustments and improvements in the design. Evidence must show a formative evaluation plan (expert review, one-to-one evaluation, small group, and field trial). Reflection must address: Which phase(s) of formative evaluation did you conduct? What data did you collect (e.g., clarity and accuracy of instruction, general attitudes, procedural issues)? What were the results of the formative evaluation and how did they affect your design?
Examples: Evaluation Plan (EDCI 528), Design Documents (EDCI 572), Learning Module (EDCI 575), eLearning Project (EDCI 569), artifacts showing strategies for implementation of an evaluation plan (design, performance, workplace, educational, other).
Reflection
The challenge is to show that you know how to implement a formative evaluation plan. In EDCI 572, using the Dick & Carey model of instructional design, I designed and developed a fully functioning yoga teacher training module that taught learners how to follow a particular method for designing a cohesive and logical yoga class, which I called the Class Blocking Methodology. The primary artifact included here is an Implementation and Formative Evaluation Report, and the secondary artifact is the design document that lays out the implementation and evaluation plan. Together, these documents show that I am capable not only of designing an effective implementation and evaluation plan but also of actually deploying it. I pilot-tested the learning module with two representative learners, and the Report showcases my findings and the adaptations I made to the module as a result of formative feedback.
As we learned in the Program Evaluation course (EDCI 577), using the Kirkpatrick Four Levels Model, one should not end with evaluation but start with it. Evaluation is therefore central to the field of instructional design: it is ingrained and integrated into everything we do, and the logical design and implementation of quality formative and summative evaluations is paramount to success. The artifacts demonstrate the competency because the Design Document lays out the theoretical plan and the Report lays out the findings of the implemented plan. One of the challenges of being an instructional designer is avoiding purely theoretical design; what you design and develop must also be practical and of high quality for the learners. My artifacts show that I can embody both theory and practice, learning from my initial plan in order to improve the learning experience and to be a good listener and ally to the needs of the learners.
At the time of the EDCI 572 course, I had never followed a formal instructional design methodology, and the Dick and Carey model was quite overwhelming. I did, and still do, appreciate it as a very comprehensive approach to designing a well-oiled, well-thought-out learning experience that meets the high academic standards of programs like the LDT program at Purdue. Prior to joining this program, I spent over 15 years teaching English in a higher education setting and worked successfully with instructional designers to improve my teaching and course design toolkit. Luckily, I also brought with me non-academic teaching experiences that aided me in creating these artifacts. I had designed an assessment plan for a co-curricular student services office and had the pleasure of being mentored by a veteran in the assessment arena. In addition, as a subject-matter expert in the teaching of yoga, I was able to foresee the needs of learners, which aided me in creating the implementation and evaluation plan: I picked pilot testers who could give me targeted feedback based on their own backgrounds and experience teaching yoga. Nonetheless, designing these artifacts as part of the full learning module was a first for me, and had I had this knowledge 10 years ago, I know I would have been able to impact my students more with better-designed coursework. I am glad to know it now!
These artifacts work well for this challenge because the Report shows what occurred during the actual formative evaluation (pilot test) of the module and how I responded to my pilot testers’ feedback in order to improve the module. My original plan, as outlined in the Design Document, was solid, and I was very comfortable adapting pieces of content based on learner feedback because I had chosen my pilot testers wisely. The only thing I would change about the experience is the amount of time available to deploy the pilot test; it was a very quick turnaround. Had I had more time, I would have secured a few more learners for the pilot test. When I am a practicing instructional designer, I will seek opportunities that allow for more quality time in the pilot-test evaluation phase. With this experience under my belt, however, I am very confident in my ability to follow a formalized framework not only to create my products but also to adjust them so they are optimized for learner retention and experience.
Artifacts
Design Document for Yoga Module, including Implementation and Evaluation Plan
Implementation and Formative Evaluation Report from Pilot Test of Yoga Module
Challenge 2: Implement summative evaluation plans
Criteria for successful completion of this challenge: Evidence of implementing a summative evaluation plan to evaluate the effectiveness of the instruction and decide whether to continue to use instruction. Evidence must show an evaluation plan (e.g., Kirkpatrick’s Four Levels of evaluation). Reflection must address: If the implementation of the summative evaluation met your expectations. What were the results of the summative evaluation (did you continue with program/instruction, did you cancel it, did you modify it)?
Examples: The following assignments are applicable if implemented: Evaluation Plan (EDCI 528), Evaluation Plan (EDCI 577), artifacts showing implementation of an evaluation plan (design, performance, workplace, educational, other).
Reflection
The challenge is to implement summative evaluation plans. The artifacts I have chosen to illustrate my competency are the two Evaluation Plan documents (Evaluation Proposal and Training Evaluation Plan) from EDCI 577 and the Gantt chart created to support them by outlining the program evaluation timeline. EDCI 577 was designed so that partners completed the Evaluation Plan together; my partner was Ritika Bhargo Chari, and we collaborated on the creation of these documents. The Evaluation Plan documents showcase our thorough understanding of how to apply Kirkpatrick’s Four Levels of Program Evaluation model to my real-world job at a hospital system. I work in breast cancer education and support, and though we did not use the real name of my program, we referred to it as BCESSP. I have since used these documents to translate the work Ms. Chari and I did in this class into a real-world context, both for program evaluation of the support groups and learning experiences I am curating and facilitating and for a recent grant application that needed solid program evaluation criteria. Though the full program evaluation as laid out in these documents has not yet been deployed in the real world, parts of it have been, and others are being adapted to real-life scenarios, taking theory into practice. The full-scale program evaluation laid out here provides a window into my potential, and Ms. Chari’s, for effectively deploying a summative evaluation plan to evaluate the effectiveness of a program and its offerings.
After taking EDCI 577, it became very clear to me that program evaluation was one of the most important things I could learn in the LDT program at Purdue. Because of my background in higher education and teaching college-level writing, as well as my time directing a university writing center and designing an assessment plan to validate it for the Higher Learning Commission’s annual accreditation review, I know how important strong assessment and evaluation design skills are in the real world. As we learned with the Kirkpatrick Model, the Design phase should start with Evaluation in mind, not as an afterthought. I now more fully understand the importance of Level 3 and Level 4 benchmarks and assessments for showing that learning has occurred over the long term and for proving that organizational benchmarks have been met. Within these artifacts, we painstakingly designed the assessment instruments and assessment plan to validate all four levels, so that the surface-level Levels 1 and 2 were covered but were not the only things evaluated. To create the Level 4 benchmarks, I had to dig deeply into accreditation standards from the National Accreditation Program for Breast Centers (NAPBC) and the National Comprehensive Cancer Network (NCCN) to match our assessment instruments to their guidelines. Being able to do this was essential to the success of this program evaluation, and in my job I have been able to easily adapt pieces of the program evaluation to my real work in breast cancer education, all because of the hard work Ms. Chari and I put into these program evaluation documents, which show a holistic and vetted approach to summative program evaluation.
My prior experience designing an assessment plan that met Higher Learning Commission standards helped me tremendously in framing Level 4 benchmarks around NAPBC and NCCN guidelines. Because I had designed a high-level assessment plan for a student services office, I already understood the need to meet benchmarks, though at the time I did not know about Kirkpatrick’s Four Levels Model or about identifying and meeting Level 4 outcomes. Had I known about this thorough approach, I would have used it to design program assessments for other work I have done, including DEI-specific educational workshops and staff training and development at the university and community college where I worked. Now that I know how to design a formative and summative assessment plan and evaluate a program using Kirkpatrick’s Four Levels, I can only imagine how much more I will be able to accomplish with this experience from Purdue’s LDT program.
The artifacts I’ve chosen to support my practice of this competency show that I am able to design an effective evaluation plan, as well as work on a team to conduct program evaluation, since Ms. Chari and I created these documents together. Though I have not fully deployed the evaluation plan found in these artifacts in my job, I have used pieces of the four-level evaluation plan to design formative and summative assessment instruments for my breast cancer education program’s offerings, and I have used pieces of the design document to develop learning objectives and an assessment plan for a grant we recently applied for to create educational programming for metastatic breast cancer patients and survivors. I will continue developing professional acumen in this area in my current job: the longer I am in it and can collect data, the more likely it is that I can conduct a full program evaluation using the full scale of what Ms. Chari and I developed. The Program Evaluation course (EDCI 577) is the most valuable course I have taken at Purdue because it has helped me build on a foundation I already had and has greatly increased my understanding of the importance of Level 3 and Level 4 evaluation. In my current work in a hospital system, Level 4 evaluation directly connects to meeting hospital accreditation standards, which means I can always demonstrate the value of my work by designing quality learning experiences that meet those objectives. These artifacts show a very comprehensive, current understanding of that.
Artifacts
Evaluation Proposal for BCESSP Program Evaluation using Kirkpatrick's Four Levels Model
Training Evaluation Plan for BCESSP Program Evaluation using Kirkpatrick's Four Levels Model
Evaluation Plan Timeline as a Gantt Chart
All artifacts copyright 2025 Cindy Spires Malerba and Ritika Bhargo Chari. All rights reserved.