This project primarily aims to achieve two goals in Barangay Laram: (1) to develop a replicable instructional plan that includes a step-by-step process for effectively conducting barangay-led educational discussions; and (2) to address the increasing number of teenage pregnancies within the community through the developed instructional plan. The instructional design consists of comprehensive learning objectives aligned with Bloom's Taxonomy, strategically selected learning activities, and aligned formative and summative assessments.
The primary instructional design model utilized in this project is the ADDIE model. The project required a structured process to design and develop a barangay-led educational program on sexual and reproductive health. The ADDIE model ensured that each stage of the intervention was grounded in evidence and context-specific data.
The following is a detailed description of the process employed for this project:
This phase included needs, learner, context, program, and task analyses, conducted through a combination of online surveys and in-person interviews. For the online survey, a questionnaire was distributed to the invited youth participants (a copy of the online questions is shown in Figure 2). I distributed a different set of questions to the barangay leaders, health workers, and SK officials (shown in Figure 3).
Figure 3. Survey Questionnaire for the Barangay Leaders
The analysis phase identified community issues that could be addressed through education. Results showed that the SK lacked a structured training or instructional plan to guide its educational discussions. One major issue identified was the ongoing problem of teenage pregnancy in the barangay. Although prior SK initiatives aimed to combat teenage pregnancy, it remains prevalent among teenagers. Based on the youth survey, the primary perceived causes of teenage pregnancy were (1) lack of knowledge about sexual health (cited by 18 of 20 participants) and (2) peer pressure (cited by 17 of 20 participants).
In this phase, I translated the findings from the analysis into a concrete instructional plan. I set clear objectives for both SK officials and youth participants, aligned with the overall goal of improving knowledge and attitudes toward SRH. For SK officials and barangay leaders, I focused on building their skills to organize and facilitate structured educational discussions. For the youth, I centered the objectives on gaining accurate SRH knowledge and strengthening decision-making skills to resist peer pressure.
I selected instructional strategies that fit the cultural and social context of Barangay Laram. I embedded Gagné’s Nine Events of Instruction in the “During” phase of the manual. This complements the ADDIE framework that guided the whole plan. I designed the manual to follow this format:
What to Expect (Master Checklist) → Setting Up Page → Before Phase → During Phase → After Phase
In the What to Expect page, I provided a master checklist for SK officials so they can see everything they need to accomplish for the seminar. In the Setting Up page, I included short explanations of the basics every seminar facilitator should know and prepare, such as budgeting, identifying the target group, choosing a subject matter expert, and securing the venue.
In the Before Phase, guided by the analysis and design stages, I laid out a step-by-step process from identifying the problem up to preparing the seminar materials. In the During Phase, implementation begins. Here I applied Gagné’s Nine Events of Instruction to guide the flow of the sessions.
Figure 4. Gagné’s Nine Events of Instruction.
From Gagné’s Nine Events of Instruction [Image], by CourseArc, 2015, https://www.coursearc.com/gagnes-nine-events-of-instruction/
Finally, the After Phase focuses on evaluation and assessment, including how implementers can monitor the seminar’s effectiveness even after the completion of the seminar.
I also designed the materials to be learner-centered and age-appropriate. I created booklets and visual aids, and I selected YouTube videos to complement the discussions. I applied Universal Design for Learning (UDL) to ensure inclusivity, offering multiple means of engagement, representation, and expression. I also drafted assessment tools, including pre-tests, post-tests, and feedback forms, to measure knowledge gains and program effectiveness.
In this phase, the learning materials and instructional plan were produced. Modules were drafted based on the objectives set in the design phase, with input from health workers and gatekeepers to ensure the SRH plan was accurate and aligned with medical practice. Booklets were prepared with simplified language and appropriate visuals to support comprehension among adolescents with diverse literacy levels. A facilitator's manual was also created to guide SK officials and leaders in delivering sessions consistently. The materials were guided by Mayer's multimedia learning theory (Mayer, 2002).
Based on the survey from the analysis phase, participants selected printed materials as the preferred mode of instruction (16 out of 20 students), followed by educational videos and interactive group activities. Guided by UDL, various activities were employed: video watching, an individual activity, and two group activities. The evaluation tools, including pre- and post-test questionnaires, attendance sheets, and feedback forms, were finalized through a dry run with the SK officials to check clarity and reliability before the main seminar. The dry run also allowed us to establish confidence in executing the step-by-step process listed in the design-development manual.
In the implementation phase, I rolled out the seminar sessions in Barangay Laram together with SK officials, who served as facilitators and role models for the youth participants. I made sure that the sessions were delivered in the local language and included culturally relevant references so the discussions would be accessible and resonate with the community.
I grounded youth engagement in Social Learning Theory (Bandura, 1977) and applied three key components during instruction. First, I used reinforcement by acknowledging correct answers and decisions that reflected healthy behaviors. This consistent positive feedback encouraged participants to continue making constructive choices. Second, I relied on modeling, where both I and the SK officials acted as role models by providing accurate information and demonstrating the attitudes and behaviors expected from the youth. Third, I emphasized observation by creating opportunities for learners to learn not only from instruction but also from watching their peers during activities such as scenario analysis. I reiterated correct responses to reinforce recall, with the goal of encouraging learners to imitate positive behaviors, including assertively resisting peer pressure.
Figure 6. Actual picture from the seminar during the discussion
We also incorporated Constructivist Learning Theory (Piaget, 1954; Vygotsky, 1978). I encouraged participants to actively build knowledge by connecting new ideas to their prior experiences and social realities. I used interactive strategies such as group discussions, role-plays, and problem-solving tasks so that participants could share perspectives and collaboratively create new understandings.
To support the flow of the sessions, SK officials and youth leaders used the facilitator’s manual I designed. At the same time, I encouraged them to adapt activities depending on the group’s needs. This adaptive approach aligned with Participatory Action Research (PAR; Reason & Bradbury, 2008), where participants are treated as co-decision-makers and their insights are valued, especially when addressing sensitive issues like teenage pregnancy.
I also integrated formative assessments throughout the sessions. Following the PAR cycle of plan–act–observe–reflect, I used continuous feedback and reflection not only to improve the ongoing implementation but also to generate insights that would strengthen future programs.
I conducted both formative and summative assessments throughout the seminar. For formative evaluation, I gathered ongoing feedback from facilitators and participants during the implementation stage. I made real-time adjustments, such as modifying the pacing of discussions or clarifying complex SRH terms. For summative evaluation, I carried out tests and surveys after the seminar ended.
I compared pre-test and post-test results to measure knowledge gains among youth participants (see Appendix E: Test Forms). I also collected feedback forms from youth participants (see Appendix F: Post-Seminar Survey – Youth Participants) to capture their satisfaction and views on the usefulness of the instructional design and materials. For SK officials, I distributed feedback forms (see Appendix G: Post-Seminar Survey – SK Officials) to assess how well they could use the manual and their confidence in leading educational discussions. This allowed me to gauge their readiness to conduct youth-led seminars at the barangay level.
The data showed improvements in both content knowledge and facilitation skills among youth and SK participants. This validated the effectiveness of the ADDIE-based approach. I also documented recommendations for future seminars and for continuous refinement of the instructional plan. Finally, I analyzed the results from the forms we collected and applied Kirkpatrick’s Four Levels of Evaluation (refer to eJournal Week 8: Performance Evaluation & Post-Implementation Monitoring).
Departures from the Project Plan
The project was originally designed as a series of seminar sessions, but it was instead implemented as a one-day seminar due to scheduling conflicts with SK officials and overlapping barangay activities. The co-design phase also took longer than expected because youth participants suggested integrating cultural sensitivity and local examples. While these changes required more time, they improved the contextual relevance of the instructional materials and strengthened community ownership. I also realized that the pilot implementation was limited since it was conducted only with the SK officials and myself. Moving forward, I would recommend conducting a pilot with a small group of youth participants before the main seminar to capture more accurate feedback and ensure the activities truly match their needs.
Insights from the Project
Several key insights emerged from the seminar. I observed that interactive, dialogue-based approaches encouraged youth to speak more openly about sexual and reproductive health, which is often stigmatized. When learners felt comfortable with the facilitators and their peers, they became more confident in asking questions without fear of judgment. I also noticed that participants responded more positively to storytelling and peer-to-peer learning than to lecture-style teaching. Another important realization was the value of formative feedback. By embedding evaluation tools during sessions, facilitators were able to see if students were keeping up with the discussions, which was especially useful given the time constraints.
Challenges Encountered
The project also faced several challenges. One significant issue was the use of language that was not always accessible to all participants. Some learners had little or no prior knowledge of SRH, which made it difficult for them to understand certain terms. For the updated version of the instructional plan, I am simplifying the wording and strengthening visuals to make ideas clearer. Another challenge was underestimated resources. Venues, technology, and materials were not always reliable, which affected the flow of activities. In response, I suggest including contingency options such as low-tech activities and flexible room arrangements.

The one-time nature of the seminar was also a limitation. Unlike working with a stable group such as a school class, the varied participant pool made it difficult to tailor content or measure long-term outcomes. This limited opportunities for reinforcement and mentorship, which are essential for lasting behavior change. To address this, I see the need to link with schools, NGOs, or youth organizations to create more stable platforms for repeated sessions, peer support, and long-term monitoring.

Finally, logistical issues such as rescheduling, limited space in the barangay hall, and time constraints for SK officials posed barriers. Some officials could not attend the full session, so I provided them with the manual and gave a short orientation for later independent use. Stigma around SRH was another obstacle, but this was managed by establishing confidentiality rules and using icebreakers to build trust with the learners.
Table 1 directly addresses one of the specific objectives: to analyze the needs and gaps in barangay-level educational initiatives through surveys and interviews. The table consolidates findings from the needs, context, program, learner, and task analyses. These analyses provide evidence of the gaps in existing SRH seminars, highlight the socio-economic and cultural context of Barangay Laram, assess past program limitations, identify learner characteristics, and specify the tasks for both SK officials and youth participants. Presenting these findings in one table aligns the results with the study’s aim of creating a replicable instructional plan.
Table 1. Summary of Analyses for the Barangay-Led SRH Seminar.
I developed the guide using the ADDIE model. Each stage allowed me to ensure that the guide was evidence-based, responsive to the needs of the youth, and practical for use in barangay settings. I intentionally simplified it so that SK officials would feel comfortable and confident working with the manual, even without prior training in instructional design. Figure 7 shows the post-seminar survey form that I distributed to the SK officials.
As shown in Table 2, the SK officials demonstrated consistently high confidence in replicating the seminar, with all ratings falling between 4 and 5. Their confidence was strongest in identifying community needs, using the manual as a guide, and preparing materials, while slightly lower ratings appeared in facilitation and evaluation tasks. This tells me that while the officials feel well-prepared for planning and preparation, they may still need additional support in the implementation and monitoring stages.
Table 2. Post-Seminar SK Officials’ Feedback: Distribution of Ratings per Item.
A 10-item quiz (see Appendix E: Test Forms) was administered to gauge the learners’ pre-test and post-test scores. The pre-test and post-test results of the 20 youth participants were analyzed to determine the effectiveness of the intervention. Scores in the pre-test ranged from 3 to 10, while post-test scores ranged from 6 to 10. Figure 2 presents the visual distribution of scores.
Using the formula for Percentage Gain:
((Post-test Mean – Pre-test Mean) ÷ Pre-test Mean) × 100
The scores revealed that the mean pre-test score was 7.45 and the mean post-test score was 9.45, yielding a mean gain of 2.0 points. Applying the formula, this corresponds to a 26.85% increase in average knowledge scores, indicating a notable improvement in the participants’ understanding of SRH concepts after the seminar.
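The percentage-gain computation above can be verified with a short script; this is a minimal sketch using the mean scores reported in this section (the function name `percentage_gain` is my own for illustration):

```python
# Verify the reported percentage gain from the pre- and post-test means.
def percentage_gain(pre_mean: float, post_mean: float) -> float:
    """Percentage gain relative to the pre-test mean."""
    return (post_mean - pre_mean) / pre_mean * 100

# Reported means: pre-test 7.45, post-test 9.45
gain = percentage_gain(7.45, 9.45)
print(f"Mean gain: {9.45 - 7.45:.1f} points, {gain:.2f}% increase")
# Mean gain: 2.0 points, 26.85% increase
```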
Further analysis revealed that 19 out of 20 participants demonstrated improvement in their post-test scores, while 1 participant maintained the same score of 10. None of the participants scored lower in the post-test compared to the pre-test. Notably, participants who initially had lower pre-test scores (e.g., Student 1 with 3 points, Student 12 with 4 points) exhibited the highest gains, achieving 6 and 10 points respectively in the post-test. This suggests that the intervention was particularly effective for learners with limited prior knowledge.
On the other hand, participants who already had high pre-test scores (9–10 points) either maintained or slightly improved their scores. This shows consolidation of knowledge and reinforcement of previously acquired information.
These findings imply that the intervention addressed existing knowledge gaps among the youth, particularly in areas where misconceptions or lack of prior understanding were evident. The data provide evidence that structured, interactive, and participatory learning activities contributed to improved learning outcomes.
Figure 7. Comparison of the Pre-Test and Post-Test scores of the learners.
The post-seminar survey results from 20 learners indicate highly positive feedback. Most items were rated in categories 4 (Agree) and 5 (Strongly Agree), reflecting strong satisfaction with the instructional materials, facilitation, and seminar content. Several items stood out with very strong agreement. Items 1, 2, and 5 each had 19 learners selecting Strongly Agree, showing that participants found the handouts easy to understand, the information helpful for learning sexual health concepts, and the content applicable to real-life situations. Items 4, 6, 13, 14, 16, and 17 also received strong responses, with 15 or more learners selecting Strongly Agree. These results highlight clear explanations, encouragement of critical thinking, a comfortable learning environment, improved understanding, and greater awareness of risks.
Some items showed slightly more mixed, though still positive feedback. Item 3, on the visual appeal of materials, had three Neutral responses and fewer Strongly Agree, suggesting room to improve the design. Item 8, on learning at one’s own pace, had two Disagree and one Neutral, indicating that some learners wanted more flexibility. Items 11, 12, and 19 leaned positive overall but had more responses in the neutral or disagree ranges, pointing to areas for improving video content, engagement activities, and comfort in discussing sexual health.
The lowest rating appeared in Item 20, which asked about preparedness to seek services. This item included one Strongly Disagree and several mid-range responses. This suggests that while learners valued the seminar, greater reinforcement is needed in future sessions to build confidence in accessing reproductive health services.
Table 3. Post-Seminar Learner Feedback: Distribution of Ratings per Item
The analysis revealed a statistically significant increase in participants’ knowledge after the seminar, t(19) = 6.40, p < .0001. The mean difference between pre-test (M = 7.55, SD = 1.76) and post-test (M = 9.60, SD = 0.94) scores was −2.05, indicating improvement. The 95% confidence interval ranged from −2.72 to −1.38, confirming that the true mean difference is unlikely to be zero and reflects actual knowledge gain. Reporting both p-values and confidence intervals aligns with best practices in quantitative research (Greenland et al., 2016). These findings suggest that the instructional materials and seminar were effective in enhancing participants’ understanding of sexual and reproductive health.
The paired t-test was computed using the formula:
t = (M₁ − M₂) / (SD/√n)
where M₁ and M₂ are the means of the pre-test and post-test, SD is the standard deviation of the differences, and n is the sample size (University of Calgary, n.d.).
The evaluation tools used in the project—pre- and post-tests and feedback forms—aligned closely with Kirkpatrick’s Four Levels of Evaluation, as detailed in eJournal: Week 8: Performance Evaluation & Post-Implementation Monitoring.
My initial assumptions were a crucial learning point. I entered the project believing that a single, impactful seminar could create significant change in both knowledge and behavior. However, the implementation revealed the limitations of this one-time approach. While we successfully transferred information, the real-world questions from the youth—such as whether "biking can cause a loss of virginity" or if "Vitamin C can cause a miscarriage"—revealed a deeper, more fundamental issue. These misconceptions demonstrated that a single session can only address a knowledge gap; it cannot easily dismantle long-held cultural beliefs or deeply ingrained misinformation. This experience underscored the critical difference between knowledge transfer and a genuine shift in beliefs. As Kirby (2008) suggests, sustained and comprehensive education is essential for influencing long-term behavior.
I also underestimated the importance of a conducive learning environment. The absence of a dedicated classroom space meant we had to navigate distractions and create a secure atmosphere in a public setting. This reinforced the idea that a safe, non-judgmental space is not a luxury but a prerequisite for a topic as sensitive as sexual health, where learners need to feel comfortable asking taboo questions without fear. The learners' lack of baseline knowledge, likely due to the absence of formal sexual education in schools or the "taboo" nature of these topics at home, further highlighted the need for a foundational, trust-building approach before information can be effectively shared.
My primary strength was the clear and consistent application of the ADDIE model, integrated with PAR. This provided a robust, data-driven structure that guided every phase of development, from design to evaluation. The use of interactive strategies like the myth-and-fact game also encouraged active participation and made the learning experience learner-centered. The project's materials were intentionally simplified and contextualized for the community, which was a key factor in their success.
Despite these strengths, the one-day format proved to be a critical weakness. The ADDIE model provided a strong framework for the project's development and implementation, but it did not inherently account for the long-term reinforcement necessary for a topic of this complexity. The project's limited sustainability was not a failure of the design itself but a consequence of its confined scope. This led me to the most important realization: a successful intervention on SRH must pivot from a single "event" to a continuous "process."
Based on these learnings, I now recommend a more holistic approach that prioritizes long-term engagement and sustainability. We should strengthen partnerships with schools, youth organizations, and NGOs to ensure continuous mentorship and follow-up support. Educational sessions are best conducted in multiple sessions to enhance knowledge retention and allow for the reinforcement needed to challenge deep-seated beliefs. A strong monitoring system for youth participation is essential for tracking long-term impact. Finally, regularly updating instructional materials based on feedback and incorporating a blended approach of online and face-to-face learning can significantly improve accessibility and ensure the program remains relevant and impactful for the community's youth.