Methodology Decisions Rooted in Respect for Student Voices
Choosing Merriam's Framework for Student Agency
My selection of Sharan Merriam's qualitative case study framework represented a deliberate philosophical choice that aligned with my educational beliefs about student agency and voice. Merriam's approach emphasizes understanding how individuals make sense of their experiences rather than imposing external interpretations, directly supporting my commitment to honoring student perspectives.
I chose this framework because it acknowledges the complexity and contextual nature of student experiences while providing systematic approaches for investigation. Merriam's emphasis on thick description and heuristic understanding meant I could capture not just what students do, but how they understand their own adaptation strategies and what meaning they create from their experiences navigating repetitive learning cycles.
The bounded system approach helped me maintain focus on UPOU undergraduate students' experiences within structured learning cycles while allowing for rich exploration of individual variation and sophisticated adaptation strategies. This methodological choice reflected my educational philosophy that students are experts in their own experiences and deserving of research approaches that honor rather than diminish their agency.
Mixed-Methods Integration for Comprehensive Understanding
My decision to integrate quantitative surveys with qualitative interviews reflected my educational philosophy's emphasis on evidence-based practice. I needed both the breadth of understanding that survey data provides and the depth of insight that comes from detailed conversations with students about their lived experiences.
The survey component allowed me to document the prevalence and patterns of experiences like routine fatigue and strategic disengagement across a broader student population, providing an empirical foundation for claims about systematic rather than individual challenges. The interview component enabled deep exploration of how students understand and navigate these challenges, honoring the sophistication of their adaptation strategies.
This integration required careful attention to ensuring that quantitative findings didn't overshadow qualitative insights, and that both components contributed to understanding student agency rather than reducing complex experiences to simple metrics.
Technical Challenges and How I Navigated Them
Research Instrument Development: Learning from Expert Feedback
The development of my research instruments proved far more complex than I initially anticipated. When I submitted my first draft to my external adviser, I was humbled by the extensive feedback I received (See Appendix E.1). The comments addressed how I framed questions, the organization of sections, and critically, how participants might interpret and respond to my inquiries.
I learned that research instrument development shares important similarities with instructional design – both require careful consideration of your audience and their experiences. Ethics emerged as a crucial concern; I needed to ensure my questions weren't loaded, didn't force participants into uncomfortable positions, and genuinely respected their autonomy to share or withhold information.
The revision process taught me that developing effective research instruments requires multiple iterations and expert guidance. I couldn't simply rely on my own perspective as both researcher and fellow student; I needed external validation to ensure my instruments would actually capture what I intended to investigate rather than what I assumed was happening (See Appendix E.2 for final instruments).
Pilot Testing: Building in Time for Unexpected Discoveries
My pilot testing process involved seven UPOU undergraduate students from both BES and BAMS programs (See Appendix F). To acknowledge participants' time and effort, I included a ₱50.00 honorarium, recognizing that surveys require genuine mental engagement and shouldn't be treated as quick, throwaway activities.
The pilot testing experience reinforced the importance of building substantial buffer time for unexpected delays and discoveries. While participants provided minimal suggestions for improvement – which initially gave me confidence – I learned to probe more deeply for constructive feedback rather than accepting surface-level validation.
The process revealed practical considerations I hadn't anticipated: survey fatigue, technical functionality issues, and the importance of progress indicators for maintaining participant engagement. Most importantly, it confirmed that my theoretical frameworks (SDT and ARCS-V) effectively guided question development in ways that captured meaningful student experiences (See Appendix F.3).
Faculty Recruitment Challenges: Learning About Academic Timing
The faculty recruitment process proved particularly challenging and taught me important lessons about academic timing and communication. Despite reaching out to several potential faculty interviewees with expertise in distance education, I received virtually no responses (See Appendix G.1).
This experience taught me about the reality of academic schedules, particularly during end-of-trimester periods when faculty face grading, course preparation, and administrative demands. I learned that future research should build in earlier outreach and more flexible timing to accommodate academic rhythms rather than expecting immediate availability.
The challenge also highlighted the importance of having multiple recruitment strategies rather than relying on single approaches. While disappointing, this limitation didn't compromise the research's validity since student voice remained the central focus, aligned with my educational philosophy priorities.
MyPortal Analysis Methodology: Systematic Technical Review
Developing a systematic approach for analyzing UPOU's MyPortal learning management system required creating a framework that could evaluate technical features through theoretical lenses while maintaining objectivity about platform capabilities and limitations.
I developed a structured analysis framework using the integrated SDT-ARCS-V theoretical approach, supplemented by Flow Theory concepts to evaluate optimal engagement conditions (See Appendix H.1.3). This methodology involved systematic feature inventory and description, usage frequency assessment based on student feedback, analysis of design elements supporting or hindering psychological needs and motivation components, and identification of optimization opportunities.
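To make this framework concrete, the sketch below shows how a single feature evaluation might be recorded. The feature name, ratings, and optimization note are hypothetical placeholders for illustration, not entries from my actual inventory.

```python
# Hypothetical sketch of one entry in the SDT-ARCS-V analysis framework.
# The feature, ratings, and optimization note are illustrative placeholders.

from dataclasses import dataclass, field

@dataclass
class FeatureAnalysis:
    feature: str                                   # MyPortal feature under review
    usage: str                                     # usage frequency from student feedback
    sdt_needs: dict = field(default_factory=dict)  # autonomy, competence, relatedness
    arcs_v: dict = field(default_factory=dict)     # attention, relevance, confidence,
                                                   # satisfaction, volition
    optimization: str = ""                         # identified improvement opportunity

example = FeatureAnalysis(
    feature="discussion forum",
    usage="weekly, tied to required posts",
    sdt_needs={"autonomy": "low", "competence": "mixed", "relatedness": "high"},
    arcs_v={"attention": "low", "volition": "mixed"},
    optimization="flexible posting windows to better support autonomy",
)
print(example)
```

Recording each feature in a consistent structure like this kept the review systematic: every feature was assessed against the same psychological needs and motivation components before any conclusions were drawn.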
The analysis revealed the complex relationship students have with MyPortal: 94.12% of surveyed students find it functional, yet 97.06% desire improvements. This paradox emerged through systematic technical review that went beyond surface-level satisfaction to understand how platform design either supports or constrains student adaptation strategies within repetitive learning cycles.
Statistical Model Attempts and Data Realities
One of my most significant learning experiences involved attempting statistical modeling approaches that ultimately didn't work with my data. I initially planned more sophisticated quantitative analyses, assuming that student motivation patterns would yield clear statistical relationships suitable for predictive modeling.
The reality proved far more complex. With 34 survey participants and 261 variables, my data offered breadth but lacked the sample size needed for robust statistical modeling. More importantly, the phenomena I was investigating – routine fatigue, strategic disengagement, adaptation strategies – proved too nuanced and contextual for simple statistical prediction.
This challenge taught me that research must adapt to data realities rather than forcing data into predetermined analytical frameworks. The shift toward descriptive statistics and pattern identification proved far more productive for understanding the systematic threshold patterns (67%, 76%, 85%, 88%) that emerged as key findings.
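A minimal sketch of that shift, assuming participant counts back-calculated from the reported percentages (the raw dataset is not reproduced here): it contrasts a rule-of-thumb sample-size requirement for predictive modeling with the descriptive prevalence summaries I actually used.

```python
# Illustrative sketch only: counts are back-calculated from the reported
# percentages (e.g., 30 of 34 = 88.24%), not taken from the raw dataset.

n_participants = 34
n_variables = 261

# A common rule of thumb wants roughly 10 observations per predictor, so a
# predictive model over this many variables would need a far larger sample.
print(f"Rough sample needed: {10 * n_variables} vs. available: {n_participants}")

# Descriptive prevalence, the approach I shifted to instead:
reported_counts = {
    "routine fatigue": 30,            # 30/34 ~= 88.24%
    "finds MyPortal functional": 32,  # 32/34 ~= 94.12%
    "desires improvements": 33,       # 33/34 ~= 97.06%
}
for pattern, count in reported_counts.items():
    print(f"{pattern}: {count}/{n_participants} = {count / n_participants:.2%}")
```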
Survey Design Problems and Solutions from Pilot Testing
The pilot testing process revealed several survey design issues that required systematic solutions. Participants indicated the survey felt lengthy, even though it took only 25 minutes on average to complete. This feedback led me to add progress indicators and reorganize content flow to reduce psychological burden.
More significantly, I discovered that some questions inadvertently suggested "correct" answers rather than genuinely exploring participant experiences. The pilot feedback helped me revise language to be more open-ended and genuinely curious rather than confirmatory.
The process also revealed technical considerations: ensuring mobile compatibility, providing clear navigation, and maintaining participant engagement throughout the comprehensive survey experience. These solutions improved both completion rates and data quality for the final implementation.
Mixed-Methods Integration Challenges and Solutions
Balancing Quantitative Breadth with Qualitative Depth
Integrating survey data with interview insights required careful attention to ensuring each component contributed meaningfully rather than simply confirming predetermined conclusions. The challenge involved allowing each method to reveal different aspects of student experiences while identifying meaningful connections and contradictions.
The solution involved systematic triangulation where I analyzed each data source independently before seeking integration points. This approach revealed that quantitative patterns (like the 88.24% routine fatigue prevalence) gained deeper meaning through qualitative insights about how students understand and respond to these experiences.
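The sketch below illustrates this independent-then-integrated workflow in miniature. Only the 88.24% prevalence figure comes from the study; the theme labels are invented placeholders standing in for actual interview codes.

```python
# Minimal sketch of independent-then-integrated triangulation. Only the
# 88.24% prevalence figure comes from the study; themes are placeholders.

survey_patterns = {
    "routine fatigue": 0.8824,  # prevalence from the survey component
}
interview_themes = {
    "routine fatigue": [
        "reframes repetition as predictability",
        "budgets effort across recurring tasks",
    ],
}

# Integration point: constructs that both data sources speak to.
for construct in survey_patterns.keys() & interview_themes.keys():
    prevalence = survey_patterns[construct]
    print(f"{construct} ({prevalence:.2%} of respondents):")
    for theme in interview_themes[construct]:
        print(f"  - {theme}")
```

Keeping each source's analysis separate until this joining step meant the interview themes could complicate or contradict the survey numbers rather than merely decorate them.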
The integration process showed that student experiences are far more sophisticated than any single methodology could capture. Students might report satisfaction on surveys while describing substantial struggles in interviews, revealing the adaptation-burden paradox that became a central finding.
Timeline Adaptations and Gantt Chart Modifications
My original research timeline proved overly optimistic about several phases, requiring multiple adaptations that taught me valuable lessons about realistic project management (See Appendix D.1 for complete timeline documentation).
The faculty recruitment delays necessitated extending that phase while shortening others to maintain overall completion deadlines. Data analysis took longer than anticipated due to the richness and complexity of the qualitative interviews, which averaged 74 minutes each and required careful transcription and coding.
These adaptations taught me the importance of building substantial buffer periods into future research timelines and recognizing that thorough qualitative analysis cannot be rushed without compromising depth and accuracy of findings.
From Personal to Systematic Investigation
Topic Development Journey: Finding the Real Question
My research topic evolved significantly from initial conception to final investigation. I began with broad curiosity about student motivation in distance education but gradually narrowed focus to the specific phenomenon of repetitive learning cycles and their psychological impacts.
This evolution occurred through systematic engagement with existing literature, which revealed gaps in understanding structural versus content-based motivational challenges. Preliminary informal surveys and conversations with fellow students confirmed that many experienced patterns similar to those I observed in myself (See Appendix D.3).
The topic crystallized around understanding routine fatigue and strategic disengagement as potentially adaptive rather than problematic phenomena. This reframing represented a significant shift from deficit-based approaches toward recognizing student sophistication and agency in navigating educational challenges.
Reflective Analysis: What This Approach Revealed
Third-Person Perspective on Research Process
This approach revealed the complexity of balancing structure with authentic student voice throughout systematic investigation. The methodology demonstrated that honoring student agency requires careful attention to research design choices that either support or constrain participants' ability to share their genuine experiences.
The investigation showed that mixed-methods approaches can effectively capture both systematic patterns and individual variation when theoretical frameworks guide integration rather than simply combining different data types. The use of SDT and ARCS-V provided coherent lenses for understanding diverse student responses to repetitive learning cycles.
Most importantly, the research process demonstrated that student-researcher perspectives can contribute unique insights to educational investigation precisely because of, rather than despite, their insider position. The combination of personal experience and systematic methodology produced findings that might never emerge from purely external research approaches.
Integration of Multiple Data Sources Enhanced Understanding
The systematic triangulation across surveys, interviews, and technical platform analysis revealed relationships and patterns that no single method could have captured independently. The quantitative documentation of routine fatigue prevalence (88.24%) gained meaning through qualitative exploration of how students understand and respond to this experience.
The platform analysis provided crucial context for understanding why certain adaptation strategies emerge and persist among students. MyPortal's inflexible architectural design helps explain why students develop sophisticated workarounds rather than simply adapting to intended usage patterns.
This integration demonstrated that comprehensive understanding of educational phenomena requires multiple perspectives and methodological approaches that complement rather than compete with each other.
Educational Philosophy Validation Through Process
The research process validated my educational philosophy's emphasis on student agency by demonstrating that students are indeed sophisticated agents who develop complex strategies for navigating educational challenges. Every phase of investigation revealed evidence of student capacity for self-regulation, adaptation, and meaning-making within institutional constraints.
The methodology choices consistently supported rather than undermined student voice, from theoretical framework selection through data collection and analysis approaches. This philosophical consistency enabled discoveries that recognized rather than pathologized student adaptation strategies.
The successful translation of research findings into the Student Motivation Guide demonstrated how systematic investigation guided by student-centered educational philosophy can produce immediate practical benefits for student communities while contributing to broader theoretical understanding.