Mixed-Methods Methodology 

Please refer to the Research Key Terms page for definitions of research terms mentioned on this page.

Explanatory Mixed Methods Design Process

Researchers used a mixed methods explanatory sequential design to address the study’s research questions. In an explanatory sequential design, quantitative data collected in the first phase inform the qualitative data collection in the second phase (DeCuir-Gunby & Schutz, 2017). Quantitative data in this study were collected via a study-specific online survey of 326 family members. The survey data informed the questions for the phase-two focus groups. After analyzing the survey and focus group data, researchers combined the two data sources for interpretation using an interaction matrix.

Figure: Diagram of the explanatory sequential mixed-methods design, showing the sequence of quantitative data collection, quantitative data analysis, qualitative data collection, qualitative data analysis, and mixed-methods synthesis.

Quantitative Survey Methodology Details

Researchers collected quantitative data using a study-specific web-based survey. Survey questions elicited information from family members of K-12 students across the US who have an IEP or 504 plan and are enrolled in a public school. In addition to demographic questions, the survey asked how education leaders and classroom educators plan for and use information, educational technologies, and assistive technologies. Questions centered on technology strategy, infrastructure, and use for teaching, learning, and assessment. Researchers analyzed the survey data in Excel, calculating response frequencies and performing chi-square analyses comparing used and preferred communication methods.

Survey Participant Recruitment

Participants were recruited through nonprofit organizations that serve families of K-12 students with disabilities, such as the Center for Parent Information and Resources (CPIR), the National Parent Teacher Association (PTA), and Special Olympics. These organizations help families engage with their children's education and promote family involvement in schools. Researchers shared study details with these organizations, which distributed the information through newsletters, social media, and paper-based channels to reach families without internet access.

Voluntary response sampling was used to allow any interested family member of a public K-12 school student with an IEP or 504 plan to participate in the survey. It is important to note that this sampling method has limitations, as it relies on individuals who are aware of the study and have internet access. To maximize participation, information was provided on using library computers and Wi-Fi services, and the survey was designed to be completed on mobile devices. Both English and Spanish versions of the survey and recruitment materials were made available to accommodate participants' language preferences.

Survey Data Collection

Prospective survey participants received recruitment information containing a survey link from a nonprofit organization serving families of K-12 students with disabilities. Upon accessing the Qualtrics survey, participants encountered study information and screening questions. Those who met the screening criteria proceeded to the informed consent form. After providing consent, participants completed the demographic questions and the remainder of the survey. At the end of the survey, participants had the option to complete a focus group interest form and to enter a drawing for an Amazon gift card.

Survey Data Analysis

A total of 431 participants consented and completed at least a portion of the survey. Before analysis, researchers removed 82 incomplete responses (less than 60% complete) and 23 fraudulent responses. The incomplete responses lacked the completed non-demographic sections necessary for analysis. Fraudulent responses were identified using the Qualtrics reCAPTCHA and Relevant ID metadata fields: responses with a reCAPTCHA score below 0.5 or meeting the Relevant ID fraud criteria were excluded.
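The study performed this screening with Qualtrics metadata and Excel; the Python sketch below is only a minimal illustration of the same filtering logic, using hypothetical column names (Progress, Q_RecaptchaScore, FraudFlag) in place of the actual export fields.

```python
import pandas as pd

# Hypothetical column names; the actual Qualtrics export fields may differ.
responses = pd.read_csv("survey_export.csv")

# Remove responses that are less than 60% complete.
responses = responses[responses["Progress"] >= 60]

# Remove likely-fraudulent responses: reCAPTCHA score below 0.5 or
# flagged by the Relevant ID fraud criteria (represented here as a flag).
responses = responses[(responses["Q_RecaptchaScore"] >= 0.5)
                      & (responses["FraudFlag"] == 0)]

print(f"{len(responses)} responses retained for analysis")
```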

The remaining 326 survey responses were statistically analyzed in Excel. Researchers created frequency tables and graphs for respondent and student demographic data. Pearson's chi-squared test was used to determine whether student demographics differed significantly from the population demographics reported in the 2017-18 US Department of Education Office for Civil Rights data collection (US Department of Education OCR, 2018). Frequency tables and graphs were also generated for the communication method questions, and chi-square analysis was used to assess differences between the communication methods used by education leaders and classroom educators and those preferred by family members.
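For readers who want to reproduce this kind of comparison, the sketch below runs a chi-square goodness-of-fit test in Python with scipy; the counts and population proportions are placeholders, not the study's data, which were analyzed in Excel.

```python
from scipy.stats import chisquare

# Placeholder counts and proportions, not the study's data.
observed = [214, 112]             # e.g., sample counts for two demographic categories
population_props = [0.66, 0.34]   # proportions reported for the population
expected = [p * sum(observed) for p in population_props]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
```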

Survey Instrument

The survey starts with nine family member (respondent) demographic questions and 12 student demographic questions. The survey then presents three sections with questions about family engagement at the classroom, school, and district levels. Each of the three sections contains a group of questions for CITES framework categories that apply to that level (see example below). Response options are based on communication methods identified in the literature review. The classroom family engagement section contains questions about teaching, learning, and classroom assessments. The school and district-level family engagement sections include questions about leadership, infrastructure, and large-scale assessments. Survey question response options were either single-select or multi-select. 

Survey Instrument Validity and Reliability

To ensure content validity, experts provided feedback about whether survey questions covered all aspects of the first three study research questions, item wording, and accessibility for the intended audience. Researchers revised the initial draft survey based on feedback received. 

Researchers addressed internal validity by piloting the draft survey, which included the draft survey questions and two open-ended feedback questions. Twelve pilot participants provided input on question clarity, survey duration, display logic, and response options, and the survey was revised based on their feedback.

Researchers recruited participants from all US states to support external validity. Even so, external validity is limited by demographic disparities between survey respondents and the overall population of families of students with an IEP or 504 plan (US Department of Education OCR, 2018). Only gender was statistically similar to the overall population (p = 0.16); all other demographic comparisons yielded p-values below 0.001, indicating that respondent demographics differed significantly from the population.

Researchers assessed the survey's internal consistency by calculating the average inter-item correlation for the questions about preferred communication methods. The table shows the correlations for each method at the classroom, school, and district levels. The final row displays the overall average correlations: 0.44 for classroom educators, 0.52 for school leaders, and 0.57 for district leaders. The classroom value falls within the recommended range of 0.15 to 0.50, and the school and district values slightly exceed it, indicating strong inter-item relatedness and acceptable internal consistency (Clark & Watson, 1995).
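The computation is straightforward to reproduce; the sketch below shows one way to calculate an average inter-item correlation in Python with pandas, using randomly generated placeholder responses rather than the study's item data.

```python
import numpy as np
import pandas as pd

# Placeholder 0/1 item responses for six hypothetical communication-method items.
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(0, 2, size=(326, 6)),
                     columns=[f"item_{i}" for i in range(1, 7)])

corr = items.corr()                            # item-by-item correlation matrix
off_diagonal = ~np.eye(len(corr), dtype=bool)  # exclude self-correlations of 1.0
avg_inter_item_corr = corr.values[off_diagonal].mean()
print(f"average inter-item correlation = {avg_inter_item_corr:.2f}")
```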

Qualitative Focus Group Methodology Details

Participants who completed the survey had the opportunity to volunteer to participate in a focus group by clicking an interest form link at the end of the survey. Researchers conducted five focus group meetings for the study, four in English and one in Spanish. Due to the limited number of focus group volunteers, researchers assigned participants to focus groups based on their availability specified in a Qualtrics survey.

Focus Group Data Collection

Audio-recorded Zoom sessions were used for all focus groups, enabling participants from different locations in the United States to join without the need for travel. Participants provided electronic consent before the meetings. A list of pre-defined focus group questions was used to guide discussion during each session. Each focus group had two facilitators, with one managing the discussion and asking questions while the other handled technical aspects and took notes. This approach allowed for multiple perspectives during note-taking (Morgan & Hoffman, 2018). Transcription of the audio recordings was done by a third-party company, and anonymized transcripts were sent to focus group participants for member checking to ensure accuracy and relevance.

Focus Group Data Analysis

After completing member checking, researchers used Excel to organize the content of the focus group transcripts into tables. The analysis began with a read-through of the transcripts. To facilitate integrated data analysis, participant statements were deductively coded into eight themes: (1) Leadership, (2) Infrastructure, (3) Large-Scale Assessment Accessibility and Accommodations, (4) Large-Scale Assessment Data Use, (5) Teaching, (6) Learning, (7) Classroom Assessment Accessibility and Accommodations, and (8) Classroom Assessment Data Use. Following deductive coding, researchers used inductive qualitative coding to derive meaning from the data and address the research questions. Through open coding, researchers identified initial codes within participant statements. Axial coding was then performed to review all coded data elements and establish logical connections to the research questions. Lastly, selective coding was conducted to identify the final subthemes for answering the research questions. This coding process supported a thorough analysis and interpretation of the data in line with qualitative research principles (Merriam & Tisdell, 2017).
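As a rough illustration of how coded statements can be organized into tables for this kind of analysis, the sketch below builds a small theme-by-subtheme summary in Python with pandas; the subthemes and statements shown are hypothetical, and the study maintained its coding tables in Excel.

```python
import pandas as pd

# Hypothetical coded statements; themes come from the deductive codebook,
# while the subthemes and statements here are placeholders.
coded = pd.DataFrame({
    "theme": ["Leadership", "Teaching", "Teaching", "Infrastructure"],
    "subtheme": ["communication preferences", "assistive technology use",
                 "assistive technology use", "device access"],
    "statement": ["...", "...", "...", "..."],
})

# Count coded statements per deductive theme and emergent subtheme.
summary = coded.groupby(["theme", "subtheme"]).size().reset_index(name="statements")
print(summary.to_string(index=False))
```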

Focus Group Data Validity and Reliability

Researchers used multiple data collection methods, including surveys and focus groups, to enhance the study's internal validity through triangulation. The focus groups discussed family engagement by education leaders and classroom educators until data saturation was achieved. Member checking was employed by sharing a summary of focus group findings with participants for feedback, which informed the integration of the focus group and survey data in the final analysis. Researchers also openly acknowledged their positionality, communicating its impact on the study and the research process to both colleagues and participants (Merriam & Tisdell, 2017).

In qualitative research, reliability refers to the consistency between the findings and the data. To ensure intercoder reliability, two researchers used a predefined coding scheme and iteratively coded the focus group data until reaching an 80% agreement level. 
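A simple percent-agreement check of this kind can be expressed in a few lines; the sketch below assumes each researcher assigned one theme per statement, which simplifies the study's actual coding scheme.

```python
# Hypothetical theme assignments from two coders for the same five statements.
coder_a = ["Leadership", "Teaching", "Learning", "Teaching", "Infrastructure"]
coder_b = ["Leadership", "Teaching", "Teaching", "Teaching", "Infrastructure"]

matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(f"intercoder agreement = {agreement:.0%}")  # recode and compare until >= 80%
```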

External validity refers to the study’s generalizability to other situations (Merriam & Tisdell, 2017). Although focus group participants represented different states and demographic characteristics, the number and demographic make-up of participants are not representative enough to conclude that the findings reflect the opinions of the general population of families of students with disabilities (Morgan & Hoffman, 2018).

Mixed-Methods Data Synthesis

Once all quantitative and qualitative data were collected and analyzed, researchers collectively analyzed survey and focus group findings using result-based integration (Thierbach et al., 2020). They created an interaction matrix using a table-based comparison of findings for each focus group theme.
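The interaction matrix is essentially a theme-by-source comparison table; the sketch below shows one possible layout in Python with pandas, with placeholder entries rather than the study's actual findings.

```python
import pandas as pd

# Illustrative layout only; the study's matrix paired the actual survey
# statistics and focus group subthemes for each theme.
matrix = pd.DataFrame({
    "theme": ["Leadership", "Infrastructure", "Teaching"],
    "survey_finding": ["...", "...", "..."],
    "focus_group_finding": ["...", "...", "..."],
    "integrated_interpretation": ["...", "...", "..."],
})
print(matrix.to_string(index=False))
```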

References