The purpose of this mixed-methods study was to hear the voices of urban educators in the hope that those voices could identify the strategies that best support urban students on their path toward high school graduation and entrance into post-secondary programs. As seen in the review of literature, there have been many studies on urban student matriculation from high school to post-secondary education, and each claimed that its strategy best supported urban students; however, the rates at which urban students graduate from high school and enter post-secondary programs remain well below state averages. Therefore, this section discusses why I chose a mixed-methods approach and a parallel design, the instrument used to collect urban educators' voices, the process by which responses were collected and analyzed, and the validity and reliability of the study.
This study utilized a mixed-methods research design. During phase one of my research, I determined my research questions. After deciding on them, I concluded that each could be answered using a quantitative approach, but I wanted to validate the data I collected by asking participants to give their opinions about strategies implemented in their own buildings. My first research question, “What factors are urban education systems using to create equity within their system?” could be answered with yes, no, and maybe questions. My next two research questions, “What factors lead to increased graduation rates and entry into post-secondary education?” and “What factors have little impact on graduation rates and entry into post-secondary education?” required participants to give their opinions using a Likert scale (1 = not effective to 5 = very effective). In addition, I asked participants to respond to two short-answer questions about the strategies their buildings used to support high school graduation and entrance into post-secondary programs, and to offer their opinions about the effectiveness of those strategies. I employed a parallel design for my data collection and analysis: both sets of data were collected on the same survey but kept separate until they were converged for the study write-up.
During phase two of my study, I created a survey, based on one developed by Pietrzyk et al. (2019), designed to answer my research questions. The title of my survey was “What strategies best support urban students on their pathway toward graduation and beyond?” The survey consisted of 49 questions. Question one asked for consent to participate in the study: participants were shown the informed consent and then asked to click either “I consent to participate in this research study” or “I do not consent to participate in this research study.” Those who consented were able to continue with the rest of the survey. The survey took 10-20 minutes to complete. There was no anticipated risk beyond what participants might encounter in everyday life. Participants were asked not to add any personal information apart from their opinions about the effectiveness of the support given to students. Each response was assigned an identification number, and in the discussion of the collected data no identifiable characteristics, such as name or title, will be shared. As a further safeguard of participants' privacy, I did not collect any personal data (i.e., names, emails, building names, etc.).
The survey from Pietrzyk et al. (2019) served as the starting point for my own. The Pietrzyk survey, titled “Future and career plans before high school graduation (ZuBAb): Background, research questions and research design,” focused on students' opinions about their preparedness for their future careers and consisted of thirteen sections. I used it as a model because it employed a variety of question types: open-ended questions, yes/no questions, and Likert-scaled questions. I decided that this type of survey would allow me to ask the questions needed to gather the information that would answer my research questions.
Participants were informed that taking part in this project would help researchers better understand how to support urban students on their pathway to graduation and entrance into post-secondary programs. Taking part was entirely at their discretion, and no one would hold it against them if they decided not to participate. If they did take part, they could stop at any time without penalty. In addition, they could ask to have their data withdrawn from the study after the research had been conducted.
The next section of the survey asked for information about the type of high school in which participants worked, their role in the building, how many years they had worked in urban education, and how many years they had worked in their current position. These questions were asked to reveal any correlations between participants' responses and their years of experience, building type, and position. Questions six and seven asked for participants' observations about the strategies their buildings use to support students on the path toward graduation and entrance into post-secondary education, along with their opinions about how well those strategies supported their students. Questions 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, and 48 asked about the strategies that researchers over the past thirty years believe will improve graduation rates and entrance into post-secondary programs. These strategies were pulled directly from the research discussed in the review of literature. In total, 21 studies were highlighted in the survey, from question 8, “Does your school use ‘college talk,’ a system of intentional conversations with students about college?” to question 48, “Does your school offer industry certifications?”
Finally, questions 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, and 49 asked participants to give their opinions about each strategy and whether they thought it would support urban students on their path toward graduation and entrance into post-secondary programs. These questions ranged from question 9, “How effective do you think ‘college talk’ would be with urban students to help them graduate and move on to post-secondary programs?” to question 49, “Do you think offering the opportunity to attain an industry certification would help urban students graduate and move on to post-secondary programs?”
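The alternating numbering described above pairs each even-numbered “use” item with the odd-numbered “effectiveness” item that follows it. A minimal Python sketch (the variable names are mine, not part of the survey instrument) makes the structure explicit:

```python
# Even-numbered items 8-48 ask whether a school uses a strategy;
# the odd-numbered items 9-49 that follow ask how effective the
# respondent believes that strategy would be.
use_items = list(range(8, 49, 2))            # 8, 10, ..., 48
effectiveness_items = list(range(9, 50, 2))  # 9, 11, ..., 49

# Each of the 21 strategies is therefore a (use, effectiveness) pair,
# e.g. (8, 9) for "college talk" and (48, 49) for industry certifications.
strategy_pairs = list(zip(use_items, effectiveness_items))
```

This pairing is what allows the two data streams of the parallel design to be kept separate during analysis and converged later, strategy by strategy.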
During the third phase of the study, I published the survey and collected responses. The survey was created in Microsoft Forms because of the ease of collecting responses. I decided to collect responses through various public websites, namely Facebook, Instagram, Reddit, and the American School Counselor Association (ASCA) Open Forum. I created a link for the survey and posted it in urban educator groups on Facebook and Instagram, in urban educator forums on Reddit, and in the Open Forum of the monthly newsletter produced by ASCA. I continued posting on these sites over four months and, in addition, reached out to urban educator colleagues.
I wanted to collect more than 30 responses for correlational research. After four months of collecting responses, I had 32 participants. I asked that participants be urban educators working at the secondary level and invited principals, assistant principals, school counselors, family and community specialists, graduation coaches, social workers, and teachers to respond. Of the 32 respondents, 3 were principals, 5 were assistant principals, 7 were school counselors, 5 were family and community specialists, 0 were graduation coaches, 3 were social workers, and 9 were teachers. A plurality, 47%, had more than 10 years of experience in urban education; 28% had 6-10 years; 19% had 3-5 years; and 6% had 0-2 years. Years in their current position varied: 31% had 6-10 years, 28% had more than 10 years, 22% had 3-5 years, and 19% had 0-2 years. It could be concluded that some participants had moved from one position to another during their careers.
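The role percentages implied by these counts can be reproduced with a short sketch. The counts below are the ones reported above; the rounding to whole percents is my own convention:

```python
# Respondent role counts as reported in the text (N = 32).
roles = {
    "principal": 3, "assistant principal": 5, "school counselor": 7,
    "family and community specialist": 5, "graduation coach": 0,
    "social worker": 3, "teacher": 9,
}
n = sum(roles.values())  # should total 32 participants

# Share of each role, rounded to the nearest whole percent.
shares = {role: round(100 * count / n) for role, count in roles.items()}
```

The same arithmetic applied to the experience bands reproduces the 47/28/19/6 split reported for years in urban education.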
This section analyzes the collected data. The first discussion is of the quantitative data, since this made up the majority of the data collected on the survey. In addition to examining the data from each question, I used JASP software, version 0.18.1 (JASP Team, 2023), to complete a descriptive statistical analysis of the reliability and validity of the data. Next, the discussion turns to the qualitative data. I discuss the predominant themes present in the responses as indicated in Microsoft Forms and then analyze those themes using NVivo software to identify trends.
Quantitative Data Analysis
The following tables provide a question-by-question analysis of the responses given. Table 2 gives the response data for the questions about the use of the strategy in urban high schools and Table 3 gives response data for the questions about the effectiveness of each strategy according to the urban educator.
The following tables (Tables 4-11) provide descriptive statistical analyses for each question. The questions are again grouped by use in urban high schools and by urban educators' opinions about the effectiveness of each strategy. Tables 4-7 give descriptive statistics for the questions on use of each strategy, and Tables 8-11 give descriptive statistics on the perceived effectiveness of each strategy.
Descriptive statistics were computed for the variables of interest, namely Questions 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, and 48, based on the cleaned dataset (Tables 4-7). The dataset consisted of 32 participants (N = 32). The mean responses ranged from 1.063 (Question 18) to 2.031 (Questions 26 and 38). Although each of these questions shows some skewness, the values fall within an acceptable range, indicating that the responses stayed close to a normal distribution and that no responses needed to be excluded. Therefore, the responses from this question set are all within the acceptable range.
Descriptive statistics were computed for the variables of interest, namely Questions 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, and 49, based on the cleaned dataset (Tables 8-11). The dataset consisted of 32 participants (N = 32). The mean responses ranged from 1.93 (Question 29) to 4.69 (Question 41). Although each of these questions shows some skewness, the values fall within an acceptable range, indicating that the responses stayed close to a normal distribution and that no responses needed to be excluded. Therefore, the responses from this question set are all within the acceptable range.
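The screening described above was run in JASP, but the same check can be illustrated in Python. The responses below are hypothetical, and the |skewness| < 2 cutoff is a common rule of thumb I am assuming here, not a threshold stated in the study:

```python
import numpy as np
from scipy.stats import skew

# Hypothetical 5-point Likert responses for one effectiveness item;
# the study's actual data were analyzed in JASP.
responses = np.array([5, 4, 4, 5, 3, 4, 5, 4, 3, 5, 4, 4])

mean = responses.mean()
skewness = skew(responses)  # Fisher-Pearson skewness coefficient

# Rule of thumb (an assumption, not from the study): treat
# |skewness| < 2 as close enough to normal to retain the item.
within_range = abs(skewness) < 2
```

In the study itself, every item in both question sets passed this kind of range check, so no responses were dropped at this stage.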
Even though each question falls within an acceptable range, there is a set of five questions (16, 18, 37, 38, and 41) that needed to be examined further. This section presents the results of the normality tests conducted on the selected survey items. Several tests were employed to assess normality: examination of skewness and kurtosis, the Shapiro-Wilk test, inspection of histograms and distribution curves, box plots, and Q-Q plots. Each of these tests provides insight into the distributional properties of the data and aids in determining whether parametric statistical analyses are appropriate for subsequent examinations. JASP software, version 0.18.1 (JASP Team, 2023), was used to conduct the normality tests and analyze the data.
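The Shapiro-Wilk test and the skewness/kurtosis checks named above can be sketched with scipy. The data here are randomly generated stand-ins for one of the flagged items; the study's actual tests were run in JASP:

```python
import numpy as np
from scipy.stats import shapiro, skew, kurtosis

# Hypothetical responses for one flagged item: 32 Likert values, 1-5.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=32)

# Shapiro-Wilk: the null hypothesis is that the data are normal.
stat, p_value = shapiro(responses)

# p < .05 would suggest the responses deviate from normality,
# pointing toward non-parametric follow-up tests for that item.
print(f"W = {stat:.3f}, p = {p_value:.3f}, "
      f"skew = {skew(responses):.2f}, kurtosis = {kurtosis(responses):.2f}")
```

Histograms, box plots, and Q-Q plots then give the visual counterpart to these numeric tests for the same five items.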
In this section, the focus is on examining the reliability and validity of the survey questions (see Tables 2 and 3). The survey aims to determine the use and effectiveness of strategies, identified in the research on urban education, that help support urban students on their path toward graduation and entrance into post-secondary programs. Reliability is assessed using Cronbach's alpha, while validity is evaluated through Confirmatory Factor Analysis (CFA). This section presents the findings of both analyses.
Reliability Analysis
To assess the internal consistency of the survey, Cronbach's alpha was calculated for the Likert items in Table 3 and the scaled items in Table 2 (yes = 1, no = 2, maybe = 3). The questions representing urban educators' opinions about the various strategies found in the research were measured on a 5-point Likert scale. Table 13 presents the results of this reliability analysis: the calculation yielded a coefficient of α = .353, indicating relatively low internal consistency and suggesting that these items may not be effectively measuring a single underlying construct. The questions representing urban educators' observations of use in their buildings were measured on the 3-point scale noted above. Table 14 presents the results of that reliability analysis: the calculation yielded α = .301, again indicating relatively low internal consistency. For both item sets, however, it must be assumed that participants' similar experiences may introduce internal bias that affects reliability. In this situation, that bias may point to a particular strategy being more appropriate for supporting urban students than another.
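Cronbach's alpha has a simple closed form: with k items, it is (k / (k - 1)) times one minus the ratio of summed item variances to the variance of the total score. A minimal sketch, using hypothetical scores rather than the study's data (which were analyzed in JASP):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert scores: 6 respondents on 3 items.
scores = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 4, 5],
    [2, 2, 3],
    [4, 4, 4],
    [1, 2, 1],
])
alpha = cronbach_alpha(scores)
```

Items that rise and fall together, as in this toy matrix, produce a high alpha; the study's low values of .353 and .301 indicate the opposite pattern, with items varying largely independently of one another.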
Validity Analysis
To further investigate the validity of the survey, Confirmatory Factor Analysis (CFA) was conducted using the indices presented in Tables 15, 16, 17, and 18. These indices provide essential information about the fit of the proposed model and aid in evaluating its validity. The results of the CFA indicate that the initial model (see Table 15) has a χ² value of 4.120 with 6 degrees of freedom (df), resulting in a χ²/df ratio of 0.238. Additionally, the Comparative Fit Index (CFI) and Tucker-Lewis Index (TLI) (see Table 18) were found to be 0.000 and -2.215, respectively. For the questions on the effectiveness of the strategies, the CFA indicates that the initial model (see Table 16) has a χ² value of 56.502 with 6 degrees of freedom (df), with p < .001. Additionally, the Comparative Fit Index (CFI) and Tucker-Lewis Index (TLI) (see Table 17) were found to be 0.000 and -2.215, respectively (JASP version 0.18.1; JASP Team, 2023). These indices suggest that the initial model may not adequately fit the data, raising concerns about validity. However, considering the earlier assumptions about the participants, these results can be attributed to their similar experiences.
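The χ²/df ratio and the model p-value can both be derived from the reported χ² and degrees of freedom. A small sketch using scipy and the second model's reported values (the helper function name is mine):

```python
from scipy.stats import chi2

def chi_square_fit(chi2_value: float, df: int) -> tuple[float, float]:
    """Return the chi2/df ratio and model p-value for a CFA fit."""
    ratio = chi2_value / df
    p_value = chi2.sf(chi2_value, df)  # survival function: 1 - CDF
    return ratio, p_value

# Reported values for the effectiveness-question model:
# chi2 = 56.502 with df = 6 (Table 16).
ratio, p = chi_square_fit(56.502, 6)
# A ratio well above ~3 and a near-zero p-value both signal poor fit,
# consistent with the CFI and TLI values reported in the text.
```

For comparison, a ratio near or below 1, as with the first model's χ² of 4.120 on 6 df, would ordinarily indicate an acceptable χ²-based fit on that criterion alone.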
The following figures come from Microsoft Forms and give a quick snapshot of the responses to the short-answer questions from the survey (Microsoft Forms, Microsoft 365; Microsoft Corporation, 2025).
In Figure 9 we can see the word cloud for short-answer question six. Question six asked, “What strategies does your school utilize to support students as they move toward graduation?” At the heart of every response is the word students. Some of the most common phrases participants used were “students keep on track,” “school counselors,” “students and the advisor,” “student support,” and “students are on target.” Even at a cursory glance, we can see that participants wanted to highlight the ways their schools support students as they move toward graduation.
We see a similar trend in the word cloud for question seven, shown in Figure 10. Question seven asked, “What strategies does your school utilize to support students as they decide on post-secondary options?” At the center of the responses we again see the word students, surrounded by phrases such as “college visits,” “team with students,” “Schoolinks,” and “option for each student.” From this word cloud we see that participants recognized that programs such as Schoolinks, along with college visits, can help support students as they make decisions about post-secondary options.
Table 19 presents the analysis of data from question six of the survey. Using NVivo (version 14; Lumivero, 2023), I analyzed the data from these questions. First, I converted the survey responses into a text document. Next, I read through the responses and coded them for themes. After completing the coding, I examined the prevalence of each theme. As seen in the table, a total of seven themes were found in the responses to question six, with school counselor interactions the most prevalent, alongside mentoring. For Table 20 a similar process was used: all responses were converted into a text document and then coded for themes. As the table shows, a total of 10 themes were present in the responses to question seven. The most prevalent theme, very similar to question six, was school counselor interactions, followed by use of the online platform Schoolinks.
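The prevalence-tallying step described above can be sketched in Python. The coding itself was done in NVivo; the responses below are hypothetical, and the theme labels other than "school counselor interactions" and "mentoring" are invented for illustration:

```python
from collections import Counter

# Hypothetical coded responses to question six; each response
# may carry several codes, as in the NVivo coding described above.
coded_responses = [
    ["school counselor interactions", "mentoring"],
    ["mentoring", "credit recovery"],          # "credit recovery": invented
    ["school counselor interactions"],
    ["tutoring", "school counselor interactions"],  # "tutoring": invented
]

# Tally how often each theme appears across all responses.
theme_counts = Counter(code for response in coded_responses
                       for code in response)
most_prevalent = theme_counts.most_common(1)[0][0]
```

Ranking the tally, as `most_common` does here, is the programmatic equivalent of reading theme prevalence off Tables 19 and 20.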