Key Components:
Students: This section focuses on student advising, support services, exit processes, and two-year post-graduation data. This may include information such as student performance evaluations, Praxis and EdTPA results, and dispositions.
Processes and Procedures: This section focuses on the college's manuals, program assessment, university reporting, and QAS reporting.
Faculty: This section focuses on faculty development, surveys, course evaluations, and performance evaluations.
Curriculum: This section covers reviewing syllabi, curriculum maps, course evaluations, and performance evaluations.
Partnerships: This section focuses on collaborations with external organizations, such as the Tennessee Department of Education (TDOE), and internal units like the Office of Teaching and Learning (OTL). This includes information about advisory councils and focus groups.
Support: This section focuses on the college's support services for students, such as tutoring, counseling, and academic advising.
Overall, the QAS is a comprehensive system that aims to ensure the quality of the College of Education's programs and processes. It focuses on student outcomes, faculty development, curriculum alignment, and partnerships with external organizations. The system is data-driven, with regular evaluations and assessments to inform decision-making.
Before 2018, the EPP had a quality assurance system (QAS); however, it was not operable due to numerous organizational changes and misalignment with the EPP. In 2022, the pilot QAS was introduced using baseline data from 2019-2022. The EPP implemented quality assurance facilitation, management, and process review to begin the revision. The QAS has been revised to include the areas of Students, Processes and Procedures, Faculty, Curriculum, Support, and Partnerships for forward progression (Quality Assurance Task Force, 2010; OAA, 2021) and aligns with the University Program Assessment report. All program units under the College collect data using this six-area framework to ensure the continuous improvement of our programs. The framework addresses all aspects of the educational experience, from the student journey to faculty development and program effectiveness. Using it, the EPP can continuously assess its programs' effectiveness, identify areas for improvement, and implement data-driven solutions to ensure our graduates are well-prepared and successful educators.
The 2020-2023 audit identified improvements needed to align processes and support their consistency across the College's programs, using the predefined areas and current metrics. Those areas include the following:
Students
The data areas are a mix of what is currently available and what was created during this QAS pilot phase.
This area consists of Advising, Support, Exit, Two-Year Post-Graduation Surveys, Performance Evaluations, Praxis, TEAM, EdTPA, COMPS, Dispositions, Course-Level Standard Performance Evaluations, Recruitment, Retention, and Graduation Rates.
Processes and Procedures
The data represent the implementation of the various processes, the scores received on submitted reports, and the outcomes of those reports during this pilot phase.
College of Education and Unit Manuals, Program Assessment, University Reporting, QAS Reporting
Faculty
The data represented include student feedback and evaluations collected during this pilot phase.
Professional Development, Surveys, Course Evaluations, Performance Evaluations
Curriculum
Review of Syllabi, Curriculum Maps, Course Evaluations, Performance Evaluations
Support
GRACIE-OTL, Exit and TESS Exit Surveys, Equity Council, Student Advisory Council
Partnerships
Focus Groups, TDOE Surveys (EPP Only), Advisory Councils
The provider has developed, implemented, and modified, as needed, a functioning quality assurance system that ensures a sustainable process to document operational effectiveness. The provider documents how data enter the system, how data are reported and used in decision-making, and how the outcomes of those decisions inform programmatic improvement.
QAS Handbook - Supports changes based on outcomes and explains how to use the QAS and its procedures to promote consistency.
Data Management Schedule - Covers all areas of the Quality Assurance System for each department to consider. It is a starting point for assessing data needs and creating the processes, procedures, and goals required to maintain uniform, transparent data collection that can answer to oversight entities at the university, local, state, or national level.
Approval Checklist - Designed to support and align program standards.
QAS Workbook - Developed as a guide to support programs' deeper analysis of their data.
Calendar (Planner) - Managed through Microsoft 365 as a program management tool and calendar to track products.
Continuous Improvement Tracking - Supports tracking revisions and outcomes in QAS processes and procedures.
The Educator Preparation Program (EPP) council functions as the central mechanism for quality assurance, operating within a collaborative, data-driven improvement framework. Composed of certification faculty coordinators, master clinicians, and key partners, the council ensures a holistic evaluation of the program's efficacy. It is a critical forum where comprehensive data analysis and report review inform strategic decision-making.
The council’s primary function is systematically examining program data, including candidate performance metrics, clinical placement evaluations, and stakeholder feedback. This rigorous analysis facilitates the identification of programmatic strengths and areas necessitating targeted improvement. Through meticulous review of assessment, accreditation, and program evaluation reports, the council ensures adherence to established quality benchmarks and regulatory mandates.
Furthermore, the council fosters a collaborative environment wherein diverse perspectives converge to inform program enhancement. Stakeholders engage in constructive dialogue, sharing insights derived from their respective roles. This exchange enables the council to develop and implement evidence-based quality improvement initiatives, such as curriculum revisions, assessment modifications, or partnership enhancements. By integrating partner feedback, the council ensures that the EPP remains responsive to the evolving needs of educational settings.
The EPP council is a dynamic entity, driving continuous improvement through data-informed decision-making and collaborative engagement. This systematic approach ensures that the program consistently delivers high-quality educator preparation, ultimately contributing to the success of future educators and the communities they serve.
The team utilized an external reviewer to verify that the QAS is improving practices within the EPP. Auditing within a Quality Assurance System (QAS) is a strategic process that goes beyond simple compliance checks: it is a systematic evaluation of whether the QAS is effective, not merely compliant. Auditors objectively examine processes, identify areas for improvement, and provide concrete evidence of performance. This proactive approach mitigates risks, drives continuous improvement, and fosters a culture of quality, ensuring the organization meets and exceeds expectations.
Each April, the Educator Preparation Program (EPP) convenes a college-wide Quality Assurance System (QAS) meeting, a crucial annual exercise in self-reflection and strategic alignment. This gathering is a focused forum where faculty, clinicians, and partners meticulously examine the program's practices against the established QAS framework. Through collaborative dialogue and data analysis, they assess how effectively the EPP meets its quality objectives. This annual review ensures continuous improvement, fostering a culture of accountability and ensuring the program consistently delivers high-quality educator preparation.
College-Wide Meeting and Agenda: The annual meeting for all programs was instituted to share the responsibility of collection, analysis, and continuous improvement.
College-Wide Meeting Scheduled for April 18: Agenda [draft]
The Assessment and Accreditation Unit (AAU) at Tennessee State University is conducting a longitudinal study to evaluate the effectiveness of the Quality Assurance System (QAS) in improving student and college outcomes. The study will collect data over several years to track changes in key metrics and assess the QAS's long-term impact.
Study Design and Data Collection
The study will utilize a mixed-methods design, combining quantitative and qualitative data to understand the QAS's impact comprehensively. The following data sources will be used from the QAS areas:
Students, Processes and Procedures, Faculty, Curriculum, Partnerships, and Support, as described under Key Components above.
The findings of this longitudinal study will provide valuable insights into the QAS's effectiveness and its impact on student and college outcomes. The study is expected to demonstrate the QAS's role in improving the quality of education. The results will also inform future refinements and adaptations of the QAS to ensure its continued effectiveness in meeting the evolving needs of students and the educational landscape.
Note. The longitudinal study directly supports the fulfillment of CAEP standards by providing a comprehensive framework for assessing and enhancing program quality. By collecting data over several years, the study allows for the identification of trends, the evaluation of interventions, and the continuous improvement of the educational experience.
Over the past few years, the EPP has made significant strides in refining and enhancing its Quality Assurance System (QAS). Building upon the foundation laid in the 2020-2023 period, the EPP has undertaken a comprehensive review and revision process to strengthen the QAS for the 2021-2024 cycle. To support these efforts, the EPP has developed a robust QAS Handbook that provides clear guidelines and procedures for using the QAS effectively. This handbook is a valuable resource for all stakeholders, ensuring consistency and adherence to best practices.
A key component of the QAS is the annual QAS Summary Report, which is completed at the college level. Each program within the EPP is responsible for contributing to this report and providing updates on its specific QAS activities. By analyzing the data collected through the QAS, the EPP has identified trends, uncovered emerging issues, and implemented targeted improvements.
To further streamline the data collection and analysis process, the EPP has developed a Data Management Schedule. This schedule outlines the data collection, analysis, and reporting timeline, ensuring that the necessary data is gathered and analyzed promptly.
Additionally, the EPP has created a QAS Departmental Checklist to guide programs in conducting a deeper analysis of their program data. This checklist helps programs identify areas for improvement and develop targeted action plans.
The EPP conducts an annual data profile to gain a comprehensive understanding of the college's overall performance. This profile provides an overview of key performance indicators and allows individual programs to report on their specific achievements.
Furthermore, the EPP has established a Continuous Improvement Tracking system to monitor and document revisions and outcomes related to QAS processes and procedures. This system enables the EPP to identify areas for ongoing improvement and measure the impact of these changes.
To address the challenges of declining enrollment, the EPP has developed comprehensive Recruitment and Retention Plans. These plans, informed by QAS data, are designed to attract and retain high-quality students.
The EPP utilizes various tools and strategies to support its QAS initiatives. Microsoft 365 is a central platform for managing calendars, tracking tasks, and facilitating collaboration. Regular college-wide meetings and focus groups allow stakeholders to share insights, discuss challenges, and develop solutions. The EPP Council, composed of key stakeholders from across the college, is critical in guiding and overseeing the QAS process.
Enrollment, Retention, and Graduation Profile [CAEP R3, R4, RA3, RA5]: The purpose is to analyze and synthesize college enrollment data to inform strategies for recruiting and retaining students in its programs. The data profile includes tables and figures illustrating enrollment trends over several years, broken down by major, concentration, and student level. This detailed breakdown allows the College of Education to understand the specific areas where enrollment is strong or weak. Ultimately, the data profile is a tool for the College of Education to make data-driven decisions to improve enrollment and retention outcomes.
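To illustrate how such a breakdown might be produced, the following is a minimal sketch, assuming a hypothetical enrollment extract with columns year, major, concentration, level, and headcount; the file and column names are illustrative, not the College's actual data dictionary:

```python
# Minimal sketch of the enrollment-trend breakdown described above.
# The CSV file and its column names are hypothetical placeholders.
import pandas as pd

enrollment = pd.read_csv("enrollment_extract.csv")  # hypothetical extract

# Headcount by year, major, concentration, and student level
trend = (
    enrollment
    .groupby(["year", "major", "concentration", "level"], as_index=False)["headcount"]
    .sum()
)

# Year-over-year change per major, to flag where enrollment is strong or weak
by_major = trend.groupby(["major", "year"], as_index=False)["headcount"].sum()
by_major["yoy_change"] = by_major.groupby("major")["headcount"].diff()
print(by_major.sort_values(["major", "year"]))
```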
Praxis Analysis by Program [CAEP R1, RA1, R4, RA5]—The Educator Preparation Program (EPP) reviews program data over the past three years. Where data are absent, no assessments were administered or no students were enrolled in the program during that period.
Observation [CAEP R1, R3, R4, R5] - In response to the previous site visit, the EPP implemented systematic observation data collection in this cycle. This data is now integrated into our comprehensive analysis for enhanced program evaluation. Observation data cover initial licensure programs only, and the charts provide the supporting analysis. Our review of teacher candidate performance across different majors reveals some clear patterns. Overall, Biology, English, and Special Education candidates demonstrated strong performance across most teaching competencies. However, Music-Instrumental candidates consistently scored lower, indicating a need for targeted program review and support. Looking at specific skills, candidates generally performed well in areas like 'Standards and Objectives' and 'Teacher Content Knowledge'. However, we identified potential areas for improvement in 'Questioning,' 'Academic Feedback,' and 'Grouping Students,' as these competencies showed more variability and generally lower scores across majors.
TVAAS [CAEP R4, RA4] - While TVAAS data presents limitations due to small numbers, the team utilizes it to the fullest extent possible to inform program performance analysis. The number of teachers at each effectiveness level varied over the three years from 2021 to 2023, as shown below (see also the sketch following the multi-year trends):

Effectiveness Level          2021  2022  2023
Level 5 (Most Effective)        4     4    24
Level 4                         7    11    28
Level 3                        17    41    66
Level 2                        10     7    45
Level 1                        10    10    30

The data show an overall positive trend in the effectiveness of EPP teachers at Tennessee State University from 2021 to 2023. The number of teachers at the higher effectiveness levels (Levels 4 and 5) increased, indicating that more teachers demonstrate significant or moderate evidence of their students exceeding expected growth. However, there are some areas for potential improvement: the number of teachers at the lower effectiveness levels (Levels 1 and 2) also increased in 2023, suggesting that a significant portion of teachers are still not meeting the desired standards for student growth. It is important to note that TVAAS data represents only one aspect of teacher effectiveness; other factors, such as classroom observations, student feedback, and professional development, should also be considered when evaluating the overall performance of EPP teachers at Tennessee State University. Overall, both the number of teachers at the higher effectiveness levels (Levels 4 and 5) and the total number of teachers evaluated increased over the three years.
Predominance of Level 1: Many educators across subjects and grades, including single-year and multi-year composites, are classified as Level 1. This suggests that a notable proportion of educators might not meet the expected growth targets in student achievement.
Limited Representation in Higher Levels: Fewer educators are in Levels 4 and 5, especially in single-year composites. This indicates a smaller pool of consistently high-performing educators who exceed student growth expectations.
Subject-Specific Patterns:
For Grades 3–8, subjects like English Language Arts and Math show many educators with Level 1 effectiveness.
However, Science and Social Studies have slightly more representation in Level 2 and Level 3, hinting at variations in subject-specific teaching effectiveness.
Multi-Year Trends:
Most multi-year composites (up to 3 years, and excluding 2021) for educators remain at Level 1. This reflects consistent challenges in improving growth measures over time.
Even educators with Level 5 single-year scores show variability when analyzed through multi-year trends.
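As a companion to the level counts above, the following is a minimal sketch of the kind of tally that underlies this trend analysis, using the 2021-2023 counts from the table; the Level 3-or-above share is an illustrative summary statistic, not the team's actual tooling:

```python
# TVAAS effectiveness-level counts (2021-2023) from the table above,
# with the share of teachers at Level 3 or above computed per year.
counts = {
    2021: {5: 4, 4: 7, 3: 17, 2: 10, 1: 10},
    2022: {5: 4, 4: 11, 3: 41, 2: 7, 1: 10},
    2023: {5: 24, 4: 28, 3: 66, 2: 45, 1: 30},
}

for year, levels in counts.items():
    total = sum(levels.values())
    at_or_above_3 = sum(n for level, n in levels.items() if level >= 3)
    print(f"{year}: {total} teachers evaluated, "
          f"{at_or_above_3 / total:.0%} at Level 3 or above")
```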
Active Educators [CAEP R4, RA4.1]: The Educator Preparation Program (EPP) initiated a data tracking point in 2022 to analyze educator retention, specifically to determine the proportion of program completers who remain active in the field. This data serves as evidence for CAEP Standard RA4.1, demonstrating the extent to which program completers contribute to P-12 student-learning growth and effectively apply the professional knowledge, skills, and dispositions acquired during their preparation. While this data does not directly measure employer satisfaction, it can be used to infer it; higher retention rates suggest employers are satisfied with the performance of program completers.
School Counselor: Analysis of TNCompass data indicates no change (0.00%) in the percentage of active school counselors (Authorization Type Codes 487 and 086) between the 2020-2023 and 2021-2024 data cycles. While limitations exist, such as potential small sample sizes or missing educator renewal information, 67% of educators with School Counselor PreK-12 (487) or School Counselor K-8 (086) authorizations are active in education. This demonstrates a stable program.
Reading Specialist: Data analysis reveals a 33.33% decrease in the percentage of active Reading Specialists (Authorization Type Code 486) between 2020-2023 and 2021-2024. This program has historically had low enrollment and is set for closure.
Speech Pathology and Special Education: TNCompass data shows a 33% decrease in the percentage of active Speech Language Teachers (Authorization Type Code 458) between 2020-2023 and 2021-2024. For Special Education, data indicates that 74% of 257 completers who completed the program in its entirety during the Grow Your Own (GYO) initiative remain active in the system.
Instructional Leaders: Analysis suggests a 33% decrease in the percentage of active Administrators (Authorization Type Codes 441, 442, and 443) between 2020-2023 and 2021-2024. This decrease warrants further investigation, and feedback will be gathered from program completers to understand the factors contributing to this trend.
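The cycle-over-cycle changes reported above reduce to a percent-change computation on the share of completers who remain active. Below is a minimal sketch; the counts are hypothetical placeholders, not actual TNCompass figures:

```python
# Percent change in the active-educator share between two data cycles.
# The example counts are hypothetical, chosen only to reproduce a
# 33.33% decrease like the ones reported above.
def active_share(active: int, completers: int) -> float:
    return active / completers

def percent_change(old: float, new: float) -> float:
    return (new - old) / old * 100

share_2020_2023 = active_share(active=9, completers=12)  # hypothetical
share_2021_2024 = active_share(active=6, completers=12)  # hypothetical
print(f"{percent_change(share_2020_2023, share_2021_2024):+.2f}%")  # -33.33%
```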
Cycle Comparisons [CAEP R1, RA1, R5, RA5] - This analysis compares the 2020-2023 and 2021-2024 data cycles. By examining key performance indicators and trends, we aim to identify areas of strength, weakness, and opportunities for improvement. This analysis will inform strategic decision-making and guide future initiatives to enhance program effectiveness. Through this comparative analysis, we will explore how the program has evolved over the past few years. We will delve into specific metrics, such as student enrollment, retention, graduation rates, and program outcomes, to assess the impact of various interventions and strategies.
Overall QAS Cycle Comparison [CAEP R5, RA5]- To evaluate the effectiveness of the quality assurance system, we conduct a comparative analysis of data from two distinct cycles. This analysis will identify changes in key metrics and outcomes, allowing us to determine if implemented improvements have yielded desired results and ensure continuous program enhancement in alignment with CAEP standards.
Partnership [CAEP R3, R4, RA2, RA3, RA4.1, R5, RA5]—The EPP Council, a dynamic collaboration of in-college and out-of-college faculty, community partners, and key stakeholders, gathered for its regular meeting in a focused, collaborative atmosphere that reflected a shared commitment to program excellence. The recent CAEP site visit highlighted areas needing attention, specifically standards RA4.1, R5, and RA5, making this meeting particularly crucial. The meeting examined candidate assessment data, a core component of standard R5. Partners, bringing valuable real-world perspectives, joined the faculty in analyzing the data. They were not passive observers; their insights into candidate performance in field placements and subsequent employment were invaluable.
Mentor Teacher Institute [MNPS]
EmpowerEQ - QAS Audit
Adequacy Council [formerly "Equity"] - The Adequacy Council is a newly implemented body within the EPP that addresses fairness issues and concerns. It consists of representatives from across the College, excluding leadership positions. The Council's primary function is to review and resolve equity-related matters, ensuring fair treatment and opportunities for all students.
One of the Council's significant impacts has been addressing student complaints regarding certain practices not formally established as policies. These practices caused anxiety and stress among students, and through the Council, students could challenge them. In one instance, a student successfully challenged a practice, leading the Council to vote in favor of the student and revise the policy.
[2024] The Council has successfully resolved two submissions, demonstrating its effectiveness in ensuring fair treatment and addressing equity concerns within the EPP. The Council's role is crucial in maintaining a supportive and inclusive learning environment for all students.
Candidate Demographics [CAEP R1, R2, RA3, RA4, R5, RA5]- A critical data analysis component involves examining our candidates' demographics and our program's impact on diverse P-12 learners. This focus aligns with the CAEP standards, emphasizing the importance of preparing educators to serve all students effectively.
The Spring 2023 semester saw 123 future educators from Tennessee State University's EPP contribute to 57 school districts across Tennessee, concentrated in Metro Nashville and Shelby County. Many graduates earned the "Beginning Administrator PreK-12" endorsement, demonstrating the program's focus on leadership preparation. The program is noted to impact Tennessee's education system positively. More granular data on the racial demographics of the 123 candidates and the number of each endorsement earned would strengthen this analysis. Strong points include: 123 future educators participated; 57 school districts across Tennessee were served, primarily Metro Nashville and Shelby County; a range of endorsements was earned, including Beginning Administrator PreK-12; graduates are contributing to Tennessee education, with reported positive student impact; and a strong partnership with Metro Nashville Public Schools.
Tennessee State Board Report Card [CAEP R4, RA4, R5, RA5]—The EPP utilizes the annual report card for its initial and advanced (instructional leadership) programs as a critical tool for quality assurance and continuous improvement. The report card offers a comprehensive overview of the EPP's performance across several key domains, including Candidate Profile, Employment, Provider Impact, Candidate Assessment, and Satisfaction.
For example, the 2023 report card shows that the initial EPP exceeded expectations in the Employment domain. A high percentage of graduates found employment in Tennessee public schools within one year (86.4%, compared to the state average of 80.3%). Furthermore, the retention rates for these educators in their second and third years of teaching were also strong, with a 100% retention rate for the second year (state average: 93.6%) and 80.6% for the third year (state average: 78.8%).
The Candidate Profile domain was rated as "Meets Expectations." The report card provides data on the number of cohort members over three years (2020-2022) and the cohort's racial diversity (43.2%). The percentage of high-demand endorsements was 11%, below the state average of 16.2%.
In Provider Impact, the EPP exceeded expectations. A very high percentage of cohort members received classroom observation scores of Level 3 or above (96.3%, slightly above the state average of 96%) and Level 4 or above (71%, exceeding the state average of 65.3%). Similarly, the percentage of cohort members with Student Growth (TVAAS) scores of Level 3 or above was 76.5% (significantly higher than the state average of 60.3%). However, the percentage with Level 4 or above was 17.6% (below the state average of 25.1%). The EPP also demonstrated strong results in LOE scores, with 94.3% of cohort members scoring Level 3 or above (state average: 89.5%) and 66% scoring Level 4 or above (state average: 61.1%).
The Candidate Assessment domain was rated as "Meets Expectations." The pass rate for the pedagogical assessment was 96.7% (slightly below the state average of 97.2%). The content assessment pass rate was 84.3% (below the state average of 88.6%). The first-time pass rate for the literacy assessment was 77.8%, also below the state average of 82.3%.
Finally, the Satisfaction domain revealed that while a majority of respondents agreed or strongly agreed that their clinical experience prepared them for teaching (75%) and would recommend the program (41.7%), there were fewer who agreed or strongly agreed that their coursework prepared them (16.7%). The survey response rate for Tennessee State University was 18.5%, lower than the state average of 35.5%.
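To make the domain comparisons above easier to scan, the following is a minimal sketch that computes each metric's gap to the state average using the 2023 figures quoted in this section; the metric labels are shorthand, not the report card's official field names:

```python
# Selected 2023 report card metrics quoted above: (EPP value, state average).
# Positive gaps favor the EPP; labels are shorthand.
metrics = {
    "Employment within 1 year": (86.4, 80.3),
    "Year-2 retention": (100.0, 93.6),
    "Year-3 retention": (80.6, 78.8),
    "High-demand endorsements": (11.0, 16.2),
    "Observation Level 4+": (71.0, 65.3),
    "TVAAS Level 3+": (76.5, 60.3),
    "Pedagogical assessment pass rate": (96.7, 97.2),
    "Content assessment pass rate": (84.3, 88.6),
    "Literacy first-time pass rate": (77.8, 82.3),
}

for name, (epp, state) in metrics.items():
    print(f"{name}: EPP {epp:.1f}% vs. state {state:.1f}% (gap {epp - state:+.1f})")
```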
Employer Survey [CAEP R4, RA4]—Provides data that can be used for program review, continuous improvement, and potentially accreditation requirements. The survey successfully identified potential areas for program improvement; however, its primary weakness as a data source is the low response rate, which affects the reliability and generalizability of the findings.
Direct measures of employer satisfaction are currently limited due to a low survey response rate (n=5). However, the qualitative data from these responses is positive, indicating satisfaction with completer collaboration, professionalism, and overall strength, supported by anecdotal hiring evidence. These positive, albeit limited, direct findings are triangulated with stronger indirect evidence from high initial employment and retention rates and documented, active employer involvement in EPP governance and feedback processes. The EPP recognizes the need to implement strategies to increase employer survey participation for future cycles to obtain more robust, generalizable, direct evidence of satisfaction.
EPP Candidate Survey [CAEP R5, RA5] - Initial surveys, comprising 79 questions, revealed a predominantly White female respondent base. However, subsequent surveys, notably those from 2022 to 2024, indicated a significant demographic shift, marked by a substantial increase in Black female respondents, reflecting the college's evolving student population. Recognizing the limitations of the lengthy initial surveys and the resultant reduced response rates, the college implemented a revised 26-question survey with skip logic, enhancing response efficiency.
Further analysis of neutral responses prompted consideration of removing them to improve data granularity. Graduation rates demonstrated a modest positive trend, rising from 28% in the first three-year cycle to 34% in the subsequent cycle. Notably, fall-to-fall retention rates significantly improved, escalating from 0% in Fall 20-21 to 38% in Fall 21-22, and reaching 64% by Fall 22-23, indicating enhanced student support and program efficacy.
Program-specific performance variations were observed, with teaching licensure students exhibiting lower average performance scores in 2023-2024, while instructional leadership and clinical roles demonstrated consistently higher scores. Students in the "Other" program category maintained a consistent moderate score.
The transition from primarily in-seat and online delivery to a hybrid model correlated with increased student satisfaction. Improving survey response rates by 15% was identified as essential for ensuring data reliability and generalizability.
Data Profile—The profile provides a detailed quantitative look at student enrollment, retention, graduation, and Praxis trends within the College of Education's programs over three years, alongside the college and program-level plans designed to address these areas.
Removal of Focus Group - Despite the initial promise of the 2023 Focus Group, designed to enhance our clinical experience programs, engagement metrics indicate a lack of sustained participation and effectiveness. While the two meetings yielded valuable feedback, resulting in tangible program changes related to curriculum, information access, and recruitment, the format has not proven conducive to ongoing, robust dialogue. Therefore, we will discontinue the Focus Group in its current form. We remain committed to maintaining collaborative feedback practices, including internal and external stakeholder input, and will explore alternative engagement strategies to ensure more effective and impactful program development.
Pathway Closures - The EPP uses a comprehensive set of metrics to determine program viability, including enrollment, retention, graduation, and faculty engagement. After carefully considering collective program performance over the past six years, during which we observed consistently low enrollment and a lack of viable growth projections, and guided by a commitment to responsible resource allocation, the EPP has determined that these programs are no longer sustainable at the undergraduate level. Our primary focus is to ensure the strength and effectiveness of our remaining educator preparation offerings, aligning our resources with programs demonstrating strong enrollment, robust faculty engagement, and clear pathways for future growth. Deans Meeting Notes [using data to make decisions]
As a result of the data (enrollment, retention, graduation, engagement) and feedback from the Tennessee Department of Education, the following programs shall close to redesign:
School Counseling - Effective Immediately
Reading Specialist - Effective Immediately
CTE-A - Effective Immediately
CTE-O - Effective Immediately
All undergraduate secondary programs (except Performing Arts – Music K-12)
Early Childhood (167) - June 2026 - post-bac and job-embedded
Biology, 6-12 - June 2026
Chemistry, 6-12 - Effective Immediately
English, 6-12 - June 2026
Health & Wellness, K-12 - Effective Immediately
History, 6-12 - Effective Immediately
Mathematics, 6-12 - June 2026
Physical Education, K-12 - June 2026
Visual Arts, K-12 - Effective Immediately
Recruitment and Retention - A comprehensive analysis revealed key factors contributing to the enrollment and retention decline, including evolving student demographics, shifting career interests, and increased competition from alternative pathways. Armed with this understanding, the EPP developed targeted recruitment plans designed to address these challenges head-on. These plans encompassed a multi-faceted approach, including enhanced outreach to prospective students through digital platforms, strategic partnerships with feeder institutions, and the development of compelling program narratives that highlighted the impact and value of a career in education.
Overall Assessment: The College of Education has implemented a robust Quality Assurance System (QAS) that aligns with CAEP Standard R5.1/RA5.4. The system demonstrates a commitment to continuous improvement and data-driven decision-making. An external review validates the effectiveness of the QAS and provides insights for future enhancements.
Improvements Made: The college has embraced a continuous improvement approach, leading to significant enhancements to the QAS, including:
Improved alignment with the college's mission, vision, and goals.
Enhanced data collection and analysis processes.
Strengthened accountability measures.
Increased collaboration among faculty, staff, and students.
Challenges Addressed: The college has addressed difficulties related to personnel changes, university changes, and the COVID-19 pandemic, ensuring the QAS's ongoing success.
System Effectiveness: While the review team noted issues similar to those from a previous audit, the QAS has now been established to combat them. However, the system's optimal performance depends on all leadership embracing and utilizing it as designed. While the overall EPP has improved, pockets of programs are still not engaged in the process, a trend that appears to correlate with program enrollment.
Performance Assessments - Through the EPP's established Quality Assurance System (QAS) review processes, an area identified as needing improvement is the consistent collection and submission of key candidate assessment data across all programs. While standardized data such as licensure examination scores (Praxis) and state-provided data (TVAAS, State Report Card) are centrally collected, the submission of program-specific key assessments measuring candidate progress and competency (e.g., capstone projects, performance assessments, comprehensive examinations) has been inconsistent.
While some programs reliably submit this data for analysis, it is not a universal practice across the EPP. This lack of holistic data submission presents a challenge for comprehensive, EPP-wide analysis of candidate performance on these critical internal measures. It limits the EPP's ability to systematically identify trends, strengths, or areas of concern demonstrated by candidates across different programs concerning these specific assessments.
Therefore, establishing and implementing a standardized, EPP-wide protocol for collecting, submitting, and analyzing designated key assessment data from all programs has been identified as a critical step for continuous improvement. This will strengthen the QAS by enabling more robust cross-program evaluation of candidate progress and competency, further ensuring program effectiveness and alignment with expected outcomes.