Common Standard 4

Continuous Improvement 

The education unit develops and implements a comprehensive continuous improvement process at both the unit level and within each of its programs that identifies program and unit effectiveness and makes appropriate modifications based on findings. 


Please view our narrative responses below in black, with hyperlinked evidence in gold. 

Standard 4.1. The education unit and its programs regularly assess their effectiveness in relation to the course of study offered, fieldwork and clinical practice, and support services for candidates.

Standard 4.2. Both the unit and its programs regularly and systematically collect, analyze, and use candidate and program completer data as well as data reflecting the effectiveness of unit operations to improve programs and their services.

Standard 4.3. The continuous improvement process includes multiple sources of data including

1) the extent to which candidates are prepared to enter professional practice; and  

2) feedback from key constituencies such as employers and community partners about the quality of the preparation. 

Mills College at Northeastern University School of Education faculty and staff systematically gather information about candidates’ development in relation to coursework, fieldwork, and support services. Support services include College-wide resources as well as programmatic support structures, such as academic, credentialing, and fieldwork advising for each candidate. Graduating candidates complete exit surveys, which, combined with Northeastern University’s course, program, and institutional assessments, provide information for yearly program improvement efforts. Finally, surveys designed to gather input from alumni, current candidates, and cooperating teachers provide periodic assessment data. 

 

The data collected from these assessments will allow us to understand which aspects of the program candidates value most, as well as areas needing improvement. We will use this information to build on the program’s strengths, address identified weaknesses, and balance programmatic needs against institutional resources and demands. Through regular cycles of planning and evaluating evidence, we will ensure that our graduate students are well prepared to enter professional practice.  

 

To further improve the quality of our programs, we will actively seek out and collaborate with key stakeholders, including employers, community partners, and our advisory committee. We will gather their feedback on the preparation demonstrated by program completers in order to understand the real-world impact of our programs and identify areas for improvement. By collaborating actively with these constituencies, we will demonstrate our commitment to producing well-prepared, highly qualified professionals who can succeed in their careers and make a positive impact in their communities. 

 

Improvement cycle: Until the merger with Northeastern University, the Mills School of Education followed a three-year cycle of assessment, planning, implementation, and outcomes. We now need an improvement cycle that is more responsive to anticipated enrollment growth and possible distance learning options. The improvement cycle for all credential programs will have three phases: 


The sequence of these phases, as well as their inputs, processes, and measurable products, is illustrated and detailed in the Continuous Improvement Cycle graphic below:

Continuous Improvement Cycle-final.pdf

Data collected: The Mills College at Northeastern University School of Education will regularly collect data in order to evaluate the effectiveness of our teaching credential programs and inform ongoing improvement efforts. The process will include data related to student, staff, and faculty performance, program outcomes, and candidate satisfaction. Specific information will include enrollment totals, course completion rates, course evaluations, exit surveys, and percentages of employed graduates (see List of Data Sources below).  

 

List of Data Sources: 

 

Admission, Enrollment, and Completion: Total applicants and percentages of admitted, enrolled, and completing candidates, with corresponding demographic information. The Mills College at Northeastern University School of Education faculty and staff will use student demographic data for two purposes. First, program faculty and staff will track and update candidate information for the annual CTC Accreditation Data System (ADS) and Title II reports. Second, program faculty and staff will rely on admission, enrollment, and completion data to evaluate the programs’ goal of recruiting and preparing diverse educators matching California public school students’ demographic composition.  

 

Credential Candidates’ Development:  

• TPE #1: Engaging and Supporting All Students in Learning 

• TPE #2: Creating and Maintaining Effective Environments 

• TPE #3: Understanding and Organizing Subject Matter for Student Learning (Content-Specific Pedagogy) 

• TPE #4: Planning Instruction and Designing Learning Experiences 

• TPE #5: Assessing Student Learning 

• TPE #6: Developing as a Professional Educator 

Candidates will receive a total score representing their overall practical teaching abilities and development, which will be used to evaluate their readiness to teach. We will use this score to inform, in part, each candidate’s credential recommendation. We will also use aggregated scores for each TPE to inform our evaluation of, and decisions about, the program.  

2. Mean GPA scores by program 

3. Aggregate edTPA and CalTPA scores by assessment task and by program 

4. Percentage of candidates completing each credential program 

5. Percentage of candidates obtaining credential recommendations by program 

6. Alumni and Community Partners Survey. Each year, we will survey alumni as well as school districts and county offices of education to assess graduates’ impact on teaching and learning in schools. The survey will assess graduates’ capacity to:   

• Seek and gain employment in the education system; 

• Exercise leadership functions and apply leadership practices that contribute to the development, improvement, and transformation of their organizations; and 

• Strengthen and enhance links between their respective educational organizations and the program, the School of Education, the College, and Northeastern University.  

 

Ranked tenured/tenure-track faculty: Total ranked faculty, corresponding qualifications and experience, regular professional evaluation summaries (i.e., teaching evaluations, scholarly work, community service), and course and administrative assignments by program. These data will allow us to assess the alignment between faculty expertise and experience and their roles within the Mills College at Northeastern University School of Education. Tenured/tenure-track faculty evaluation data will allow us to determine the capacity of each credential program.  

 

Non-tenure-track and part-time faculty: Overall and by-course totals of non-tenure-track and part-time faculty, corresponding qualifications and experience, demographics, and Service Employees International Union (SEIU) 1021 evaluation results. Data gathered will allow an assessment of the experience, expertise, and functions of non-tenure-track and part-time faculty in relation to the support and development of credential candidates by program. Together with tenured/tenure-track faculty data, non-tenure-track and part-time faculty data will inform capacity assessment and future plans.


Cooperating Teachers/Field Supervisors and field experiences: Number of Cooperating Teachers and Field Supervisors by program, their qualifications and demographics, and the number of students they mentor and guide. Cooperating Teacher/Field Supervisor Evaluation Rubric scores and comments inform the assessment of candidates’ fieldwork experiences. These data will allow assessment of candidates’ individualized support, as well as the extent to which field placements are representative of P-12 public schools in California, particularly regarding students’ ethnicity, race, culture, language background, and overall diversity.  

 

Program Supervisor and Field Coordinator: Number of Program Supervisors, their expertise, experience, and demographics, and evaluations of their work completed by the Field Coordinator and Program Directors. Key components of the data gathered are diversity in types of placement sites and the lived and professional experiences of Program Supervisors in relation to candidates’ learning, growth, and achievements as beginning educators. In addition, Program Supervisor Evaluation Rubric scores and comments will inform the assessment of candidates’ fieldwork experiences. These data will allow assessment of the quality of Program Supervisors’ support, guidance, assessment, and mentoring of credential candidates. 

 

All data points together will provide context for continuous improvement of the credential programs, enabling program faculty and staff to identify the ways in which fieldwork experiences, coursework, student support, and resources are effective, while suggesting areas of possible improvement. 

 

Data collection tools & methods: We will gather program evaluation and planning data through a variety of methods, including online student portals, course evaluations (TRACE), field experience rubrics, edTPA scores, exit surveys, and surveys administered to our school district partners. We will also use data from yearly Master Program Completer Surveys and, likely, from faculty projects with Northeastern University’s Center for Advancing Teaching and Learning through Research (CATLR). The Northeastern University Office of Institutional Assessment and Evaluation provides scaffolding for continuous improvement of programmatic assessment processes.  

 

Data analysts: The faculty and leadership at Mills College at Northeastern University are skilled and qualified researchers with advanced degrees in education, experience in mixed-methods research, and expertise in data science. They have a wealth of experience in analyzing educational data and will collaborate to provide valuable insights and recommendations to the respective stakeholders. In addition, the resources of the Northeastern University Office of Institutional Assessment and Evaluation will be available for consultation, support, and guidance in identifying important patterns in the data. 

 

Data analysis process: Data analyses will be carried out at the end of each term as well as at the end of each academic year. At the end of each term, Program Directors will analyze data from the following sources for their respective credential programs: 

• Field Experience Rubrics 

• Mean GPA scores 

• Summative TRACE scores 

At the end of each academic year, Program Directors, in collaboration with tenured/tenure-track faculty, will analyze data from the following sources:  

• Admission, Enrollment, and Completion 

• Summative credential candidates’ development 

• Tenured/tenure-track, non-tenure-track, and part-time faculty 

• Cooperating Teacher/Field Supervisors and field experiences 

• Program Supervisor and Field Coordinator 

Evaluation and planning data are both qualitative (e.g., course evaluation comments, open-ended survey responses, rubric comments, interviews) and quantitative (e.g., survey responses, assessment scores, GPAs, course evaluation ratings) in nature, and provide the triangulation necessary for reliability. 

 

At the end of each academic year, Program Directors and tenured/tenure-track faculty from each credential program will jointly draft a Preliminary Program Report to be shared with non-tenure-track and part-time faculty, Program Supervisors, and the program’s Advisory Committee. Comments and feedback from the various constituents in response to the report, as well as other related concerns or issues, will be used to revise the report and to set goals for subsequent academic years aimed at improving the development and growth of credential candidates and each credential program. 

An Executive Summary, including conclusions and recommendations for each program, will be presented to the Dean of the School of Education at Mills College at Northeastern University (or successor office). Early in the spring term, each Program Director will meet with the program’s Advisory Committee as well as the Dean (or successor office) to discuss program strengths and weaknesses revealed in the data and to review the program’s plan for implementing relevant changes, including measurable outcomes or products. Program Directors will present their final reports in meetings with the Dean of the School of Education (or successor office). A yearly Program Evaluation and Improvement Report will summarize each credential program’s achievements and strengths, evaluate outcomes from previous improvement plans, identify areas of improvement, define new goals and plans to achieve them, and identify potential cross-programmatic synergies. 

 

Actionable steps from analysis: The insights we gain from our data analysis efforts will inform a range of actions designed to improve the quality and effectiveness of our teaching credential programs. These actions may include revising courses, course materials, and/or assignments; providing additional support to candidates; and adopting new pedagogical, curricular, or assessment approaches or strategies. The School of Education will also use the data to assess the effectiveness of our fieldwork and clinical practice components, as well as support services for candidates. By using multiple sources of data to triangulate and inform our decision-making process, we aim to continuously improve each credential program and Mills College at Northeastern University as an institution, better preparing students for professional success. The Academic Descriptive Summary and Program Innovation are examples of actionable plans from the Mills College School of Education prior to the merger with Northeastern University; they illustrate the kinds of documents likely to emerge from the Continuous Improvement Process we plan. 

Standard 1     Standard 2     Standard 3     Standard 4     Standard 5