2020-2021
Established a Workgroup and two Think Tanks that developed the framework for the course evaluations: how they would be used and what they would cover.
Conducted internal and external research to determine how student course evaluations were being used in faculty evaluation.
Surveyed ACC department chairs in Summer 2021 and found significant differences in how each department weighted student course evaluations in the final evaluation rating.
Interviewed seven institutions to determine how student course evaluations were weighted in faculty evaluation (Baltimore College, City Colleges of Chicago, Valencia College, Tarrant Community College, Eastern Shore Community College, Reynolds Community College, and Simon Fraser University in British Columbia).
Hosted a Faculty Forum on Bias in Evaluations with Dr. Philip Stark.
Conducted follow-up research on student course evaluations at institutions Dr. Stark recommended as exemplars of best practices (Simon Fraser University, the University of Oregon, the University of Southern California, and the University of Colorado at Boulder).
Developed two question banks (one before and one after Dr. Stark's presentation).
2022-2023
Surveyed faculty, deans, and department chairs to determine what they would most like to learn from student course evaluations. The 87 responses revealed that faculty, department chairs, and deans wanted more meaningful feedback from students on course design, teaching methods, student enjoyment of the course, student learning, and class climate.
At the recommendation of the Workgroup, a Statement of Work for a Consultant for ACC Course Evaluation Instruments-Psychometric Analysis was written. Workgroup leaders contacted multiple consultants: Lana Newton, Acting Director of Learning Experiences Assessment and Planning, Simon Fraser University; Dr. Bruno Zumbo, Professor and Distinguished University and Canada Research Chair in Psychometrics and Measurement, University of British Columbia; Roland Carrillo, Campus Effectiveness; Anthology Blackboard course evaluation consultants; Dr. Glover; and Dr. Choi, Professor and Quantitative Methods Area Chair, Department of Educational Psychology, Pearson Endowed Professorship in Psychometrics, College of Education, University of Texas.
OIRA agreed to assist with the project. Richard Griffiths, IR Data Scientist/Survey Specialist, and Susan Burkhauser, Director of Institutional Research, were assigned to assist with completing the psychometrics for the faculty-developed questions.
The new AR and GP for Faculty Evaluation established a collegewide weighting for the results of student course evaluations and faculty reflection on that feedback in a faculty member’s overall evaluation rating.
2023-2024
The Student Course Evaluation Workgroup was recreated; there are currently members from all but one AOS.
Fifty adjunct and full-time faculty from a variety of departments were selected to participate in a survey of 65 questions, laying the groundwork for which questions to move forward for student course evaluations of lecture courses, including face-to-face (F2F), synchronous online (DLS), asynchronous online (ONL), hybrid classroom (HYC), hybrid asynchronous (HYD), and Hy-Flex offerings. Questions were reviewed for: appropriateness for our different course delivery methods; which dimension of teaching the question measures; how useful and actionable the question is for improving student learning or faculty teaching; the quality of the question; and the wording of the question.
With assistance from OIRA, the questions have been edited to be more student-friendly and reduced to 41 questions for lecture courses. The questions cover course design and organization, instructional methods, assessment and feedback, class culture, and student learning.
Conducted four student focus groups to determine whether the questions are clear and understandable, mean the same thing to students as they do to us, and capture what students would like to tell faculty about their courses.
Piloted the questions for end-of-semester course evaluations in almost 200 sections, yielding about 1,400 responses.
Surveyed all faculty for comments on the questions.
2024-2025
Surveyed students at HLC for feedback on the length of the course evaluation and their thoughts on course evaluations generally (37 responses).
OIRA analyzed the results of the focus groups, pilots, and faculty comments to further refine the questions and to ensure they measure the intended dimensions. (Information on the methodology OIRA used to analyze the spring pilot can be found here.)
Surveyed departments, the Student Course Evaluation Workgroup, and faculty teaching lab, internship, practicum, and cooperative courses about what they would most like to learn from student course evaluations. Forty-seven faculty responded, and their suggestions have been incorporated into the proposed Student Course Evaluations for Labs, Internships, Practicums, and Cooperatives.
Piloted question sets for lecture (LEC; a revised set from the spring pilot), lecture-lab, lab, internship, practicum, and cooperative courses during the Fall semester. The pilot involved over 400 faculty and about 1,000 sections.
Surveyed faculty who participated in the pilot.
Worked with OIRA to analyze the pilot results for LEC courses and determine a recommendation for a rating scale; analysis of the other surveys for factor analysis is continuing.
FDEC approved the LEC surveys, which make the student course evaluation ratings consistent with the new faculty evaluation ratings, and a recommendation for computing the overall rating.
Presented the LEC surveys, the ratings recommendations, and a recommendation on low response rates to ASAC; approved on March 7, 2025.
Presented the other surveys to FDEC and ASAC; approved on May 2, 2025.
Planned rollout of all surveys in Fall 2025.