9-19-22
Proposals on KSD End-of-Semester Student Forms
The Keck Science Diversity, Equity, and Inclusion (DEI) committee has two proposals for consideration.
We propose replacing the current end-of-semester written student feedback form with a version informed by the work of the 2020 Ad Hoc Course Evals Committee, designed to incorporate equity-minded, evidence-based practices from the student feedback literature: W.M. Keck Science Department Student Feedback Form (uneditable version). The numerical CMC forms would stay the same, and the new forms would supplement the current forms required by the colleges.
We propose transitioning from paper feedback forms to digital feedback forms.
Note: we are proposing changes only to the substance and delivery method of the forms. We are not suggesting any change to how the forms are required or used in RPT procedures.
Motivation and Justification
The Ad Hoc Committee surveyed current practices and scholarship in order to develop the new questions; the papers it drew on can be found in the Keck Science Sakai folder for Keck Science Department Resources under Committee 2020 - Ad Hoc Course Evals. The committee carefully designed the items in the new form based on evidence of reduced bias in student answers. The estimated time to complete the form, including reading the instructions, is 17 minutes, according to the methods here.
Rather than the current term “student evaluations of teaching,” we suggest the term “student feedback forms,” since students aren’t experts in evaluating teaching but are very capable of reflecting on their own experience (1). The form developed by the Ad Hoc Committee is well suited to soliciting such student feedback. The only change we made to the Ad Hoc Committee’s form was to add an “N/A” option to each Likert question.
The proposed process would also allow more instructors to contextualize their student feedback. Under the current model, faculty who teach lab courses have no way of understanding their data relative to distributions from similar student populations (e.g., the same course across years, or other sections in the same semester). For example, lab instructors in Chemistry previously wanted to know whether their quantitative data were outside the norm for their circumstances, but there was no way to find out other than consulting senior faculty who had acquired a feel for it. Mentoring is of course important and irreplaceable, but it can introduce bias and does not substitute for data. Data gathering may also become more important as statistical adjustments become more necessary (3).
Additionally, digital administration of student feedback forms was a pandemic necessity that we propose continuing for many reasons:
Pros for digital forms:
Data analysis: This is perhaps the most important advantage. Digital data is much easier to analyze, combine, compare, and store long term. One could hand-enter all of one’s paper evals into Excel files each semester to track means, medians, and distributions across semesters, but that is tedious; with digital data it is straightforward (a brief illustrative sketch follows this list). Data handling is discussed further under Administration below.
Reaching all students: If a student misses class on the day of evaluation, it’s far easier to get that student a link than to allow for paper evaluation at another time. Confidentiality can also be enhanced if students can fill out their evaluation remotely.
Record keeping: Facilitates record keeping, easing the retention and distribution of evaluations for administrators, faculty evaluators (e.g., RPT committees), and the instructors being assessed.
Universal Design: Students occasionally have accommodations regarding typing over handwriting, larger font, etc. It is much easier to print out an otherwise digital form if a student requires a paper version than it is to digitize an otherwise paper form.
Improves anonymity: Removes potentially identifiable handwriting associated with a scanned PDF of handwritten evaluations.
Comfort: Research has shown that typed responses tend to be lengthier than handwritten ones (2).
Reduced workload: Eliminates the labor of scanning paper evaluations.
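To make the data-analysis advantage concrete, here is a minimal sketch in Python of the kind of summary an individual instructor could run on a digital export. It assumes a hypothetical CSV file (feedback_export.csv) with columns semester, course, question, and rating; the file name and column names are illustrative assumptions, not a description of any existing export format.

    # Minimal sketch: summarizing an instructor's digital feedback export.
    # Assumes a CSV with columns "semester", "course", "question", and
    # "rating" (Likert 1-5, N/A rows already removed). All names here are
    # illustrative assumptions.
    import pandas as pd

    responses = pd.read_csv("feedback_export.csv")

    # Mean, median, and response count per question per semester,
    # to track trends over time.
    summary = (
        responses
        .groupby(["semester", "course", "question"])["rating"]
        .agg(["mean", "median", "count"])
    )
    print(summary)

    # Full rating distribution for a single question across semesters.
    dist = (
        responses[responses["question"] == "Q1"]
        .groupby(["semester", "rating"])
        .size()
        .unstack(fill_value=0)
    )
    print(dist)

The same few lines, pointed at several semesters of accumulated exports, replace what would otherwise be hours of hand entry from paper forms.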
Cons for digital forms and how to address them:
Response rate: Digital forms have had lower response rates than paper forms. With paper forms, students felt socially obligated to participate in what is truly an optional activity: they were physically present in class and needed to stay for other tasks after the survey. That pressure is absent when students complete digital forms at home.
→ To address this concern, we recommend that instructors set aside class time at the beginning of a class period in the second-to-last week of the semester (i.e., before finals but after students have formed a solid opinion of the course), and that faculty leave the room during the evaluations and return afterward. Organic chemistry lab achieved a high response rate (132/151, 87%) using this strategy.
Device diversity: If students are allowed to complete the digital form on any device, the length or register of their responses may vary; for example, students may type fewer words, or write less formally, on a phone than on a laptop.
→ To address this, instructors should recommend that students bring their laptops to complete the form. (Students usually bring laptops without being asked.)
Administrator workload: May increase the workload on administrators; see Administration below.
Administration
Digital data would allow the department to keep track of calculations and distributions by instructor, semester, course, discipline, etc. There are important questions to resolve, such as which calculations should be made (e.g., means in addition to medians and distributions) and who should make, retain, and distribute them each semester. We suggest making this the responsibility of the lab coordinators: an administrator (e.g., Adam, Velda, or UJ) keeps the original digital files, with instructor identification, which can always be returned to if there is a data-quality concern; each coordinator completes calculations and summarizes the data using de-identified information (i.e., no instructor names). The administrator sends each instructor their own data, and the coordinator sends each instructor the contextual data. This additionally allows the coordinator to understand how the course is being experienced by all students taking it, not just their own students. An alternative model would be to assign the calculation and summary task to a single person in the department as part of their service duties. A brief illustrative sketch of the proposed workflow follows.
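As a rough illustration of the split between administrator and coordinator described above, here is a minimal Python sketch. It again assumes hypothetical file and column names (master_identified.csv with columns instructor, course, semester, question, and rating); nothing here reflects an existing system.

    # Minimal sketch of the proposed administrator/coordinator split.
    # File and column names are illustrative assumptions.
    import pandas as pd

    # Administrator's copy retains instructor identification and is kept
    # as the file of record for data-quality questions.
    master = pd.read_csv("master_identified.csv")

    # Administrator strips instructor names before passing data on.
    deidentified = master.drop(columns=["instructor"])
    deidentified.to_csv("deidentified_for_coordinator.csv", index=False)

    # Coordinator builds contextual summaries per course and semester,
    # which any instructor can compare their own data against.
    context = (
        deidentified
        .groupby(["course", "semester", "question"])["rating"]
        .agg(["mean", "median", "count"])
    )
    context.to_csv("contextual_summary.csv")

Keeping the identified master file with the administrator, and only de-identified summaries with the coordinator, is what lets contextual comparisons happen without exposing individual instructors’ names.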
Citations
1. Miller, A.; Clark, M. H.; Donnelly, J.; Hahs-Vaughn, D. The Relationship of Student Evaluations of Teaching to Future Performance in a Chemistry Course Sequence. J. Chem. Educ. 2022, 99 (3), 1336–1346. DOI: 10.1021/acs.jchemed.1c01020.
2. Benton, S.; Cashin, W. Student Ratings of Teaching: A Summary of Research and Literature. IDEA Paper #50.
3. Esarey, J.; Valdes, N. Unbiased, Reliable, and Valid Student Evaluations Can Still Be Unfair. Assessment & Evaluation in Higher Education 2020, 45 (8), 1106–1120. DOI: 10.1080/02602938.2020.1724875.