Narrative

Feedback

From Students

While teaching, I often feel that the material is very basic, that I am an excellent communicator, and that every student is happily learning as much as possible at all times. This is often not the case.

I had this realization after grading the first exam in a data structures course I taught in the summer of 2017 to roughly sixty sophomores and juniors. I was already soliciting verbal feedback through my beginning-of-class review questions and end-of-class summary questions. I mistakenly believed that a quick answer, often from one of the more advanced students, meant full class understanding, and that silence meant no understanding (of course, silence itself can mean a variety of things, including, “We need more time to process what you’ve just said”). It is amazing the obvious things you forget when teaching for the first time – things I knew as a student, but forgot in my bustle of preparation and presentation.

Thus, I made the simple decision to anonymously poll my students every week to determine their level of understanding. This made an enormous difference in my classroom: I not only had a more accurate picture of their learning on specific topics, but I could also gather feedback that stimulated discussion (e.g., “Class time is”: fun/useful/a waste of time, or “Exams are”: too hard/too easy/not correlated to course content) and occasionally lightened the mood (e.g., “Are you looking forward to the final?”: yes/no/I quit). These questions fostered conversation that eventually led me to change pace, offer in-class worksheets (‘Arrays Practice’ and ‘Sorting Practice’ below), allow short group discussion and question-formulation time, and give extra credit for the best student “cheat sheet” of key words from one chapter (‘Graph Speak Student Summary Work’ below). I am thankful I gathered this feedback during the semester, while there was still time to act on it in a way that would benefit my students. I became a better instructor, and they became a better class.

To Students

Educational preferences have shifted significantly with Gen Z learners (individuals born between the mid-1990s and mid-2000s), who prefer instant feedback, are increasingly collaborative, and are active learners who favor project-based coursework [1]. Meanwhile, with increasing course sizes, it is difficult to provide students the timely, subjective feedback they need to improve the quality of their work.

Peer review is a highly engaging feedback mechanism often used to address these challenges. Instead of relying solely on a professor or TAs for feedback, peers collaborate to provide diverse, multi-sourced feedback to one another with a relatively quick turnaround. In addition, the evaluation process itself gives students an opportunity to reinforce recently learned course concepts by critically evaluating others’ work [2]. Finally, for professors, it is a scalable approach [3]: since students provide feedback to one another, adding students to a course does not reduce the speed or quality of feedback. Given this host of benefits, I implemented a semi-automated peer assessment process, based on actual student work, that any professor can employ to increase the diversity and timeliness of the feedback a student receives (see details on the 'Research' page).

Communication and Teamwork

In almost any industry post-graduation (computer science or otherwise), my students will work with others. Thus, I made the projects team-based, with 2-3 students per group. I instituted a scoring rubric for team participation (‘Scoring Rubric for Team Participation’ below) so that each student would know they were being held accountable by their teammate(s). In addition, if a group was randomly selected by the TA, every member would present in a code walkthrough. With these two checks in place, I tried to disincentivize students from letting a teammate do all the work or leaving a partner behind.

I also always enable the Piazza plugin on Canvas to let students ask questions (with an anonymous option) and help one another. Crowdsourcing the instruction in this way allows more immediate assistance and gives students the opportunity to learn by teaching. I found that many students are more willing to ask online than in class, and that peers often provide a concise and coherent answer. Students can echo questions, which can clue me in to something I have not covered well. Although it requires more work to moderate, Piazza gives students a way to communicate with and help one another in a timely manner.

Creativity and Crowd Sourcing

In my first semester of teaching, I decided to gather and modify material from four sources: the textbook, two previous instructors of the course at USF, and various small lessons, worksheets, and projects shared on the web by other universities. Although it took more time to review, format, fill in gaps, and ensure proper content coverage, I discovered that building a course in this manner resulted in a more open, creative atmosphere. By including content that was recent and pertinent, I modeled for my students the way they should pursue their education: look for outside resources to bring to the classroom. I even allowed students to contribute material to promote a diversity of perspectives; something I find unhelpful might be just what a specific student needs. This increased student creativity and engagement and created a flexible environment.

Throughout the semester, as I continued to compile and grow my collection of varied teaching resources, I resolved never to take a resource as is, but to improve upon and streamline everything before delivering it to the students. I believe one should always refine, and I wanted the course to have my own “flavor”. Ultimately, I want to be a valuable resource to the next instructor who teaches my course, so that future students benefit from continual course improvement.

Example Resources

Below is a selection of sample resources from my Data Structures course.

References

[1] Renfro, A. (2012). Meet Generation Z.

[2] Beasley, Z., Friedman, A., Piegl, L., & Rosen, P. (2020). Leveraging Peer Feedback to Improve Visualization Education. arXiv preprint arXiv:2001.07549.

[3] Beasley, Z. J., Piegl, L. A., & Rosen, P. (2019, June). Board 39: Designing Intelligent Review Forms for Peer Assessment: A Data-driven Approach. In 2019 ASEE Annual Conference & Exposition.

Syllabus

DataStructures_Syllabus.docx

Scoring Rubric for Team Participation

Scoring rubric for team participation.docx

Project 1 - Linked Lists

project1.pdf

Project 4 - Graphs

project4.pdf

Sorting Methods Practice

Sorting practice.pdf

Dijkstra's Algorithm Practice

Dijkstras practice.pdf

Infix Tree Example

Infix example.pdf

Arrays Practice

Big Oh, Arrays practice.docx

"Graph Speak" Student Summary Work

Graph Speak Student Work.pdf

Science of Learning Tips for Teaching

Science of Learning Tips for Teaching.docx

Kruskal's Algorithm Example

Kruskal's MST example.xlsx