Cross-Curricular Gems
Carolina Ortega, Elaine Madrigal, Andrea Flores
Research Question:
How do different types of technology-mediated feedback impact student learning?
No matter the educational setting, whether traditional or distance learning, both teachers and students use feedback to change behavior and/or improve performance (Brown & Green, 2020). Teachers use feedback as a way to show students gaps in their knowledge and ability, so that they can improve their academic performance on future assessments. Technological tools, such as screencasting software and classroom response systems, allow teachers and students to provide feedback in a more immediate or personal way (Joseph-Edwards & Edwards, 2022; Wu et al., 2019). Even word processing tools, like Google Docs, provide tools that students and teachers can use to provide collaborative feedback (Brooks, 2022). Highlighting, commenting, and suggesting edits allow users to adapt text feedback to an online environment, with the added benefits of immediacy and remote capabilities.
Instructors make an important instructional choice when they select the methods that will be used to provide feedback; this choice should be made while taking into account the product being assessed, as well as the methods’ effectiveness and efficiency (Brown & Green, 2020). Students benefit more from feedback that is delivered in a timely manner, and research also shows that students will prefer timeliness over the specific method/format of the feedback (Burnham et al., 2017; Joseph-Edwards & Edwards, 2022; Parkes & Fletcher, 2017; Şahin, 2019; Sánchez-Mora et al., 2020).
This report seeks to answer the question: How do different types of technology-mediated feedback impact student learning?
CSU Fullerton’s Pollak Library online databases were used to find relevant literature. Databases used include ERIC, Educational Full Text (H.W. Wilson), APA PsycInfo, and Teacher Reference Center. When searching databases, the following keywords were used: feedback, education, technology, instant feedback, peer feedback, audio feedback, and video feedback. When a relevant text was mentioned within another article, Google’s search engine and the databases above were used to help locate the text as indicated in the reference lists. The themes that were identified in the literature review process are (a) peer feedback, (b) instant feedback, and (c) media-based feedback.
Although feedback has long been viewed as a teacher-centered activity, recent research indicates a general trend towards more student-focused activities, like peer review (Daweli, 2018). Peer feedback, also known as peer review, involves students in the process of evaluating and improving the work of their peers. While student involvement is key to the peer review process, their responsibilities can vary based on the students' setting. Teachers can use technology to create peer feedback activities most suited to their students' educational needs.
Peer feedback is not exclusive to synchronous learning environments and can occur during after-school hours. Research indicates that technology-mediated peer review activities have been used successfully in traditional, hybrid, and online learning environments (Ahmed & Al-Kadi, 2021; Daweli, 2018; Mortaji, 2022). The level of interaction between peers can range from collaborative review groups to asynchronous feedback comments delivered across different time zones. However, certain environments or modalities can be more productive for specific groups of students. Native speakers of English preferred to review their peers during class time, a setting that requires students to verbally communicate with their reviewers in real-time (Ahmed & Al-Kadi, 2021). In contrast, non-native speakers preferred text-based online peer review assignments done outside of class (Ahmed & Al-Kadi, 2021). This difference in preferences suggests that technology can be used to support student learning in asynchronous settings for non-native speakers. When non-native speakers of English review their peers after class, they have more time to process the writing and to use online translation tools.
Similarly, asynchronous peer review activities can also help students with inconsistent class attendance be part of the learning process. Tools like Google Docs can enable students to give and receive high-quality feedback from their peers in both traditional and fully virtual settings (Daweli, 2018; Ebadi & Alizadeh, 2021). The commenting and suggesting features in Google Docs allow students to collaborate and respond to feedback at any time of the day. Other tools, like blogging websites, can also help students connect with their peers, but have had different effects on student learning. A recent study (Mortaji, 2022) found that posting students' work on publicly available websites can increase access and participation among most students while simultaneously discouraging others. The public setting can make some students feel uncomfortable, insecure, and fearful of anonymous comments during the peer review process (Mortaji, 2022). Using a password-protected website can help address these concerns and create a safer, more inviting learning environment for students. Educators can also address students' learning needs by altering the design of peer review activities.
The peer review process can be designed to help students feel safer and more willing to take educational risks that boost learning. Although fully anonymous comments can make students uncomfortable, introducing some anonymity in the peer feedback process can help boost participation (Ching & Hsu, 2016; Mortaji, 2022). Students who review their peers in online environments feel more comfortable leaving honest feedback due to the lack of face-to-face interaction (Ching & Hsu, 2016; Ebadi & Alizadeh, 2021). Online role-playing exercises can also help encourage students to be truthful when reviewing their peers, especially those they do not know well. A recent study (Ching & Hsu, 2016) found that students felt more comfortable reviewing their peers when they wrote their comments from the perspective of an outside stakeholder. Online environments also provide students with greater privacy when reviewing their own feedback, helping them feel safer (Ebadi & Alizadeh, 2021). Learner-centered peer review can further empower students by giving them greater control over what parts of their work are open to feedback (Ebadi & Alizadeh, 2021). Google Docs can support this process by making it easy for students to highlight specific parts of their work they want to be reviewed. Educators can also provide learners with greater control over their learning through the use of classroom response systems (CRS) that provide students with immediate feedback on their progress.
What role might your classroom community play in implementing Peer Feedback processes?
(Spencer, 2019)
Using mentimeter.com can help you build student-focused lessons that allow students to see their peers' responses. Through this interactive polling system, users remain anonymous, which can lead to increased student participation. Ching and Hsu (2016) found that anonymous feedback facilitated by peer-to-peer discourse can also enhance student agency and a collaborative learning environment. Teachers should consider this tool to promote a collaborative, stress-free learning environment that enhances student voice in the classroom.
Formative feedback is an important part of the learning process and can provide key information for both students and educators (Elmahdi et al., 2018; Şahin, 2019). Students can especially benefit from feedback delivered instantly rather than days or weeks after an assignment is submitted (Burnham et al., 2017; Şahin, 2019; Sánchez-Mora et al., 2020). Teachers can provide students with timely feedback through classroom response systems (CRS) that test students' knowledge in real-time. Programs such as Desmos allow teachers to see student work in real-time and provide instant feedback as students problem-solve and work through concepts. Desmos provides an opportunity to use written feedback in real-time to support student learning. Similar to Desmos, which is predominantly used for mathematics, Peardeck gives teachers an option to leave students feedback in real-time. Peardeck is supported through Google Slides and allows teachers to use slides they have already created. As with Desmos, teachers can see student work live and provide instant feedback based on that work, all virtually.
Unlike traditional lectures, those that use instant feedback questions rely on students' constant attention and responses throughout the entire class period. As a result of the frequent questioning, students in these environments experience greater and more consistent attention rates throughout lectures (Asiksoy & Sorakin, 2018; Sun et al., 2018). The increased attention rates last beyond the period of questioning and go on for the remainder of the class, suggesting that the feedback can motivate students to continue participating (Sun et al., 2018). Using these questions has also led to increased attendance rates for the course duration, which researchers attribute to the perceived value lectures bring to students (Burnham et al., 2017; Fotaris et al., 2016). However, other research indicates that these benefits do not extend to all students. Wu et al. (2019) found that using a CRS helped boost class participation but did not improve the attendance rates of all students. These findings suggest that other interventions may be needed to encourage all student populations to attend.
Although instant feedback questions cannot motivate all student groups to attend class, they can still be used to create a more equitable learning environment. Instant feedback questions can help engage all learners but are especially beneficial for shy and quiet students who can feel excluded from class activities (Elmahdi et al., 2018; Fotaris et al., 2016). Participants in a recent study shared that Plickers, a formative feedback tool, helped all students participate equally in classroom activities (Elmahdi et al., 2018). Plickers allows all students to submit a response anonymously, which can help alleviate the social anxiety of shy students. Other tools have had similar effects, increasing shy students' confidence and social behaviors (Fotaris et al., 2016). These positive effects have also been observed among other student groups. Ashtari and Taylor (2021) found that "the positive effects of CRS were even more prominent within an underrepresented population within technology" (p. 136). These results suggest that instant feedback questions can be used to increase the engagement of many student populations who are traditionally excluded.
These benefits extend past the end of the lecture and can lead to greater involvement outside of class. Students in these environments typically spend more time interacting with classroom materials before and after lectures (Asiksoy & Sorakin, 2018; Law et al., 2020; Sun et al., 2018). The additional exposure to content in and outside of class has led to significant improvements in exam scores, suggesting that instant feedback can help support academic achievement (Asiksoy & Sorakin, 2018; Burnham et al., 2017; Law et al., 2020; Sun et al., 2018).
Educators can further support students' learning by designing instant feedback questions that promote deeper thinking and discussion. Students reported greater learning outcomes when given time to discuss questions with peers before submitting their answers (Sun et al., 2018). Allowing students to explain their choice to others can help them identify and address gaps in their learning, further improving learning outcomes. However, other studies have indicated that students can still achieve this deeper level of thinking in independent environments (Papadopoulos et al., 2021). Papadopoulos et al. (2021) found that asking students to justify their answers led to increased academic performance, even when they could not review their peers' responses. Despite not experiencing greater academic improvement, students who could see their peers' justifications still described their experiences more positively. Given these findings, educators must consider all aspects of student learning when incorporating instant feedback questions.
Above is an excerpt from a research study conducted by C. Ortega and R. Valdez comparing two types of instant feedback interventions, Mentimeter and Google Forms. Data were collected on students' achievement levels, self-efficacy, and engagement.
Digital Instant Feedback: Case Study
This excerpt is from a research study conducted by Carolina Ortega and Rachel Valdez. The study examines the impact of instant feedback through the online tool Mentimeter. Results indicated that 66.7% of students loved instant feedback, 26.5% were neutral, and only 8.8% did not care to receive instant feedback.
This data supports previous research that found students wanted instant feedback so they could continue to think about the content outside of class and reflect on their own learning (Asiksoy & Sorakin, 2018; Burnham et al., 2017; Law et al., 2020; Papadopoulos et al., 2021; Sun et al., 2018). Five students in the study explicitly stated that instant feedback made them think more about the topic, while the other 29 students found that instant feedback allowed for further reflection and room for improvement. This supports the idea that instant feedback is intended for the student to reflect on their development throughout the learning process (Campen & Molenaar, 2020).
Media-based feedback, which includes media like video or audio, allows users to try a new modality instead of relying on text-based feedback. Because it requires technological knowledge and comfort, it is most often instructors who provide this type of feedback.
Audio feedback has received mixed reactions. In a study by Parkes and Fletcher (2017), an overwhelming majority of students were satisfied with the audio feedback provided, finding that it was clear and easy to follow, identified their strengths and weaknesses, and felt more personal. Most students in Parkes and Fletcher's study expressed that audio feedback offered a higher quantity of feedback as well as better quality. Most students had no preference regarding audio quality or recording method, as long as the turnaround time was prompt. Within the sample pool, 72% of students preferred audio feedback, while 14% preferred written feedback and 14% had no preference. Instructors may assume audio feedback is difficult to provide because recording requires quiet spaces, which limits physical accessibility; however, this does not seem to impact students' attitudes toward audio feedback. These results suggest that as long as the feedback is effective and timely, students do not place high expectations on audio quality or production. Sarcona et al.'s (2020) study produced different findings: its students preferred text feedback over audio feedback, feeling they were better able to visually interpret constructive feedback when distinguishing areas of strength from developing skills. This study also confirmed Parkes and Fletcher's finding that audio feedback created an appreciated, more collaborative rapport, and it indicated that students' preference for audio versus text feedback correlated with their perceived learning style.
Similarly, video feedback has drawn conflicting reactions. In blended learning environments, students and teachers have indicated that they prefer the efficiency and specificity of text feedback over the affective benefits of conversational and supportive communication (Borup et al., 2015). Other researchers have challenged the conclusion that video feedback is not worth the time-consuming effort for teachers and students. Lowenthal (2021) suggests that video feedback can be used more effectively than earlier research indicated; he believes video feedback could have more significant affective benefits in distance learning, in contrast to the blended environment of Borup et al.'s study.
Screencast videos allow the person providing feedback to record their screen, often displaying the student's work, in sync with a simultaneous recording of their audio feedback. While some students and instructors do not perceive a difference in the feedback provided, it can provide affective benefits (Joseph-Edwards & Edwards, 2022). These affective benefits, such as intimacy and closeness, are particularly useful in distance education programs (Joseph-Edwards & Edwards, 2022; Lowenthal, 2021).
Issues of equity and access arise from the technical barriers this type of feedback may present. Media-based feedback requires a good internet connection or increased data usage to view or receive the instructor's feedback (Joseph-Edwards & Edwards, 2022), putting some student groups at a disadvantage (e.g., low-income students or those in locations with limited internet providers).
Overall, students did not prefer media-based feedback if it came at the expense of timeliness or constructive quality (Joseph-Edwards & Edwards, 2022; Parkes & Fletcher, 2017). In one survey, some students felt that instructors' feedback quality might diminish if feedback took longer to provide via audio channels (Joseph-Edwards & Edwards, 2022). Teachers planning to use media-based feedback should be sure students will be able to access the feedback and should implement the media format appropriate to the task. As with other forms of feedback, timeliness and efficacy are key.
General feedback on the assessment of student work can occur in various forms, and some forms can be effectively implemented in a variety of learning environments. Campen and Molenaar's (2020) study determined that the feedback producing the most positive trends in student success is metacognitive and process feedback. An instructor's determination of appropriate feedback measures that foster student growth relies on multiple factors: perceived learning style, access to stable internet, availability of quiet spaces, and time (Parkes & Fletcher, 2017). These factors demand that instructors take a holistic perspective when addressing best practices for providing effective metacognitive and process feedback.
A complementary and consistent factor in ensuring student growth is facilitating student engagement through student-driven practices. Instant feedback is a method of disseminating process feedback, which is intended for the student to reflect on their development throughout a learning process (Campen & Molenaar, 2020). Iraj et al.'s (2021) study illustrates the need for students to be given a sense of agency over their interactions with instructor and/or digital feedback models. In the case of instant feedback, this can be achieved by embedding progress checks within lectures and instructional activities, with students responding to digital prompts via a clicker. The use of clickers for facilitating instantaneous feedback drives student learning by providing opportunities to bridge gaps in learning or address developed misconceptions of the material, because the student has interacted consistently with the content (Burnham et al., 2017). As instructors take Burnham et al.'s (2017) study into consideration when determining feedback practices, it is imperative to note that while synchronous engagement increases with higher attendance rates, this does not guarantee a general increase in student performance (Wu et al., 2019). This suggests that tying the reinforcement of learning objectives to synchronous attendance limits students' mobility and overall access to effective learning practices. The effectiveness of digital progress checks, such as clickers and gamification, is limited by the environments in which they are implemented. In line with the results of the studies evaluated, these types of progress-feedback strategies appear more effective as an instructional tool when implemented in a synchronous or in-person setting.
Instruction that fully maximizes the use of digital feedback strategies should factor in the time students spend accessing, interpreting, and connecting to the given feedback. In combination with instant-feedback tools, digital tools that include self-guided progress give students increased interaction with content outside of a brick-and-mortar learning environment, while simultaneously limiting the emotional stressors that inhibit learning under gamified feedback (Chen et al., 2016). Providing students with opportunities to use platforms like Kahoot in an asynchronous manner also increases student engagement, with the added benefit of increased active learning once competitive stress has been removed from the instructional activity. Furthermore, this incorporation of instant feedback tools keeps practices student-driven, as students determine when, where, and how long they interact with the course content.
Emphasizing student-driven approaches for accessing feedback is in line with increasing student agency and confidence, which in turn supports collaborative learning. For example, anonymous feedback facilitated by peer-to-peer discourse can enhance both student agency and a collaborative learning environment (Ching & Hsu, 2016). Role-playing feedback is an effective practice when establishing a student-driven exercise that requires students to rely on their peers' understanding of student outcome standards. Interestingly, Ching and Hsu's (2016) study acknowledged that peer groups are resistant to providing or accepting metacognitive feedback when class groups perceive a lack of trust in the peer's content credibility; this trend continues when the peer feedback is also anonymous. Peer-to-peer feedback strategies used within a whole-group setting enable students to reach approximately level 3 of Webb's depth of knowledge (DOK). DOK level 3 indicates that students are engaged in strategic thinking practices, which supports students in mastering strategic thinking through explanatory feedback that critiques reasoning, the justification of sequencing, and the complexity of written submissions (Hess, 2006). Anonymity in peer-to-peer feedback may be perceived as a solution to bolster student engagement in feedback practices; it is more suitable when student identities are veiled through role-playing. In support of anonymous peer-to-peer feedback, Barber et al.'s (2015) study noted significantly positive learning outcomes when peer-to-peer feedback was embedded into larger collaborative assignments through creative personas. The creativity and content mastery needed to create objective-related personas encouraged students to become more trusting in their own decision making and in media-related feedback.
In attempting to meet the needs of all students, holistic planning reinforces instructors' ability to develop equitable practices. Taking student personalities and circumstances into consideration gives instructors an enhanced ability to support the learning needs of a variety of students. Based on the literature, best practices that foster equity share general characteristics. Common barriers to equitable learning from feedback include unreliable access to stable internet connections and the need for scaffolding to understand student standards. In Parkes and Fletcher's (2017) study, the provision of video feedback supported general class progress; however, it was not as conducive to increased learning as other means of feedback, such as text or audio. Low-income students face financial barriers that can prevent viewing video feedback asynchronously if their internet access is inconsistent. Students living in areas that lack internet providers, known as "internet deserts," would also face obstacles accessing video feedback outside of the classroom. Students with IEPs may benefit more significantly than low-income students, as an instructor can adapt instruction by setting aside specific times and locations in which feedback can be viewed synchronously. When targeting instruction and feedback to meet the needs of English language learners, these students benefit most distinctly when given optional opportunities to compose textual feedback rather than audio feedback (Lowenthal, 2021). Therefore, peer-to-peer feedback that utilizes a text format is a suitable adaptation for language learners, as they gain more time to critically analyze the standards of the assignment and develop comprehensive text feedback (Ahmed & Al-Kadi, 2021).
However, a different trend emerges for effective practices that target the personalities of Advanced Placement (AP) or GATE students. Previous literature noted that certain forms of feedback led to emotional stressors that limited a student's level of potential development when content engagement was competitive (Chen et al., 2016). López-Jiménez et al.'s (2022) study found similar results but noted that feedback triggered by competitive gamification better supports students previously determined to be high performing. Overall, reviewing the differences among student populations within the classroom is necessary and underscores the usefulness of differentiated instruction and feedback.
Teachers' strategic choices in selecting the appropriate feedback tool for each assessment type are key to student learning (Brown & Green, 2020; Lowenthal, 2021). Some assessments may require students to see their work alongside the feedback, in which case screencast or text feedback would be appropriate; at other times, video or audio feedback is sufficient. Some learning management systems have feedback tools embedded in them that may provide more familiarity and streamline the feedback process in an educational program. When reviewing for an exam, or when immediate feedback is required, automated feedback tools provide convenience for both the instructor and the student.
Upon analysis of the literature, a possible implication of these findings is the need for more quantitative research evaluating whether digital practices that require low student effort to interpret feedback correlate with increased engagement and, ultimately, greater student mastery. Studies have suggested that the time and effort spent accessing video feedback negatively impacted student perceptions of the effectiveness of digital tools (Lowenthal, 2021). Future research that determines whether student engagement with feedback decreases when interpreting that feedback requires more time and resources will support teachers in choosing best practices through data-driven analysis. Conclusions drawn from such studies can encourage the future development of equitable practices that are student-driven and inclusive.
5 Strategies in 3 Minutes
(Watson, 2013)
Ahmed, R., & Al-Kadi, A. (2021). Online and face-to-face peer review in academic writing: Frequency and preferences. Eurasian Journal of Applied Linguistics, 7(1), 169–201.
Aşıksoy, G., & Sorakin, Y. (2018). The effects of clicker-aided flipped classroom model on learning achievement, physics anxiety and students' perceptions. International Online Journal of Education and Teaching, 5(2), 334–346. http://iojet.org/index.php/IOJET/article/view/389/238
Ashtari, S., & Taylor, J. (2021). Winning together: Using game-based response systems to boost perception of learning. International Journal of Education and Development Using Information and Communication Technology, 21(1), 123–141.
Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research and Development, 63(2), 161–184.
Brooks, C. (2022, April 29). Improving feedback and fostering collaboration with technology. Edutopia. Retrieved October 12, 2022, from https://www.edutopia.org/article/improving-feedback-and-fostering-collaboration-technology
Brown, A. H., & Green, T. D. (2020). The essentials of instructional design connecting fundamental principles with process and practice (4th ed.). Routledge, Taylor & Francis Group.
Burnham, N. A., Kadam, S. V., & DeSilva, E. (2017). In-class use of clickers and clicker tests improve learning and enable instant feedback and retests via automated grading. Physics Education, 52(6), 1–7.
Campen, C. K., & Molenaar, I. (2020). How teachers integrate dashboards into their feedback practices. Frontline Learning Research, 8(4), 37–51.
Ching, Y. H., & Hsu, Y. C. (2016). Learners' interpersonal beliefs and generated feedback in an online role-playing peer-feedback activity: An exploratory study. The International Review of Research in Open and Distributed Learning, 17(2), 105–122.
Daweli, T. W. (2018). Engaging Saudi EFL students in online peer review in a Saudi university context. Arab World English Journal, 9(4), 270–280.
Decman, M. (2020). Factors that increase active participation by higher education students, and predict the acceptance and use of classroom response systems. International Journal of Higher Education, 9(4), 84–98.
Ditch That Textbook. (2022, August 29). How do I give feedback to 100+ students? https://ditchthattextbook.com/feedback-100-students/#tve-jump-182ea121507
Ebadi, S., & Alizadeh, A. (2021). The effects of online learner-driven feedback on IELTS writing skills via Google Docs. Teaching English With Technology, 21(3), 42–66.
Elmahdi, I., Al-Hattami, A., & Fawzi, H. (2018). Using technology for formative assessment to improve students’ learning. Turkish Online Journal of Educational Technology - TOJET, 17(2), 182–188.
Fotaris, P., Mastoras, T., Leinfellner, R., & Rosunally, Y. (2016). Climbing up the leaderboard: An empirical study of applying gamification techniques to a computer programming class. Electronic Journal of E-Learning, 14(2), 94–110.
Iraj, H., Fudge, A., Khan, H., Faulkner, M., Pardo, A., & Kovanovic, V. (2021). Narrowing the feedback gap: Examining student engagement with personalized and actionable feedback messages. Journal of Learning Analytics, 8(3), 101-116.
Joseph-Edwards, A., & Edwards, R. (2022). Screencast feedback: Can I use it? International Journal of Education & Development Using Information & Communication Technology, 18(2), 46–67.
Law, Y. K., Tobin, R. W., Wilson, N. R., & Brandon, L. A. (2020). Improving student success by incorporating instant-feedback questions and increased proctoring in online science and mathematics courses. Journal of Teaching and Learning With Technology, 9(1), 64–78.
López-Jiménez, J. J., Fernández-Alemán, J. L., González, L. L., Sequeros, O. G., Valle, B. M., García-Berná, J. A., Idri, A., & Toval, A. (2022). Taking the pulse of a classroom with a gamified audience response system. Computer Methods and Programs in Biomedicine, 213(106459), 1–12.
Lowenthal, P. R. (2021). Video feedback: Is it worth the effort? A response to Borup et al. Educational Technology Research and Development, 69(1), 127–131.
Mortaji, L. E. (2022). Public speaking and online peer feedback in a blended learning EFL course environment: Students’ perceptions. English Language Teaching, 15(2), 31–49.
Papadopoulos, P. M., Obwegeser, N., & Weinberger, A. (2021). Let me explain! The effects of writing and reading short justifications on students’ performance, confidence and opinions in audience response systems. Journal of Computer Assisted Learning, 38(2), 327–337.
Parkes, M., & Fletcher, P. (2017). A longitudinal, quantitative study of student attitudes towards audio feedback for assessment. Assessment & Evaluation in Higher Education, 42(7), 1046–1053.
Şahin, M. (2019). Classroom response systems as a formative assessment tool: Investigation into students’ perceived usefulness and behavioral intention. International Journal of Assessment Tools in Education, 6(4), 693–705.
Sanchez, D. R., Langer, M., & Kaur, R. (2020). Gamification in the classroom: Examining the impact of gamified quizzes on student learning. Computers & Education, 144, 1–16.
Sánchez-Mora, J., Tamayo, R. M., & Corredor-Aristizábal, J. (2020). Affordances of audience response systems: Effects of instant and regular feedback. Technology, Knowledge and Learning, 1–18.
Sarcona, A., Dirhan, D., & Davidson, P. (2020). An overview of audio and written feedback from students’ and instructors’ perspective. Educational Media International, 57(1), 47–60.
Spencer, J. [John Spencer]. (2019). Feedback and Trust Grid [Video]. YouTube. https://youtu.be/DLhNX-ArJT8
Sun, J. C., Chen, A. Y., Yeh, K. P., Cheng, Y. T., & Lin, Y. Y. (2018). Is group polling better? An investigation of the effect of individual and group polling strategies on students’ academic performance, anxiety, and attention. Educational Technology & Society, 21(1), 12–24.
Watson, G. [Gavan Watson]. (2013). Characteristics of Good Student Feedback [Video]. YouTube. https://youtu.be/Huju0xwNFKU
Wu, Y. J., Wu, T., & Li, Y. (2019). Impact of using classroom response systems on students’ entrepreneurship learning experience. Computers in Human Behavior, 92, 634–645.