Evaluation is a necessary component of system development and instructional design. Below are some of my research projects that highlight evaluation and assessment; my other research projects also involved assessing student learning and evaluating educational programs.
Current projects: Feedback: Literature review and Empirical investigation
Collaborating with Dr. Binbin Zheng on a literature review of feedback concepts and practices
Collaborating with Dr. Carla Allen to investigate learners' experiences while participating in online peer feedback activities
Technology-enhanced assessment activities for deeper learning
Zagaar, M. & Chen, W.* (2022). Assessment for deeper understanding using concept maps: Lessons learned from flipped teaching of Pharmacology. Medical Science Educator, 32, 1289–1297. https://doi.org/10.1007/s40670-022-01653-3 (*corresponding author)
Conference: Zagaar, M. & Chen, W. (2022). Exploring the use of concept mapping as assessment for learning with first-year Physician Assistant students. 2022 International Association of Medical Science Educators (IAMSE) Annual Meeting
Abstract: To foster deeper understanding of pharmacology concepts among physician assistant students, we integrated concept mapping into our flipped teaching to provide assessment for learning. Different mapping-based assessment strategies were adopted based on learner feedback, including in-person collaborative mapping, an individual computerized mapping-based quiz with automatic feedback, and a collaborative computerized mapping-based quiz enhanced by the jigsaw strategy. Each mapping activity also leveraged the strengths of a specific technology platform. Based on findings from comparing learner ratings of these mapping activities and thematic analysis of learner feedback, we engaged in critical reflection and shared the lessons learned.
Concept mapping to provide assessment of, for, and as learning
Journal Article: Chen, W., & Allen, C. (2017). Concept mapping: Providing assessment of, for, and as learning. Medical Science Educator, 27(2), 149–153. http://dx.doi.org/10.1007/s40670-016-0365-1
Abstract: To enhance healthcare students' conceptual understanding and assessment of their learning, and to foster their metacognitive development, we fully integrated concept mapping into a radiographic physics course. Mapping helped instructors administer assessment of students' conceptual knowledge and assessment for the improvement of concept instruction. Furthermore, learners were invited to incorporate mapping-based self-assessment into their learning as a metacognitive strategy. To encourage learners to do so, peer exchange of feedback was also arranged. Based on analyses of learners' maps, self-reflections, and interview data, both evidence of students' learning and lessons learned from this implementation are provided.
Weblog-based authentic assessment: Two case studies
Journal Article: Chen, W. & Bonk, C. (2008). The use of Weblogs in learning and assessment in Chinese higher education: Possibilities and potential problems. International Journal on E-Learning, 7(1), 41-65. https://www.learntechlib.org/primary/p/24198/
Abstract: Weblogs are beginning to be used in educational settings in China, raising hopes for changes in both learning and assessment. This article summarizes two recent Weblog studies involving new ideas of student assessment in China. While these case studies focus on how Weblogs could facilitate innovation in the assessment field, they simultaneously point out certain problems that deserve attention before Weblogs can be widely accepted. To facilitate such use, discussions and suggestions on educational uses of Weblogs are provided. Finally, it is hoped that additional studies will be conducted to facilitate changes in learning and assessment in China.
Technology and Common Core
Conference Presentation: Chen, X. & Chen, W. (2015). Technology and Common Core: Linking practice with standards. Society for Information Technology & Teacher Education 2015 Conference, Las Vegas, Nevada, USA. Mar 2-6, 2015
Abstract: Two recent trends play significant roles in the integration of technology in schools: first, teachers are encouraged to leverage technology tools to engage students in deeper learning; second, the Common Core State Standards (CCSS) have received wide adoption, promoting the use of technology to enhance students' academic learning. We studied whether these two trends support each other by analyzing the technology standards in the CCSS against Bloom's taxonomy. In the ELA CCSS, teachers are expected to use technology to engage students in higher-order thinking, whereas technology standards associated with lower-order thinking appeared most frequently in K-5 teaching. Our study serves as a bridge to help teachers identify best practices for introducing technology into classroom teaching.
I led several studies investigating the impact of collaborative concept mapping activities on students' knowledge construction. To analyze the achievement of students' collaborative learning, we investigated the use of concept maps to assess students' knowledge representations.
1. Evaluation of faculty development programs
Journal article: Chen, W., Berry, A., Drowos, J., Lama, A., & Kleinheksel, A. (2021). Improving the Evaluation of Faculty Development Programs. Academic Medicine. doi:10.1097/ACM.0000000000004151
Abstract: Faculty development (FD) begins with the design of programming that will meet identified needs. Evaluation of FD programs facilitates the measurement of meaningful outcomes, accurate reporting, continuous improvement, and future program development. The paper lists strategies to guide faculty developers in their evaluation efforts.
Journal Article: Chen, W., Kelley, B., & Haggar, F. (2013) Assessing faculty development programs: Outcome-based evaluation. Journal on Centers for Teaching and Learning, 5, 107-119.
Abstract: Centers for Teaching and Learning have increasingly realized the need to effectively measure the impact of their programming on the quality of teaching and student learning. This is especially true in the area of educational technology integration. The authors examine the evolution from using output- to outcome- based evaluation methods at their institution and share the major findings their methods revealed. The challenges that were encountered as the outcome-based measures were implemented are also explored. The article concludes with a discussion of planned future steps regarding the improvement of current evaluation practices and the incorporation of process-based evaluation practices.
Conference presentation of initial outcomes of this study: Chen, W., Kelley, B., & Haggar, F. (2013). Evaluation of faculty development practices. Global Chinese Conference on Computers in Education (GCCCE 2013) (pp. 624–627). Beijing, China.
Conference Presentation: Smith, J. & Chen, W. (2014). Developing and Assessing Faculty Training Programs for Student Veteran Success. Presented at American Educational Research Association Annual Meeting (AERA 2014)
Abstract: Many student veterans experience overwhelming challenges during their transition from military to academic life. It is necessary to provide faculty with the knowledge, skills, and resources needed to support the academic success of student veterans. Sponsored by a large three-year federal grant, our center for teaching and learning has created professional development programs to enable faculty to understand their role in providing an extraordinary learning experience for student veterans. In our presentation, we will describe the development of our faculty training programs, share findings of our program evaluation, discuss lessons learned, and provide future research directions. The presentation will benefit both researchers and practitioners interested in supporting student veteran success.
2. What do Medical Students Learn from Evaluating their Curriculum:
Conference proceeding: Chen, W. & Bradley, E. (2018). What do Medical Students Learn from Evaluating Flipped-Classroom-based Curriculum: A Grounded Theory Study. In Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 455-460). Las Vegas, NV, United States: Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/184992/.
Abstract: This grounded theory study investigates how medical students learn from direct involvement in evaluating a technology-facilitated, learner-centered undergraduate medical education curriculum. Specifically, we focused our investigation on learners’ experience evaluating the pre-clerkship curriculum, which has been constructed based on the flipped classroom model. Initial analysis of 24 individual student interviews revealed themes illustrating the impact evaluation activities had on their educational endeavors. These activities prompted students to: practice reflection; hone skills to communicate differing opinions and provide effective feedback; acquire deeper understanding of the curriculum; and appreciate the significance of evaluation as a process empowering students to participate in their own learning. Whereas traditional evaluation theories emphasize the importance of evaluation as a way to inform program practice and improvement, the current study expands this notion by revealing a direct educational impact achieved through learners’ participation in evaluation activities. Most importantly, learners’ curriculum evaluation experiences were consistent with their other flipped learning experiences, emphasizing a learner-centered approach. Findings from this study could inform other practitioners implementing the flipped classroom model to consider evaluation practices that support their learner-centered vision.
3. The practice of formative evaluation is emphasized in Educational Design Research projects. Additionally, my other research projects included evaluation of the impacts of educational technology programs.
e.g., Assessment of students' learning experiences and achievements in a massively open online course.
Chen, W., & Jia, J. (2016). Comparison of online and onsite students’ learning outcomes and experiences in a Massively Open Online Course in China. The Journal of Educational Technology Development and Exchange, 9(1), 67–88.
Evaluation of the usability of information systems adopted in E-learning, business, health care, and other settings.
1. Usability evaluation in Medical Education
Journal Article: Chen, W., Cheng, H., & Bradley, E. (2017). Improving online teaching in a required Geriatrics clerkship using heuristic evaluation. Medical Science Educator, 27(4), 871–875. https://link.springer.com/article/10.1007/s40670-017-0437-x
Abstract: We demonstrate conducting heuristic evaluation (HE) to improve the usability of online medical education using an instrument specifically designed for e-learning. Specifically, we describe the process and outcomes of implementing heuristic evaluation to enhance a newly developed online geriatric module. The result of this implementation indicated the feasibility and usefulness of incorporating HE, as a relatively low-cost usability evaluation approach, into the refinement of e-learning systems in medical education. Lessons learned from this implementation and future steps are presented at the end.
Conference presentation of initial outcomes of this study: Chen, W., Bradley, E., Bennett, J., & Cheng, H. (2017). Enhancing “learner-friendliness” for online medical education through usability testing. 2017 Southern Group on Educational Affairs (SGEA), Southern Group on Student Affairs (SGSA), and Southern Organization of Student Representatives (SOSR) Regional Meeting, Charlottesville, VA.
2. Usability Evaluation Methodology: Navigating Practical Trade-offs
Journal Article: Chen, W., Paul, A., Kibaru, F., Ma, Y., & Saparova, D. (2015). Two-phase usability evaluation of insurance website prototypes. International Journal of E-Business Research, 11(1), 1–21.
Book Chapter: Chen, W., Paul, A., Kibaru, F., Ma, Y., & Saparova, D. (2016). Navigating practical trade-offs during prototype testing. In I. Lee (Ed.), Encyclopedia of E-Commerce Development, Implementation, and Management (pp. 496–510). IGI Global.
Abstract: Two different usability methods were adopted in an iterative usability evaluation of prototypes of a major regional insurance company website’s home page. The study consisted of two phases. During the first phase, six individual think-aloud interviews were conducted to compare three prototypes. It was observed that participants tended to click on menu items or options that were intuitive or easily visible. In the second phase, an online survey was administered with existing customers to compare three prototypes, designed based on findings from the first phase. Survey responses indicated that visual design, layout, content/theme, and capturing of attention were most influential on users’ preferences of the designs. Both phases of evaluation were able to identify one of the designs as the best choice. This paper elaborates on methodology details and highlights practical issues encountered. Trade-offs involved during design and implementation of the study are discussed. Lessons and tips learned are shared.
3. Training of E-learning usability evaluators
Conference Presentation: Dickson-Deane, C., Moore, J., Chen, W., Vo, N., Galyen, K., & Washburn, M. (2009). Building competency for usability evaluation of E-learning courses. Association for Educational Communications and Technology (AECT) 2009 Convention.
Abstract: The evaluation of E-learning courses is a multi-disciplinary skill-set that includes usability experience, instructional design, learning theory, and a basic understanding of the subject-matter. Although there are several types of instruments to implement e-learning evaluation, evaluators have difficulty using these tools because of ambiguity, specificity for a particular application, or length. We present a case study of the implementation of one instrument and its impact on identifying usability issues.
Conference Presentation: Dickson-Deane, C., Moore, J., Galyen, K., Chen, W., Vo, N., & Washburn, M. (2009). Identifying appropriate E-learning usability evaluators. E-learn 2009 Conference, Trinidad and Tobago.
4. Usability evaluation: The impact of media use and content organization in online boating education courses
Conference paper: Moore, J., Chen, W. & Kulp, G. (2010). Determining effective media use and content organization for online boating education courses. American Educational Research Association 2010 Annual Meeting (AERA 2010). Denver, Colorado, USA. April 30-May 4, 2010
This research focuses on the impact of media use and content organization in online boating education courses. We focused on three NASBLA-approved boating education courses that had different methods for delivering the same content.
5. Usability Evaluation: Assessing user information needs
Conference Poster: Wang, X., Erdelez, S., Al Ghenaimi, S., Centner, S., Chen, W., Wang, J., Ward, D., & Yadamsuren, B. (2010). Understanding users’ needs for utilizing a health literacy website: Using the information horizons approach. Association for Library and Information Science Education (ALISE) 2010 Annual Conference.
Abstract (http://www.alise.org/index.php?option=com_content&view=article&id=212#39): The Missouri Foundation for Health is supporting several partners from around the state of Missouri in developing the Health Literacy Resource Inventory (HLMRI), a centralized web-based interactive digital library. To identify different groups’ needs for health literacy materials, the investigators employed a theoretical framework and methodology called Information Horizons to study three key user groups of the HLMRI: health care providers, health care educators, and patients. The data collected include graphical maps of information horizons and in-depth interview transcripts from 15 patients, physicians, nurses, and health care educators. This poster presents the preliminary findings for user-centered design of the HLMRI website.
6. Usability Evaluation of a nuclear math and theory online learning environment
Conference Proceeding: Galyen, K., Dickson-Deane, C., Moore, J., Chen, W. & Vo, N. (2009). Usability evaluation of a nuclear math and theory online learning environment. In G. Siemens & C. Fulford (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 842-847). Chesapeake, VA: AACE.
Abstract (http://editlib.org/p/31596/): The purpose of this paper is to describe a designer’s exploratory use of an e-learning usability evaluation heuristic in the early stages of designing and developing a nuclear math and theory online learning environment (OLE). A team consisting of a designer-developer and three evaluators provided feedback through an e-learning usability heuristic, expert background and experience, and/or visual design experience. Strengths and weaknesses of the process and the instrument are described, and recommendations for the instrument and future heuristic evaluations are provided.