EPCAL: Studying Synchronous Online Collaboration at Scale

June 17, 2019 @ CSCL 2019 in Lyon, France

Call for Participation

Jiangang Hao, Diego Zapata-Rivera, Jessica Andrews-Todd and Mengxiao Zhu

Educational Testing Service

Note: this is a full-day workshop to be held on June 17, 2019, at the CSCL 2019 conference.

Collaborative problem solving (CPS) is widely accepted as an important 21st century skill (Fiore et al., 2017; Griffin, McGaw, & Care, 2012; OECD, 2013; World Economic Forum, 2015). Computer-supported collaboration has made large-scale collaboration possible in the real world and is playing an increasingly important role in academia and the workplace (Stahl, Koschmann, & Suthers, 2006). However, studying synchronous computer-supported collaboration at scale requires significant technological infrastructure both to enable the collaboration and to record the collaboration process data. To carry out such studies efficiently, one needs far more than what a communication tool such as Skype or an online meeting tool such as Zoom can provide. For example, to manipulate team assembly, one needs a participant and team management system that forms teams under specific rules. To have participants progress at the same pace, one needs a mechanism to synchronize the progress of each participant. To identify evidence of the targeted skills, one needs all response and communication data captured and aggregated for further analysis. To provide appropriate scaffolding, one needs a mechanism to send feedback into an ongoing collaborative activity.
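To illustrate the kind of infrastructure these requirements imply (this is a minimal sketch for illustration, not a description of EPCAL's actual implementation), the short Python snippet below shows hypothetical building blocks for team assembly and for capturing timestamped process events; all names, fields, and rules are assumptions made for this example.

    import json
    import random
    import time
    from dataclasses import dataclass
    from typing import Dict, List

    # Hypothetical participant record; field names are illustrative only.
    @dataclass
    class Participant:
        participant_id: str
        skill_level: str  # e.g., "low", "medium", "high"

    def form_teams(participants: List[Participant], team_size: int = 2) -> List[List[Participant]]:
        """Randomly group participants into dyads or triads. A real team
        management system would also support rule-based assembly, e.g.,
        mixing skill levels across teams."""
        pool = list(participants)
        random.shuffle(pool)
        return [pool[i:i + team_size] for i in range(0, len(pool), team_size)]

    def make_event(team_id: str, participant_id: str, event_type: str, payload: Dict) -> Dict:
        """Create one timestamped process-event record, the kind of data a
        platform would capture for every chat message, item response, or
        feedback message sent into an ongoing collaborative activity."""
        return {
            "timestamp": time.time(),
            "team_id": team_id,
            "participant_id": participant_id,
            "event_type": event_type,  # e.g., "chat", "response", "feedback"
            "payload": payload,
        }

    if __name__ == "__main__":
        roster = [Participant(f"p{i}", random.choice(["low", "medium", "high"])) for i in range(6)]
        teams = form_teams(roster, team_size=3)
        log = [make_event("team_0", teams[0][0].participant_id, "chat", {"text": "Hi team!"})]
        print(json.dumps(log, indent=2))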

Educational Testing Service (ETS) has made a major investment in developing a web-based platform, the ETS Platform for Collaborative Assessment and Learning (EPCAL; Hao et al., 2017), to support research studies on computer-supported collaboration. EPCAL supports multimodal communication, team management, modularized item/task uploading, and customizable real-time facilitation/intervention, and it captures all of the process data generated during the collaboration. Most importantly, it is integrated with powerful data analytics support from ETS's glassPy data analytics solution (Hao, Smith, Mislevy, von Davier, & Bauer, 2016). These features make it an ideal tool for computer-supported collaboration research at scale. It is worth noting that the platform is not a commercial product at this stage; rather, it is a research tool that ETS would like to share with the research community for collaboration.
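To give a concrete sense of what analyzing such process data can look like, the snippet below is a minimal sketch using pandas under an assumed event schema (it is not glassPy's actual interface): it counts the events of each type that each participant produced, a typical first step before deriving richer collaboration indicators.

    import pandas as pd

    # Hypothetical process-data records of the kind a collaboration platform
    # might export; the schema is illustrative, not EPCAL's actual format.
    events = pd.DataFrame([
        {"team_id": "t1", "participant_id": "p1", "event_type": "chat", "timestamp": 0.0},
        {"team_id": "t1", "participant_id": "p2", "event_type": "chat", "timestamp": 2.5},
        {"team_id": "t1", "participant_id": "p1", "event_type": "response", "timestamp": 5.0},
        {"team_id": "t1", "participant_id": "p2", "event_type": "chat", "timestamp": 7.1},
    ])

    # Count how many events of each type each participant produced.
    summary = (events
               .groupby(["team_id", "participant_id", "event_type"])
               .size()
               .unstack(fill_value=0))
    print(summary)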

We organized a similar workshop at CSCL 2017, which was met with tremendous enthusiasm from peer researchers and fostered collaborations that led to several joint research proposal submissions. We are now organizing a workshop at CSCL 2019 again, with our fully upgraded EPCAL platform and a variety of task prototypes in different domains. In particular, we encourage participants to bring their own ideas and to share their research with other participants. Participants will team up in dyads or triads to work on various collaborative assessment prototypes developed at ETS, giving them hands-on experience with using the platform to scale up their studies in the future. We will facilitate discussions on common challenges and solutions in designing collaborative learning and assessment environments and on how the EPCAL platform can support the various needs of such designs. The planned activities include the following:

  • Introduction to the EPCAL platform, its data analytics, and its AI-based components
  • Examples of collaborative prototypes on the EPCAL platform
  • Hands-on session in which participants team up into dyads or triads to complete tasks hosted on EPCAL
  • Presentations from participants about their relevant research
    • We have planned several slots for participants to introduce their relevant research to other participants.
    • Please respond to the form below, and we will contact you regarding the arrangements once we have a count of the presentations.
  • Discussions

In the discussion portion of the workshop, we will discuss how EPCAL can help scale up research on collaborative learning and assessment, and we strongly encourage new ideas and potential collaborations among the participants.

Intended Audience

The intended audience of this workshop includes researchers, learning and assessment environment designers, and educational practitioners (e.g., teachers) who are interested in computer-supported learning, teaching, and assessment and who conduct research projects and/or empirical studies on collaboration and communication in disciplines such as computer science, cognitive science, the learning sciences, social science, technology studies, educational technology, and applied educational science.

Expected Outcomes

  • We hope participants will learn about the capabilities of the EPCAL platform and consider whether it can be used in their future research.
  • We also want to hear from participants about what additional features they would like the EPCAL platform to have, which will guide our further development of the platform.

Equipment Needed

  • Workshop participants need a laptop that can connect to the internet through Wi-Fi.
  • Please install the Chrome browser before coming to the workshop.

Contact Information

  • Please direct any questions and inquiries to: Jiangang Hao, jhao@ets.org

Tell us if you will participate (please note that you also need to register for the workshop through the conference registration website)

Organizer Information

Dr. Jiangang Hao is a senior research scientist in the Center for Next Generation Psychometrics and Data Science at Educational Testing Service. His current research centers on collaborative problem solving, game- and simulation-based assessment, educational data mining and analytics, and automated annotation/scoring. He is the principal investigator of ETS's collaborative science assessment prototype (ECSAP) as well as the ETS platform for collaborative assessment and learning (EPCAL). He leads the development of the assessment data analytics solution (glassPy) at ETS and is the recipient of the 2015 ETS Presidential Award. Dr. Hao obtained his Ph.D. in Physics and M.A. in Statistics, both from the University of Michigan. He has published over 50 peer-reviewed papers and has developed several widely used software packages in Python and C++ for image analysis, measurement-error-corrected Gaussian mixture modeling, and probabilistic clustering analysis. His work has been widely reported by leading technology media such as Wired and MIT Technology Review. Email address: jhao@ets.org

Dr. Jessica Andrews-Todd is a research scientist in the Cognitive and Technology Sciences Center at Educational Testing Service. Dr. Andrews-Todd obtained a Ph.D. in learning sciences from Northwestern University. Her research has focused on collaborative learning and assessment of collaborative skills. Her research on learning examines how individuals acquire accurate and inaccurate information as a function of their collaborative experiences. In her assessment work, she has developed and applied approaches for (1) identifying measurable components of complex constructs such as collaborative problem solving and (2) reasoning up from learner behaviors in fine-grained log data from games and simulations to inferences about high-level proficiencies. Her research has been funded by the National Science Foundation, Institute of Education Sciences, and the Gordon Commission/MacArthur Foundation. Email address: jandrewstodd@ets.org

Dr. Diego Zapata-Rivera is a Principal Research Scientist in the Cognitive and Technology Sciences Center at Educational Testing Service in Princeton, NJ. He earned a Ph.D. in computer science (with a focus on artificial intelligence in education) from the University of Saskatchewan in 2003. His research at ETS has focused on the areas of innovations in score reporting and adaptive learning and assessment environments. He leads a project aimed at exploring the effectiveness of various strategies for facilitating collaborative problem solving in science. Dr. Zapata-Rivera has produced over 100 publications including journal articles, book chapters, and technical papers. He has served as a reviewer for several international conferences and journals and has been a committee member and organizer of international conferences and workshops in his research areas. He is a member of the Editorial Board of User Modeling and User-Adapted Interaction and an Associate Editor of the IEEE Transactions on Learning Technologies Journal. Email address: dzapata@ets.org

Dr. Mengxiao Zhu is a research scientist in the Center for Next Generation Psychometrics and Data Science at Educational Testing Service in Princeton, NJ. She obtained a Ph.D. in Industrial Engineering and Management Sciences from Northwestern University in 2012. Her research focuses on psychometrics for the new generation of assessments, including psychometric models for collaborative problem solving, data mining techniques applied to assessment data, and the integration of cognitive science with psychometrics. Her work on collaborative problem solving explores how individual-level variables, such as individual knowledge and personality, are related to collaborative performance. She also works on assessing individual and team collaborative problem-solving skills through process data analysis. Her research projects have been supported by ETS research initiatives, the National Science Foundation, and the Army Research Institute. Email address: mzhu@ets.org

References

Fiore, S. M., Graesser, A., Greiff, S., Griffin, P., Gong, B., Kyllonen, P., et al. (2017). Collaborative problem solving: Considerations for the National Assessment of Educational Progress.

Griffin, P., McGaw, B., & Care, E. (2012). Assessment and teaching of 21st century skills. Springer.

Graesser, A. C., Foltz, P. W., Rosen, Y., Shaffer, D. W., Forsyth, C., & Germany, M.-L. (2018). Challenges of assessing collaborative problem solving. In Assessment and teaching of 21st century skills (pp. 75–91). Springer.

Graesser, A. C., Li, H., & Forsyth, C. (2014). Learning by communicating in natural language with conversational agents. Current Directions in Psychological Science, 23, 374-380.

Hao, J., Liu, L., von Davier, A. A., Lederer, N., Zapata-Rivera, D., Jakl, P., & Bakkenson, M. (2017). EPCAL: ETS platform for collaborative assessment and learning. ETS Research Report Series, 2017(1), 1–14.

Hao, J., Smith, L., Mislevy, R., von Davier, A., & Bauer, M. (2016). Taming log files from game and simulation-based assessment: Data model and data analysis tool (ETS Research Report No. RR-16-11). Princeton, NJ: Educational Testing Service.

Organization for Economic Co-operation and Development. (2013). PISA 2015 draft collaborative problem-solving assessment framework. OECD Publishing.

Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative learning: An historical perspective. In Cambridge handbook of the learning sciences (pp. 409–426). Cambridge University Press.

World Economic Forum. (2015). New vision for education: Unlocking the potential of technology. Geneva, Switzerland: World Economic Forum.