About


Call for Participation

Joseph Jay Williams, Sasha Poquet, and Neil Heffernan are holding a workshop on online educational experimentation on Saturday, June 30 at AIED.


AIED (June 30): Design and Application of Collaborative, Dynamic, Personalised Experimentation (tiny.cc/edexpt) is a post-conference workshop at AIED2018 (the 19th International Conference on Artificial Intelligence in Education). The workshop will focus on the design and application of randomized experimental comparisons of how components of digital problems impact students' learning and motivation. This half-day workshop takes place on June 30th in London. Read the full workshop description here.


If you are interested in participating, please send an inquiry email expressing your interest to aied2018experiments@googlegroups.com as soon as possible.

Feel free to include your website or a link to your papers, and any other information about why you're interested or what you might like to get out of the workshop.

We will respond to these email inquiries on a rolling basis.

Participation is limited to 20 people, so we will respond to your email inquiry beforehand about the likely fit of your submission.

Examples of Potential Collaborative Experiments

The goal of the workshop is for participants to design experiments that could be deployed in real online problems, in settings such as ASSISTments, K-12 online math homework, programming MOOCs, and online quizzes for on-campus courses.

Feel free to specify in your inquiry email or paper the educational settings and contexts you are interested in, which variables for personalisation are relevant to you, and which experimental factors/conditions you want to investigate.

Example problem with a motivational message in Khan Academy

Example problem in ASSISTments with hints, reflection prompts, and motivational messages

Workshop In-Brief

The workshop will explore how to enhance components of digital problems to increase students' learning and motivation. The focus is on the components of widely used online problems, like prompts for students to reflect (1, 2), hints, explanations, motivational messages, and feedback. These can be implemented within online activities where it is possible to conduct online experiments. Examples of such activities include problems on the www.assistments.org platform for middle school math, quizzes in on-campus university courses, and Massive Open Online Courses (MOOCs).

Applying dynamic experimentation in online problems is fruitful, as components of online problems are especially germane:
(1) They are ubiquitous across a wide range of educational settings, topics, and age groups.
(2) There are immediate dependent measures of engagement (time spent on problems, repeated attempts) and learning (accuracy and time needed to solve future near and far transfer problems).
(3) A wide range of variables can be experimentally investigated in enhancing online problems, through appropriate design of hints (3), explanations (4), and learning tips (5).

Despite the extensive research literature showing that high-quality support in problems can benefit learning, there are many open questions about how to provide the best instructional support and feedback in interactive problems (see 6 for a review), and a great deal of work to be done in testing the extent to which these broad principles apply to specific real-world courses and contexts.

However, conducting randomized comparisons raises two challenges: minimizing the chance that students are disadvantaged by receiving conditions that are bad for learning, and maximizing the chance that data from experiments leads to practical improvements for future students. One methodology the workshop will introduce is dynamically adapting experiments by analyzing data in real time and weighting randomization, so that the probability of assigning a student to a condition is proportional to the probability that the condition is best for them (i.e., leads to the highest learning or engagement).
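As a concrete illustration of this kind of weighted randomization, the sketch below uses Thompson sampling with a Beta-Bernoulli model, a standard way to make assignment probabilities track the probability that each condition is best. This is an assumption for illustration only: the workshop description does not prescribe a specific algorithm, and the condition names, outcome model, and success rates here are hypothetical.

```python
import random

class ThompsonSampler:
    """Dynamically weighted randomization: each condition's success rate
    (e.g., whether a student solves the next problem) gets a Beta prior,
    and students are assigned by sampling from the posteriors."""

    def __init__(self, conditions):
        # Beta(1, 1) uniform prior: [successes + 1, failures + 1]
        self.params = {c: [1, 1] for c in conditions}

    def assign(self):
        # Draw a plausible success rate for each condition from its
        # posterior; assign the student to the condition with the
        # highest draw. Over time this concentrates assignments on
        # the condition most likely to be best.
        draws = {c: random.betavariate(a, b)
                 for c, (a, b) in self.params.items()}
        return max(draws, key=draws.get)

    def update(self, condition, success):
        # Record the observed outcome (True = success, False = failure).
        self.params[condition][0 if success else 1] += 1

# Simulated usage with hypothetical conditions and success rates:
# the "hint" condition truly helps more, so most students should
# end up assigned to it as evidence accumulates.
random.seed(0)
sampler = ThompsonSampler(["hint", "no_hint"])
true_rates = {"hint": 0.7, "no_hint": 0.5}
counts = {"hint": 0, "no_hint": 0}
for _ in range(500):
    condition = sampler.assign()
    counts[condition] += 1
    sampler.update(condition, random.random() < true_rates[condition])
```

Compared with a fixed 50/50 split, this design exposes fewer students to the weaker condition while still collecting the data needed to compare conditions.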

The workshop participants will join forces in designing and exploring possibilities of applying dynamic experimentation in online education. Among outcomes, we imagine information sharing, fostering new collaborations, and identifying the relevant knowledge, technology, and relationships that can propel the development of dynamic experiments in educational settings.

Sign up for more information

AIED18 EduExpt Workshop

Extended Abstract

Williams, J. J., Heffernan, N., & Poquet, O. (2018). Design and Application of Collaborative, Dynamic, Personalised Experimentation. Workshop conducted at the 19th International Conference on Artificial Intelligence in Education. London, UK. [PDF]


