TAEMILE Project

TAEMILE: Towards Automating Experience Management in Interactive Learning Environments

Playable Demo

Unity web player not available at the moment.
Note: The current builds contain telemetry collection. When the system prompts for an ID, leave the field blank and press "continue".
This software is distributed for free for educational and research purposes. The software is provided "as is," with all faults, defects and errors, and without warranty of any kind. We cannot warrant that the software will be free of bugs, errors, viruses or other defects, and we shall have no liability of any kind for the use of or inability to use the software, the software content or any associated service. Telemetry will be collected and sent to a data collection server. The collected data is limited to the in-game behavior and screen resolution. By downloading and executing this application you acknowledge this notice and permit the collection of this data. This software contains code by third parties. Built with Unity. All assets are copyrighted 2014-2016 Drexel University. Trademarks and brands are the property of their respective owners.

Dataset

A limited version of the dataset may be available to select parties; contact us for more information.

Additional information

Publications

A selection of the project's publications:
  • J. Valls-Vargas, A. Kahl, J. Patterson, G. Muschio, A. Foster, J. Zhu (2015). Designing and Tracking Play Styles in Solving the Incognitum. GLS 2015. [PDF pp. 241-247]

    @inproceedings{valls2015gls,
      Author = {Valls-Vargas, Josep and Kahl, Andrew and Patterson, Justin and Muschio, Glen and Foster, Aroutis and Zhu, Jichen},
      Booktitle = {Proceedings of the Games + Learning + Society Conference},
      Keywords = {educational games, player modeling, ILE},
      Organization = {Games + Learning + Society},
      Pages = {241--247},
      Title = {Designing and Tracking Play Styles in Solving the Incognitum},
      Year = {2015},
    }

Technical information

The game environment has been designed to record game events in a log file and transmit game telemetry. The telemetry data captures movement and interaction variables and can be used to recreate the play session. The events recorded are:
  • Close: close a card
  • End: finish the game
  • Focus: open a card
  • HelpCardShown: a tutorial item is shown on screen
  • InfoBox: the card is visible
  • InventoryNew: a new exhibit was visited
  • KeyDown: a keyboard key was pressed
  • KeyUp: a keyboard key was released
  • Look: looked at an object in the environment (not only exhibits)
  • MapHoover: the mouse starts hovering over an item in the concept map
  • MapHooverOut: the mouse stops hovering over an item in the concept map
  • MastodonAll
  • MastodonNew: a new mastodon bone was collected
  • MouseDown: a mouse button was pressed
  • MouseMoveD: mouse movement
  • MouseMoveS: mouse movement
  • MouseMoveV: mouse movement
  • MouseUp: a mouse button was released
  • Navigate: the player started walking
  • PauseMenu: open/close the concept map
  • QuestionAsk: a question is asked
  • QuestionCardAnswerFalse: a question was answered incorrectly
  • QuestionCardAnswerTrue: a question was answered correctly
  • Track: activity, position, camera rotation and heading every 500ms
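The concrete on-disk log format is not documented here; as a minimal sketch, a logger for events like the ones above (the class name, record layout, and in-memory storage are all assumptions for illustration) might look like:

```python
import time


class TelemetryLogger:
    """Minimal sketch of an event logger; the real TAEMILE log format is not documented here."""

    def __init__(self):
        self.events = []  # list of (timestamp, event name, optional parameter)

    def log(self, event, parameter=None):
        """Record one game event, with an optional parameter such as a card name."""
        self.events.append((time.time(), event, parameter))


logger = TelemetryLogger()
logger.log("Focus", "Jefferson")  # a card was opened
logger.log("Navigate")            # the player started walking
logger.log("Close", "Jefferson")  # the card was closed
```

In a real build the recorded tuples would be flushed to the log file and transmitted to the data collection server rather than kept in memory.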
Some of the events carry an additional parameter; here are some examples:
  • QuestionCardAnswerTrue/Jefferson: the current card
  • MastodonNew/Portrait: the type of exhibit
  • MouseMoveV/0.5640188: movement speed
  • PauseMenu/Pause: opening the concept map
  • HelpCardShown/6: the sixth card of the tutorial is shown
  • MouseDown/0: the left mouse button has been pressed
  • Close/MastodonBone4: the current card
  • MouseUp/1: the right mouse button has been released
  • PauseMenu/Unpause: closing the concept map
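Judging from the examples above, each record is an event name optionally followed by a slash-separated parameter. A small helper to split such records (the function name is an assumption) could be:

```python
def parse_event(record):
    """Split an 'Event/Parameter' record; events without a parameter yield None."""
    name, sep, param = record.partition("/")
    return name, (param if sep else None)


# Examples from the list above:
parse_event("QuestionCardAnswerTrue/Jefferson")  # -> ("QuestionCardAnswerTrue", "Jefferson")
parse_event("MouseDown/0")                       # -> ("MouseDown", "0")
parse_event("KeyDown")                           # -> ("KeyDown", None)
```

`str.partition` splits at the first slash only, so a parameter could itself contain slashes without breaking the parse.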
The log files are segmented into chunks using different strategies, and each chunk is transformed into a feature vector with the following features:
  • # of transitions between quest and optional lines (by exhibits belonging to lines)
  • # of lines visited (by exhibits belonging to lines)
  • minimum amount of time spent reading a card
  • maximum amount of time spent reading a card
  • average amount of time spent reading cards
  • minimum amount of time spent reading a question
  • maximum amount of time spent reading a question
  • average amount of time spent reading questions
  • average distance from beacon located in the NW corner of the room
  • average distance from beacon located in the SE corner of the room
  • average distance from beacon located in the NE corner of the room
  • distance walked (W)
  • average walking speed (W/X)
  • time of the current segment (X)
  • time spent in cards (reading and answering questions) (A)
  • time spent checking the concept map (quest tracking tool) (B)
  • time spent navigating the environment (not in cards nor in concept map) (C)
  • ratio of time spent in cards (A/X)
  • ratio of time spent checking the concept map (B/X)
  • ratio of time spent navigating (C/X)
  • # of times the concept map (quest tracking tool) has been opened (M)
  • # of items hovered with the mouse while in the concept map (N)
  • # of times an item in the quest lines has been hovered (R)
  • ratio of times hovering a quest item and non-quest items (R/N)
  • # of different items hovered
  • # of items hovered for less than 500ms
  • average time an item is hovered on with the mouse (T/(R-S))
  • average # of items hovered each time the concept map is opened (N/M)
  • average time spent at the concept map when opened (B/M)
  • # of times the concept map was opened within 5 seconds
  • # of key presses (down) (O)
  • # of mouse button presses (down) (P)
  • # of events recorded (Q)
  • key presses per second (O/X)
  • mouse button presses per second (P/X)
  • events recorded per second (Q/X)
  • # of questions answered correctly (D)
  • # of questions answered incorrectly (E)
  • # of questions answered correctly in a row
  • ratio of questions answered correctly (D/(D+E))
  • ratio of questions answered incorrectly (E/(D+E))
  • # of words shown in cards
  • average reading speed for each visit in a card
  • average reading speed for visits and revisits in cards (i.e. revisiting an exhibit increases the average)
  • # of words shown in questions
  • average reading speed for each visit in a question
  • average reading speed for visits and revisits in questions (i.e. revisiting an exhibit increases the average)
  • # of new items visited (F)
  • # of items revisited (G)
  • # of visits (F+G)
  • # of new questions visited (H)
  • # of questions revisited (I)
  • # of questions (H+I)
  • time spent in exhibits completing a quest (J)
  • time spent in exhibits in quest lines (K)
  • time spent in exhibits in optional lines (L)
  • ratio of time spent in exhibits that complete a quest (J/A)
  • ratio of time spent in exhibits in quest lines (K/A)
  • whether one of 10 specific patterns of interaction (sequences of 3 to 7 actions) happened (e.g., open concept map, walk, interact with an exhibit on a quest line, open the concept map)
  • some additional identification related to the session (e.g., telemetry ID) and gameplay (e.g., screen resolution, date/time started)
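As a rough sketch of the feature-extraction step, using the X/A/B/C notation above and assuming each chunk is a list of timestamped samples already categorised as being in a card, in the concept map, or navigating (that categorisation, and all names below, are assumptions):

```python
def time_features(chunk):
    """Compute segment time X, time in cards A, in the concept map B, navigating C,
    and the ratios A/X, B/X, C/X for one chunk of (timestamp, state) samples.
    `state` is one of 'card', 'map', 'navigate' (an assumed categorisation)."""
    if len(chunk) < 2:
        return {}
    X = chunk[-1][0] - chunk[0][0]  # duration of the current segment
    totals = {"card": 0.0, "map": 0.0, "navigate": 0.0}
    # Attribute each interval between consecutive samples to the earlier state.
    for (t0, state), (t1, _) in zip(chunk, chunk[1:]):
        totals[state] += t1 - t0
    A, B, C = totals["card"], totals["map"], totals["navigate"]
    return {"X": X, "A": A, "B": B, "C": C,
            "A/X": A / X, "B/X": B / X, "C/X": C / X}


# A 10-second chunk: 3s in cards, 1s in the concept map, 6s navigating.
chunk = [(0.0, "navigate"), (2.0, "card"), (5.0, "map"),
         (6.0, "navigate"), (10.0, "navigate")]
features = time_features(chunk)
```

The remaining counts and ratios (key presses per second, question accuracy, reading speeds, and so on) would be accumulated over the same chunk in the same pass.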