Knowledge Extraction from Games Workshop
The second Knowledge Extraction from Games (KEG-19) workshop is now complete.
The papers have been published through CEUR and can be found here: link
The morning and afternoon presentation sessions were both recorded and can be found on YouTube:
morning sessions (including talks on knowledge extraction for play and design)
afternoon sessions (including the invited talks and talks on knowledge extraction for hybrid purposes)
We very much appreciate everyone who took part, and we hope to see you at next year's KEG!
Welcome to the second Knowledge Extraction from Games (KEG-19) workshop at the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19).
Knowledge Extraction from Games (KEG) is a workshop exploring questions of and approaches to the automated extraction of *knowledge* from games. We use “knowledge” in the broadest possible sense, including but not limited to design patterns, game rules, character graphics, environment maps, music and sound effects, high-level goals or heuristic strategies, transferable skills, aesthetic standards and conventions, or abstracted models of games.
- November 12: Submissions due to organizers (via EasyChair)
- November 27: Notifications to authors
- November 30: Early Registration Deadline
- December 8: Return camera-ready papers to organizers
- January 27: Workshop
Please see the full call for papers for more details.
Media Partner: MAKE
Machine Learning and Knowledge Extraction (MAKE) (http://www.mdpi.com/journal/make) is an interdisciplinary, cross-domain, peer-reviewed, open access scholarly journal that provides a platform for the international machine learning community. It publishes original research articles, reviews, tutorials, research ideas, short notes, and Special Issues focused on machine learning and its applications. Papers that address fundamental research questions toward usable computational intelligence are very welcome.
The journal is free for readers, and publication is free for well-prepared manuscripts submitted in 2018 and 2019.