What we do
Our research addresses central problems in computer science and AI by developing intelligent interactive systems that collaborate effectively and adaptively with humans, combining a variety of interaction modalities such as speech, graphics, gesture, vision, and natural language. Our methods combine statistical and symbolic information processing, and we develop data-driven machine learning approaches to build robust agents that adapt autonomously in uncertain and dynamic interactions. We apply these techniques in a variety of domains, including conversational interfaces, mobile search, wearable technology, technology-enhanced learning, healthcare informatics, and human-robot interaction, and we evaluate the performance of our models and algorithms both in simulation and in trials with real users.
We organised INLG 2016 in Edinburgh, the 1st Workshop on Data-to-text Generation, the workshop on Spatial Reasoning and Interaction for Real-World Robotics at IROS 2015, and the 2018 SICSA workshop on Conversational AI.
Some of our research projects:
- machine learning for socially intelligent Human-Robot Interaction (MuMMER, EC Horizon 2020)
- learning domain-general / wide-coverage spoken dialogue systems: BABBLE (EPSRC)
- Natural Language Generation: the GUI project (Generation for Uncertain Information, EPSRC), DILiGENt (Domain-Independent Language Generation, EPSRC), and REGIME (REport GeneratIon from Meta-data for Autonomous Systems, Dstl)
- a fully statistical approach for adaptive and incremental Spoken Dialogue Systems (CLASSiC and PARLANCE projects, EC FP7)
- related research topics: Reinforcement Learning, User Simulations, Information Presentation, Natural Language Generation, POMDPs, and incremental dialogue processing
- socially intelligent human-robot interaction: the JAMES project (EC, FP7): video
- adaptive spoken interaction for an intelligent city guide: the SpaceBook project (EC, FP7), SpeechCity Impact Acceleration Grant (EPSRC)
- Human-Robot Interaction for teaching and learning: the EMOTE project (EC, FP7)
- an intelligent multimodal interface for communication skills learning in children (ECHOES project, ESRC/EPSRC TEL)
- combining multitouch, vision, planning, and animation for intelligent interactions with virtual characters.
- integrating symbolic and statistical methods for dialogue processing
- adaptive interfaces for patient monitoring and treatment (Help4Mood, EC FP7)
- autonomously adapting to different types of user during interaction.
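As a flavour of the statistical approach described above, here is a minimal, purely illustrative sketch of a reinforcement-learning dialogue policy trained against a simulated user. The states, actions, and rewards are invented for this sketch and are not taken from any of the projects listed; real systems (e.g. POMDP-based ones) operate over far richer, uncertain state spaces.

```python
import random

# Toy dialogue MDP: the agent must fill two slots (e.g. cuisine, area)
# before confirming. States, actions, and rewards are invented for
# illustration only.
STATES = ["no_slots", "one_slot", "both_slots", "done"]
ACTIONS = ["ask_slot", "confirm"]

def simulated_user_step(state, action):
    """A trivial user simulation: asking fills a slot; confirming
    succeeds only once both slots are filled."""
    if action == "ask_slot":
        if state == "no_slots":
            return "one_slot", -1.0, False
        if state == "one_slot":
            return "both_slots", -1.0, False
        return state, -1.0, False          # redundant question
    # action == "confirm"
    if state == "both_slots":
        return "done", 20.0, True          # task success
    return state, -5.0, False              # premature confirmation

def train(episodes=2000, alpha=0.2, gamma=0.95, epsilon=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy exploration policy."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        state, done = "no_slots", False
        while not done:
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt, reward, done = simulated_user_step(state, action)
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next
                                           - q[(state, action)])
            state = nxt
    return q

q = train()
# The learned greedy policy: ask until both slots are filled, then confirm.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

The same training loop works unchanged if the hand-coded user simulation is replaced by one learned from dialogue data, which is the usual set-up in this line of research.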
We are interested in joining further international and national research collaborations in the above areas.
We are part of the Intelligent Systems Lab, and SICSA (the Scottish Informatics and Computer Science Alliance). We are currently funded by grants from the EPSRC and the European Commission (Horizon 2020).
RA position available!
New £31M Hub funded: ORCA.
We are through to the 2018 Amazon Alexa Prize finals!
Best paper award at Robo-NLP 2017
Oliver Lemon is a keynote speaker at SIGDIAL 2017
Yannis Konstas will be joining our group as an assistant professor!
Our team was chosen by Amazon to enter the Alexa challenge!
Job opening with closing date 27 June 2016.
Pepper robot has arrived! (June 2016)
Papers accepted (2016): AAMAS, HRI, ACL, SemDial, LREC, FUZZ-IEEE, IWSDS, SemEval, CSL, ECAI, SIGDIAL, RO-MAN, INLG.
Eshrag Refaee and Verena Rieser are winners of SemEval'16 challenge (sub-task 7)
Horizon 2020 project on socially intelligent robots (MuMMER) starting in 2016! Postdoc positions available.
New EPSRC project (2015) funded on domain-general spoken dialogue systems: BABBLE
January 2015: FurHat has arrived!
2 new EPSRC projects on NLG funded
ECHOES project rated "outstanding" by ESRC (June 2014)
The Interaction lab is jointly organising a workshop on Multimodal HRI at ICMI 2014
James Watt studentships available.
BBC article on Parlance project, "How to turn Siri into Samantha", Feb 2014
SemDial'14 is hosted by iLab.
MSc in AI with Speech and Multimodal Interaction starting Sept 2014.
Welcome to Katrin Lohan, our new lab member (21/10/13)
2 papers accepted for ICMI 2013
5 papers accepted for SIGDIAL 2013
Jason Williams from Microsoft Research visited us on 14/6/13
Oliver Lemon joins Advisory Board for Dialog State Tracking Challenge II (2013 -)
ENLG 2013 paper accepted.
Verena Rieser is area chair for discourse & dialogue for EACL'14.
Mary Ellen Foster's paper with Ron Petrick from the University of Edinburgh has been named Novel Applications Track Best Paper at ICAPS'13.
2 papers accepted for ACL 2013
25/3/13 Verena Rieser elected to the Scientific Advisory Committee of SIGDIAL
Jan 2013: Verena Rieser elected to the board of SIGGEN
6/11/12: SpaceBook and JAMES at the SICSA DemoFest
1/11/12: New book published.
1/10/12: Welcome to our new Research Fellow Heriberto Cuayáhuitl
1/10/12: Welcome to our new PhD students Kevin O'Connor and Ioannis Efstathiou!
Hiring: 2 positions in Multimodal Interfaces/HRI/NLP (July 2012)
Hiring: new RA post in NLP and dialogue (June 2012)
ECHOES project: video
Human-Robot Interaction: video
May 2012: "Call of the Wired" covers Conversational Interface work at HWU and Cambridge
Faculty positions in Computer Science: join us!
New EC FP7 project Parlance started November 1st.
New forum for recruiting subjects at Heriot-Watt.
3/11/11 Invited talk introducing Machine Learning to medics and psychologists at the Western General Hospital. Slides available.
1/11/11 Verena Rieser joined as a new faculty member.
1/11/11 ECHOES project at ESRC Festival of Social Science
Trialling a new spoken dialogue system for Edinburgh restaurants. ABC SDS prize draw.
Dialogue data released from CLASSiC project in our archive.
Heriot-Watt has been named as Scottish University of the Year 2011/12 by the Sunday Times.
Hosting SICSA workshop on Affective Interaction and Virtual Characters, 17 June 2011
Special issue of ACM TSLP on adaptive dialogue is out now!
2 papers accepted for SIGDIAL 2011
New projects JAMES and SpaceBook kick off this month (March 2011).
Outreach activity (4/3/2011): Tommy the Robot visits local school.
New project Help4Mood kicks off (January 2010).
ECHOES project features in "Britain in 2010" magazine (p. 94)
Interaction Lab joins the Vision and Language Network (Feb 2010)
The new Interaction Lab at Heriot-Watt up and running (posted 9/10/09).