Assessment: 50% exam and 50% coursework. All students sit the same exam. Masters students are expected to demonstrate more extensive critical awareness and understanding of the course material in their coursework project (see below). Coursework is marked as 80% for the final report, 10% for the presentation, and 10% for a self-reflection report on the group collaboration process (see below).
Coursework -- group project: design / analyse / develop / evaluate (see below) a novel user interface that uses multiple modalities for interaction (i.e. two or more of: graphics, speech, audio, tactile, gesture, movement, ...). For example: use speech on Android, OpenDial, or IrisTK to design a new spoken dialogue system; use a Leap Motion, Kinect, or touch-table to design a new gesture-controlled interface; combine this with the Oculus Rift virtual reality system; use the NAO or FurHat to create a human-robot interaction system; or design a location-based mobile app that tells users (using speech) what amenities are nearby, perhaps based on a user model, or plays location-based songs, or ... be creative!
The project theme for this year is interfaces for the "smart home", see e.g. http://en.wikipedia.org/wiki/Home_automation and the video links below!
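To make this concrete, here is a minimal Python sketch of the kind of command interpretation that sits at the core of a smart-home spoken interface. It is illustrative only: the device names, intents, and replies are assumptions made up for this example, not part of any toolkit listed on this page, and a real project would feed it output from a speech recogniser and connect it to a dialogue manager such as OpenDial or IrisTK.

    # A toy keyword-based interpreter for smart-home voice commands.
    # Everything here (device names, actions, replies) is an illustrative
    # assumption, not part of any toolkit listed on this page.

    DEVICES = {"lights", "heating", "tv", "blinds"}
    ACTIONS = {
        "on": ("turn_on", "Okay, turning the {} on."),
        "off": ("turn_off", "Okay, turning the {} off."),
        "up": ("increase", "Okay, turning the {} up."),
        "down": ("decrease", "Okay, turning the {} down."),
    }

    def handle_utterance(text):
        """Map a recognised utterance to an (intent, device) pair and a spoken reply."""
        words = text.lower().split()
        device = next((w for w in words if w in DEVICES), None)
        action = next((ACTIONS[w] for w in words if w in ACTIONS), None)
        if device is None or action is None:
            return None, "Sorry, I didn't understand. Which device do you mean?"
        intent, reply = action
        return (intent, device), reply.format(device)

    if __name__ == "__main__":
        # In a real project the text would come from a speech recogniser
        # (e.g. the Android speech API or KALDI) and the reply would be
        # sent to a synthesiser (e.g. CereProc).
        for utterance in ["please turn the lights on", "heating down a bit", "open the window"]:
            print(utterance, "->", handle_utterance(utterance))

Your project should go well beyond a sketch like this, e.g. by adding a second modality (gesture, touch, location), proper dialogue management, and an evaluation.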
Equipment that you can use:
Your project must cover several of the following aspects:
Possible group-member roles:
* = essential role (some team member(s) must do this!)
(many people will have several roles, and most projects will not have all of these roles):
Deliverables: (submit all of these via email to the lecturers, with the email subject line: Name, group name, deliverable name)
Self-reflection report: this is a short (1-page) report answering the following questions:
How did you plan and manage your own work within the group?
To what extent did you independently solve problems and take initiative within the group?
How did you take responsibility for your own and others' work by contributing effectively and conscientiously to the work of your group?
How did you actively maintain good working relationships with group members?
Did you lead the direction of the group project or any aspect of it?
Critically reflect on your roles and responsibilities within the group, and the roles and responsibilities of the other members.
Guest lectures: NLP - Arash Eshghi; UX - Stephen Denning; HRI - ??; visit to User Vision
Research space and equipment: Google Glass, Oculus Rift, Leap Motion, Kinect v2 dev kit, NAO robot torso, FurHat animated robot head, various eyetrackers, etc
Lectures / labs / class plan:
Tuesday classes: 14.15-15.15 in EM 336; Friday classes: 10.15-12.15 in EM 307 (in weeks 7-9 the 10.15 Friday class is in MGB22)
CereProc: speech synthesiser
Balsamiq: wireframing / mockup tools
NVivo: software for qualitative research
NLTK: Natural Language ToolKit (a short example sketch is given after this list)
CoreNLP: Stanford parsing and NLP tools (a short example sketch is given after this list)
Praat: speech analysis software
ELAN: annotation tool
Android speech API: speech recognition and synthesis on Android
OpenEars: free speech recognition and synthesis on iPhone
Web Speech API: speech recognition and synthesis in Chrome
KALDI: speech recognition toolkit
Boxer: language understanding
Wit.ai: API for spoken language understanding
OpenDial: dialogue system toolkit
IrisTK: multimodal dialogue system toolkit
VoiceXML: markup language for specifying voice dialogues
SPSS: statistical analysis (this is installed on the university computers)
Sirius: open source personal assistant (like Siri)
Estimote iBeacons: micro-location and contextual awareness
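As a small example of what NLTK (listed above) can do for you, the following sketch tokenises and part-of-speech tags a user utterance before it is passed on to your language-understanding component. It assumes NLTK is installed; the example sentence is made up.

    # Tokenise and POS-tag a smart-home command with NLTK.
    # The tokeniser and tagger models are fetched on first use via nltk.download().
    import nltk

    nltk.download("punkt", quiet=True)
    nltk.download("averaged_perceptron_tagger", quiet=True)

    utterance = "Turn the living room lights off at ten"
    tokens = nltk.word_tokenize(utterance)   # ['Turn', 'the', 'living', ...]
    print(nltk.pos_tag(tokens))              # list of (token, POS tag) pairs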
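Similarly, here is a sketch of calling a locally running Stanford CoreNLP server over HTTP from Python. The port, annotator list, and example sentence are assumptions for illustration; the server itself has to be started separately from the CoreNLP distribution.

    # Query a locally running Stanford CoreNLP server for tokens and POS tags.
    # Assumes the server was started separately, e.g.:
    #   java -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000
    import json
    import requests

    props = {"annotators": "tokenize,ssplit,pos", "outputFormat": "json"}
    resp = requests.post(
        "http://localhost:9000/",
        params={"properties": json.dumps(props)},
        data="Dim the bedroom lights to twenty percent".encode("utf-8"),
    )
    for token in resp.json()["sentences"][0]["tokens"]:
        print(token["word"], token["pos"])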
Interaction Lab videos: SpeechCity, JAMES, ECHOES, Parlance
Incremental spoken dialogue system: "Numbers"
Google video on speech understanding, deep neural networks
Wit.ai: API for spoken language understanding
Google Now
Assistant (api.ai)
Siri
Cortana
Indigo
SpeechCity
These might help you think about things you could do for your group projects:
Android home automation: https://www.youtube.com/watch?v=hYMpMt0lwUY
Kinect gesture-based controls: https://www.youtube.com/watch?v=g92MtCEgYSs&feature=youtu.be
Amazon Echo: https://www.youtube.com/watch?v=KkOCeAtKHIc
FurHat robot videos: http://www.speech.kth.se/furhat/videos
Home automation dialogue system from POSTECH: https://www.youtube.com/watch?v=dbOpzjGf17A
iPad graphical interface: https://www.youtube.com/watch?v=2xtPDTBPsTU
VoxCommando Vera: https://www.youtube.com/watch?v=-M7htWfF3tQ
Crestron iPad interface: https://www.youtube.com/watch?v=z9VUuYqazps
OpenHab NAO robot interface (in French): https://www.youtube.com/watch?v=4vBSJ7csp8g