*** ATTENTION *** The course is likely to be oversubscribed. We plan to let students know by the evening of Thursday, Sep 4 whether they are in the class for credit or as a listener. If you are interested in taking the course for credit or as a listener, you must fill out THIS FORM by 12:30pm on the first day of class. Many listeners can be accommodated, but for-credit spots are very limited.
The first day of class will be 10am-noon on Wed Sep 3, 2024 at the MIT Media Lab (75 Amherst St, Cambridge, MA), Room E14-633. (The location will move to E15-341 after the first day.) Access to the building is restricted to people with an MIT ID or TIM Tickets. If you don't have access, email r-admin@media.mit.edu for help.
All are invited to attend the first day. Our plan is to notify students who filled out the above form within 24 hours of the first day whether they are invited to take the class for credit or as a listener.
Wednesdays, 10am-noon, Sep 3 - Dec 10, 2024 (except Media Lab Members-Week Oct 9)
Class meets in person in the Media Lab (75 Amherst St; Cambridge, MA)
To reach the course staff, email: ACMMI-staff at media dot mit dot edu
Put ACMMI in the subject line to speed the reply.
Affective Computing was born at the MIT Media Lab and is now an internationally recognized field that includes an IEEE journal (Transactions on Affective Computing), an international conference (see ACII 2019, ACII 2021, and ACII 2023, hosted at the MIT Media Lab), an international professional association you are invited to join (the Association for the Advancement of Affective Computing), and perhaps the most widely viewed (but not scientific) contribution: an emotion detector for Sheldon on The Big Bang Theory.
Picard, R.W. (2000). Affective Computing. The MIT Press.
Calvo, R.A., D'Mello, S.K., Gratch, J., and Kappas, A. (2015). The Oxford Handbook of Affective Computing. Oxford University Press.
Other readings will be handed out as needed.
How can wearables/mobile phones recognize your affective state? And, when is this desirable? (Or not?)
How can we build multimodal technologies to predict and prevent unwanted states like anxiety or depression?
How can your emotion be manipulated, and how might this change what you buy and pay?
How can a (robot, agent, conversational bot) better recognize multimodal human non-verbal communication?
How can an interactive agent communicate through multimodal behaviors, and when is this desirable?
Why is it a smart idea to have fun (yay!) before you do creative work - how do emotions impact cognition?
How do we prevent misuses of affective and multimodal sensing, without hurting innovation and good uses?
How can skills of emotional intelligence improve robotics and HCI - or entice people to give away data?
How does the "most reliable" lie detection work and how is it kept from being used in harmful ways in daily life?