4. Multimodal HMI

Objectives:

  • Research on different input modalities for Human-Machine Interfaces and Human-Machine Interaction: electroencephalography (EEG), electrooculography (EOG), facial electromyography (fEMG), head-movement sensing with an inertial measurement unit (IMU), touch, and vision-based hand tracking;
  • Research on methods to infer the user's mental and emotional states, based on EEG, EOG, EMG and GSR (galvanic skin response);
  • Research on methods capable of automatically detecting user or system errors;
  • Analysis and identification of correlates between mental/emotional states and BCI performance;
  • Evaluation of the above items in different BCI and non-BCI applications: communication devices, brain-actuated wheelchair, games, human-robot interaction, robotic walker, and ambient assisted living scenarios.

Mental and emotional state characterization

Achievements:

  • Design of a protocol to record EEG, EOG, EMG and GSR of users while driving a wheelchair in realistic environments. Collection of biosignal data labeled with event markers associated with navigation situations (door passage, crowded environment, dynamic obstacles, unexpected navigation error, etc.). The experiments were performed with healthy participants and participants with motor impairments recruited from APCC. These datasets will allow the characterization of mental states during wheelchair driving, as well as the inference of correlates between mental states and BCI performance;
  • User state characterization based on eye-movement recording (blinks, saccades and frowns) during several forced contexts (e.g., prolonged use of the BCI until the user is tired). Signals were recorded with EOG and automatically classified (a minimal detection sketch is given after this list);
  • Development of an algorithm based on statistical spatial filtering for the automatic detection of BCI errors, combining P300 and ErrP event-related potentials. This algorithm was applied in the context of the LSC communication speller (a spatial-filtering sketch is also given after this list);
  • Correlation between mental state and BCI performance based on psychometric questionnaires;
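
The following minimal sketch, in Python, illustrates how blinks and saccades might be detected from bipolar EOG channels by simple amplitude thresholding. It is an illustration only: the channel units (volts), the 250 Hz sampling rate, the window length and the thresholds are assumptions, not the parameters of the classifier actually used in the project.

    import numpy as np

    def detect_eog_events(veog, heog, fs=250, blink_thr=200e-6, sacc_thr=80e-6):
        """Label short windows of bipolar EOG with eye-movement events.

        veog, heog : vertical / horizontal EOG channels in volts (assumed units).
        fs         : sampling rate in Hz (assumed 250 Hz).
        Returns a list of (sample_index, label) events.
        """
        events = []
        win = int(0.1 * fs)                      # 100 ms analysis window
        for start in range(0, len(veog) - win, win):
            v = np.asarray(veog[start:start + win])
            h = np.asarray(heog[start:start + win])
            v_amp = np.ptp(v)                    # peak-to-peak amplitude
            h_amp = np.ptp(h)
            if v_amp > blink_thr:                # large vertical deflection -> blink
                events.append((start, "blink"))
            elif h_amp > sacc_thr:               # horizontal deflection -> saccade
                events.append((start, "horizontal saccade"))
            elif v_amp > sacc_thr:
                events.append((start, "vertical saccade"))
        return events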
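
The sketch below shows one common form of statistical spatial filtering for error detection: a spatial filter that maximises a Fisher-type criterion between error and correct epochs is learned, and the spatially filtered epochs are classified with linear discriminant analysis. It is a generic illustration under assumed array shapes and labels, not a reproduction of the project's exact algorithm.

    import numpy as np
    from scipy.linalg import eigh
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def fit_spatial_filter(epochs, labels):
        """Learn a spatial filter separating error from correct epochs.

        epochs : array (n_epochs, n_channels, n_samples), band-passed EEG.
        labels : array (n_epochs,), 1 = error (ErrP present), 0 = correct.
        Returns the filter w (n_channels,) maximising a Fisher-type ratio.
        """
        diff = epochs[labels == 1].mean(axis=0) - epochs[labels == 0].mean(axis=0)
        Sb = diff @ diff.T                          # between-class scatter
        Sw = sum(np.cov(ep) for ep in epochs)       # pooled within-class scatter
        _, evecs = eigh(Sb, Sw)                     # generalized eigenproblem
        return evecs[:, -1]                         # direction with largest ratio

    def train_error_detector(epochs, labels):
        """Project epochs through the spatial filter and train an LDA on them."""
        w = fit_spatial_filter(epochs, labels)
        feats = np.einsum('c,ecs->es', w, epochs)   # (n_epochs, n_samples)
        return w, LinearDiscriminantAnalysis().fit(feats, labels)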

Human-robot interaction/interface based on EOG, facial EMG, IMU and touch, and BCI-games

Achievements:

  • Development of a framework for basic facial expression detection (neutral, happiness, anger, sadness) based on facial EMG, and for eye-movement detection (blinks, vertical and horizontal saccades) and gaze tracking based on EOG;
  • Algorithm that decouples EMG and EOG so that ocular saccade detection is deactivated during a facial expression event, and vice versa, avoiding crosstalk effects. Currently implemented with a sliding window for snapshot classification (a gating sketch is given after the figure below);

Fig. Facial expression setup and Avatar morphs (eye movements and facial expressions: neutral, angry, happy, sad, frowning).
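
A minimal sketch of the sliding-window gating idea in Python: in each window the facial-EMG activity is checked first and, when it dominates, any ocular (EOG) decision is suppressed, and vice versa. The channel units, sampling rate, window lengths and thresholds below are illustrative assumptions, not the values used in the implemented system.

    import numpy as np

    def decoupled_decisions(emg, veog, heog, fs=250, win_s=0.5, step_s=0.1,
                            emg_thr=50e-6, eog_thr=80e-6):
        """Snapshot classification over a sliding window with EMG/EOG gating."""
        win, step = int(win_s * fs), int(step_s * fs)
        decisions = []
        for start in range(0, len(emg) - win, step):
            stop = start + win
            emg_rms = np.sqrt(np.mean(np.asarray(emg[start:stop]) ** 2))
            if emg_rms > emg_thr:
                # Facial expression dominates: EOG output is deactivated here.
                decisions.append((start, "facial_expression"))
            else:
                # No facial activity: interpret EOG deflections as eye movements.
                v_amp = np.ptp(np.asarray(veog[start:stop]))
                h_amp = np.ptp(np.asarray(heog[start:stop]))
                if max(v_amp, h_amp) > eog_thr:
                    kind = "vertical" if v_amp >= h_amp else "horizontal"
                    decisions.append((start, kind + "_saccade"))
        return decisions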

  • Development of an interface based on head movements, detected by IMU sensors, to steer a wheelchair (a sketch of the head-motion-to-command mapping is given after the figure below). Compared with conventional interfaces based on head/mouth/chin control, the proposed head motion unit (HMU) interface is more natural and flexible and requires less effort from the user (this prototype, still under development, won the Portuguese national contest Poliempreende2014);
  • Development of an assistive system based on touch interaction to control home appliances (IR and RF), designed for persons with motor weakness in the hands. The system was validated by a wheelchair-dependent child with muscular dystrophy, in home settings, and integrated into the AAL infrastructure;

Fig. Touch-based interface and gateway to control home appliances. The prototype is installed on the child's wheelchair.
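
A minimal sketch of how IMU head orientation could be mapped to wheelchair velocity commands, with a dead zone so that small involuntary head movements do not move the chair. The gains, angle ranges and dead-zone width are illustrative assumptions, not the calibrated values of the HMU prototype.

    def head_to_wheelchair_command(pitch_deg, roll_deg, dead_zone_deg=10.0,
                                   full_scale_deg=30.0, max_lin=0.6, max_ang=0.8):
        """Map head tilt (degrees, from an IMU) to (linear m/s, angular rad/s).

        pitch_deg : forward/backward tilt, positive = forward (assumed convention)
        roll_deg  : left/right tilt, positive = right (assumed convention)
        """
        def scaled(angle, max_out):
            if abs(angle) < dead_zone_deg:          # ignore small involuntary motion
                return 0.0
            span = full_scale_deg - dead_zone_deg   # linear ramp up to full scale
            mag = min(abs(angle) - dead_zone_deg, span) / span
            return mag * max_out * (1.0 if angle > 0 else -1.0)

        linear = scaled(pitch_deg, max_lin)         # tilt forward/back -> drive
        angular = -scaled(roll_deg, max_ang)        # tilt right -> turn right
        return linear, angular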

  • Development of several BCI games intended for people with limited motor performance (e.g., cerebral palsy), controlled by steady-state visual evoked potentials (SSVEP) and by brain rhythms (an SSVEP detection sketch is given after the figure below);

Fig. 2D/3D BCI games.
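
The sketch below shows a standard canonical correlation analysis (CCA) approach to deciding which SSVEP stimulation frequency (i.e., which game command) the user is attending to. It is given only as an illustration of how such games can be driven; it does not cover the phase tagging used in the SSVEP game reported in the SeGAH2013 publication below.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    def ssvep_frequency(eeg, fs, stim_freqs, n_harmonics=2):
        """Return the candidate flicker frequency best matching the EEG.

        eeg        : array (n_samples, n_channels), e.g. occipital electrodes
        fs         : sampling rate in Hz
        stim_freqs : candidate stimulation frequencies in Hz (one per command)
        """
        t = np.arange(eeg.shape[0]) / fs
        scores = []
        for f in stim_freqs:
            # Sine/cosine reference signals at the frequency and its harmonics.
            refs = np.column_stack([fn(2 * np.pi * f * h * t)
                                    for h in range(1, n_harmonics + 1)
                                    for fn in (np.sin, np.cos)])
            x, y = CCA(n_components=1).fit(eeg, refs).transform(eeg, refs)
            scores.append(np.corrcoef(x[:, 0], y[:, 0])[0, 1])
        return stim_freqs[int(np.argmax(scores))]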

Vision-based HMI for robotic walkers

Achievements:

  • xxxxxxx

Publications

  • A. Cruz, D. Garcia, G. Pires, U. Nunes, "Facial Expression Recognition Based on EOG Toward Emotion Detection for Human-Robot Interaction", 8th International Conference on Bio-inspired Systems and Signal Processing, Biosignals2015, 2015, doi:10.5220/0005187200310037.
  • R. Parafita, G. Pires, U. Nunes, and M. Castelo-Branco, “A spacecraft game controlled with a brain-computer interface using SSVEP with phase tagging”, IEEE 2nd Int. Conf. on Serious Games and Applications for Health, SeGAH2013, 2013, doi:10.1109/SeGAH.2013.6665309.
  • J. Paulo, P. Peixoto, "Classification of reaching and gripping gestures for safety on walking aids", IEEE RO-MAN'14: IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, United Kingdom, 2014.
  • J. Paulo, P. Peixoto and U. Nunes, "A novel vision-based user interface for a robotic walker framework", IEEE RO-MAN'15: IEEE International Symposium on Robot and Human Interactive Communication, Japan, 2015.
  • xxxxxxxxxxxx

Awards

Project Thesis (PhD)

    • Aniana Brito, “Mental state monitoring to improve the robustness of Brain-computer Interfaces in scenarios of human-robot interaction”, PhD Project Thesis (supervisors: U. Nunes, G. Pires), FCTUC, Sept. 2014.

MSc and BSc Theses

    • Ivo Batista, "Desenvolvimento de um jogo controlado através de potenciais EEG estacionários evocados visualmente", (supervisors: U. Nunes and G. Pires), FCTUC, Feb. 2015.
    • D. Mendes, R. Cruz, "DriveByMind 2.0", (supervisors: A. Manso and G. Pires), IPT, September 2015.
    • G. Freitas, F. Rosa, H. Pereira, "Infraestrutura para monitorização de atividade humana em ambientes de vida assistida", (supervisors: G. Pires, A. Lopes and A. Manso), IPT, October 2015.
    • J. Martins, R. Costa, "DriveByMind", (supervisors: A. Manso and G. Pires), IPT, November 2014.
    • C. Caramelo, F. Pereira, "Interface inercial para condução de uma cadeira de rodas", (supervisor: G. Pires), IPT, September 2014.
    • J. Bué, J. Maurício, "LiveBF - Visualização gráfica de ondas cerebrais e jogos 2D", (supervisors: A. Manso and G. Pires), IPT, October 2013.
    • A. Lagarto, D. Ferreira, L. Caetano, "Casa Inteligente: sistema Ambient Assisted Living", (supervisor: G. Pires), IPT, September 2013.
    • B. Oliveira, T. Veríssimo, "Remote Control System", (supervisors: A. Manso and G. Pires), IPT, July 2013.

Technical reports

    • D. Garcia, "Modelação de expressões faciais num avatar usando classificação de biossinais", (supervisors: U. Nunes and G. Pires), FCTUC, Sept. 2014.
    • A. Brito, “TR-AMS-HMIT5-AC-04: Protocol to record and label biosignals of users, while driving a wheelchair in realistic settings”, 2015.