Objectives:
- Develop gaze-independent P300-based BCIs suitable for people with ocular impairment, in particular patients in (complete) locked-in state (LIS);
- Develop BCIs controlled in a self-paced manner;
- Develop BCIs that need a single calibration;
- Validate the above items in communication devices and in wheelchair steering.
Hybrid Visual-Auditory P300-Based BCI
Achievements: A hybrid visual-auditory (HVA) P300-based BCI combining simultaneous visual and auditory stimulation was proposed. The auditory stimuli are natural, meaningful spoken words. The paradigm was designed to match the skills of LIS patients, minimizing memory and cognitive demand while maximizing stimulus discrimination and perception. Comparison of HVA with purely auditory (AU) and purely visual covert (VC) stimulation showed that HVA reflects a neurophysiological summation of the two independent processes (see left figure). This effect led to a 32% increase in online classification accuracy over the AU and VC approaches.
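To make the paradigm concrete, the sketch below shows one possible way to schedule a hybrid stimulation sequence in which every event presents a visual cue and the corresponding spoken word at the same instant. This is a minimal illustration, not the project's presentation software; the word list, timing values and file names are assumptions.

```python
import random

# Hypothetical word set: each symbol has a visual cue and a matching spoken word.
SYMBOLS = ["yes", "no", "hungry", "thirsty", "pain", "rest"]

STIMULUS_DURATION = 0.30   # seconds a stimulus stays on (assumed value)
INTER_STIMULUS_GAP = 0.20  # seconds between stimulus offsets and next onset (assumed)

def hybrid_sequence(n_repetitions, rng=random.Random(0)):
    """Build a randomized hybrid visual-auditory stimulation sequence.

    Every event carries both modalities simultaneously: the visual cue to
    flash and the spoken word to play, plus its onset time in seconds.
    """
    events = []
    t = 0.0
    for _ in range(n_repetitions):
        order = SYMBOLS[:]
        rng.shuffle(order)          # random order within each repetition block
        for symbol in order:
            events.append({
                "onset": round(t, 2),
                "visual": symbol,            # e.g. highlight/flash the written word
                "auditory": symbol + ".wav",  # e.g. play the natural spoken word
            })
            t += STIMULUS_DURATION + INTER_STIMULUS_GAP
    return events

if __name__ == "__main__":
    for ev in hybrid_sequence(n_repetitions=2)[:6]:
        print(ev)
```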
Self-paced control and one-time calibration
Achievements: A self-paced approach operating on the classifier projections was implemented and tested for steering an intelligent wheelchair in realistic scenarios. Two approaches were successfully validated: a static window (constant number of repetitions within a trial) and a dynamic window (adaptive number of repetitions within a trial). A collaborative system supports the human user, handling BCI errors and performing navigation manoeuvres. Experiments were performed with participants with motor impairments recruited from the cerebral palsy association APCC (see tasks 7 and 10).
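The sketch below illustrates, under assumed score shapes and an assumed threshold, the difference between the static window (always use the full number of repetitions) and the dynamic window (end a trial early once the accumulated classifier projections are decisive). It is a generic illustration of adaptive stopping, not the project's exact stopping rule.

```python
import numpy as np

def static_window_decision(scores):
    """Static window: use all repetitions, then pick the best-scoring class.

    `scores` has shape (n_repetitions, n_classes): one classifier projection
    per class per repetition (simulated here; shapes and values are assumptions).
    """
    return int(np.argmax(scores.sum(axis=0)))

def dynamic_window_decision(scores, margin=2.0):
    """Dynamic window: stop as soon as the accumulated evidence is decisive.

    After each repetition the class scores are summed; if the best class leads
    the runner-up by more than `margin` (assumed threshold), the trial ends
    early. Returns (decision, repetitions_used).
    """
    accumulated = np.zeros(scores.shape[1])
    for rep, s in enumerate(scores, start=1):
        accumulated += s
        best, runner_up = np.sort(accumulated)[-1:-3:-1]
        if best - runner_up > margin:
            return int(np.argmax(accumulated)), rep
    return int(np.argmax(accumulated)), scores.shape[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated projections for a 4-class trial with 10 repetitions:
    # class 2 is the attended target, so its scores are slightly higher.
    sim = rng.normal(0.0, 1.0, size=(10, 4))
    sim[:, 2] += 1.0
    print("static :", static_window_decision(sim))
    print("dynamic:", dynamic_window_decision(sim))
```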
In the context of a communication speller (LSC), self-paced control and one-time calibration were combined in the same framework. The methods developed in "Task 4 - Biosignals processing" (DTA and SSST) yielded classification accuracies similar to those obtained when calibration and online use took place in the same session.
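As a generic illustration of one-time calibration (the project's DTA and SSST methods are not reproduced here), the sketch below trains a classifier on a simulated calibration session and applies it unchanged to a later session. All data, feature sizes and the choice of LDA are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def simulate_session(rng, n_trials=200, n_features=32, shift=0.0):
    """Simulated feature vectors (e.g. decimated P300 epochs) for one session.

    `shift` mimics session-to-session non-stationarity; all values are assumed.
    """
    y = rng.integers(0, 2, size=n_trials)                 # 0 = non-target, 1 = target
    X = rng.normal(0.0, 1.0, size=(n_trials, n_features))
    X[y == 1, :4] += 1.5                                   # target-related deflection
    X += shift                                             # session offset
    return X, y

rng = np.random.default_rng(0)
X_cal, y_cal = simulate_session(rng)               # one-time calibration session
X_use, y_use = simulate_session(rng, shift=0.3)    # later online-use session

clf = LinearDiscriminantAnalysis().fit(X_cal, y_cal)  # calibrate once
print("accuracy on a later session, no recalibration:",
      round(clf.score(X_use, y_use), 3))
```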
Publications
Invited talks