
Industrial Activities, Research Projects, and Datasets:

------------------------------------------------------------------------------------------------------------------------------------
Stratuscent

A startup with ~$1.5M in funding, incubated within TandemLaunch. Stratuscent has an exclusive license to 6 granted patents from NASA's Jet Propulsion Laboratory (JPL) that cover fundamental technologies for the development of electronic noses.
I lead the company's AI research and development; we are currently in stealth mode.

------------------------------------------------------------------------------------------------------------------------------------
SensAura Technologies

A startup with ~$750k in funding, incubated within TandemLaunch.
Using our proprietary large-scale datasets, we developed tailored deep-neural-network technologies for emotion (arousal/valence) regression that can handle thousands of users simultaneously on a cloud infrastructure (Amazon Web Services). The core AI engines take heart-rate time series as input and output arousal/valence measures as well as the intensity of 9 classes of emotions.
The theory and math behind our technology are based on my ~6-year PhD research, which itself builds on many years of work by the computational neuroscience, computational psychology, and machine learning communities.
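The engine's input/output contract (heart-rate time series in; arousal/valence plus 9 emotion intensities out) can be sketched as below. The actual models and weights were proprietary; this toy forward pass only illustrates the shapes involved, and the weights here are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(hr_series, w1, w2):
    """hr_series: (T,) heart-rate samples -> arousal/valence + 9 intensities."""
    h = np.tanh(hr_series @ w1)                      # hidden features
    out = h @ w2                                     # linear head, 11 outputs
    arousal_valence = np.tanh(out[:2])               # bounded regression targets
    emotion_intensity = 1 / (1 + np.exp(-out[2:]))   # 9 intensities in [0, 1]
    return arousal_valence, emotion_intensity

T, H = 120, 16                       # e.g. 2 minutes of 1 Hz heart rate
w1 = rng.normal(0, 0.1, (T, H))      # placeholder weights, not the real model
w2 = rng.normal(0, 0.1, (H, 11))
hr = rng.normal(70.0, 5.0, T)        # synthetic heart-rate trace (bpm)
av, intensities = predict(hr, w1, w2)
print(av.shape, intensities.shape)   # (2,) (9,)
```

In production such a predictor would sit behind a cloud endpoint so that many users' streams can be scored concurrently.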

Our Real-World Affective Computing (Emotion Recognition) solutions feature:
  • Real-Time
  • Multi-Modal
  • Device Agnostic
  • Cloud Computing
Although our technologies were validated by some big names in the consumer-electronics market, we were somewhat early for the market and did not succeed with business development and Series A fundraising.
The IP assets, including the patents, are available for acquisition by interested entities.

------------------------------------------------------------------------------------------------------------------------------------
DECAF

MEG brain signals + horizontal EOG + ECG + trapezius EMG + NIR facial video responses of 30 people (+6) to 36 movie clips and 40 music video clips (+2 long videos and 12 TV ads)

Journal papers and dataset:
in IEEE Transactions on Affective Computing 2015
DECAF Dataset Webpage


------------------------------------------------------------------------------------------------------------------------------------
ASCERTAIN

Systematic Analysis of emotional responses and Personality of Humans (Frontal EEG + ECG + GSR + Facial Videos).

Conference papers:
in the 17th ACM International Conference on Multimodal Interaction (ICMI 2015)


Journal papers and dataset:
in IEEE Transactions on Affective Computing 2016
ASCERTAIN Dataset Webpage

------------------------------------------------------------------------------------------------------------------------------------
AMIGOS

Part of my research, conducted in EECS, QMUL, during spring and summer 2014.
Systematic Analysis of emotional responses and Personality/PANAS of Human users (Frontal EEG + ECG + GSR + Facial Videos).



Implicit Personality, Mood and Co-presence Prediction Using Neuro-Physiological Signals on Long Videos

------------------------------------------------------------------------------------------------------------------------------------
QAMAF

NeuroSky MindWave EEG signals + ECG + GSR + RGB facial video responses of 32 people to 20 classical-style piano music excerpts generated by the Robin composer system.
Signal-quality annotations are available.


Conference paper:
in Annual ACM International Conference on Multimedia Retrieval (ICMR), June 2016, New York.
“A Quality Adaptive Multimodal Affect Recognition System for User-Centric Multimedia Indexing”,
R. Gupta, M. Khomami Abadi, J. A. Cárdenes Cabré, F. Morreale, T. H. Falk, N. Sebe.

A signal-quality-adaptive multimodal affect recognition system (QAMAF) is introduced, and leave-one-subject-out cross-validation is employed for the emotion recognition setup.
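The leave-one-subject-out protocol can be sketched as follows: each fold holds out every trial of one subject for testing and trains on all remaining subjects, so performance reflects generalization to unseen users. The nearest-centroid classifier and the synthetic data here are placeholders, not the system's actual model or features:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, trials_per_subject, n_features = 32, 20, 8  # mirrors 32 people x 20 excerpts

subjects = np.repeat(np.arange(n_subjects), trials_per_subject)
X = rng.normal(size=(len(subjects), n_features))        # synthetic feature vectors
y = rng.integers(0, 2, size=len(subjects))              # e.g. binary high/low arousal

accuracies = []
for s in np.unique(subjects):
    train, test = subjects != s, subjects == s          # hold out one subject entirely
    # placeholder classifier: nearest class centroid computed on training data
    centroids = np.stack([X[train & (y == c)].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
    pred = dists.argmin(axis=1)
    accuracies.append((pred == y[test]).mean())

print(len(accuracies))   # one accuracy per held-out subject: 32
```

Reporting the mean and spread of the per-subject accuracies is the usual way to summarize this protocol.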


QAMAF Dataset Webpage




------------------------------------------------------------------------------------------------------------------------------------
UDirector

Recognizing the engagement level of TV viewers through their emotional responses (Frontal EEG + GSR + Facial Videos)


------------------------------------------------------------------------------------------------------------------------------------
COAF

This project is about crowd-sourcing affective annotations.

------------------------------------------------------------------------------------------------------------------------------------
MindReading playground

Decoding the mind through MEG brain responses and multimedia content retrieval/analysis
------------------------------------------------------------------------------------------------------------------------------------
EffCom 

Analysis of the effectiveness of TV commercials (TV ads)

Will be detailed later ...
------------------------------------------------------------------------------------------------------------------------------------

