LIE DETECTION USING EMOTIV BCI

Introduction to the Project

This project presents a Lie Detection System built on a Brain-Computer Interface (BCI) based on electroencephalography (EEG). The Emotiv™ EPOC headset (Emotiv™ Systems Inc., San Francisco, CA, USA) was used to record the EEG data.

EEG data recorded from the Emotiv™ EPOC headset is comparable to data recorded by traditional EEG devices (Bobrov et al., 2011; Stytsenko et al., 2011). Electroencephalography is the recording of electrical activity along the scalp: EEG measures the fluctuations in voltage resulting from ionic current flows within the neurons of the brain. This project was based on the idea that when a person utters a lie, several physiological signals change, such as heart rate, skin conductance, and brain activity (Bos, 2006; Partala, Jokiniemi, & Surakka, 2000; Takahashi, 2004).

Excitement is characterized by activation of the sympathetic nervous system, which results in a range of physiological responses including pupil dilation, eye widening, sweat gland stimulation, increases in heart rate and muscle tension, blood diversion, and digestive inhibition. The Emotiv™ Software Development Kit (SDK) offers three detection suites: the Expressiv™ Suite, which offers detection of facial expressions and facial movements (using an in-built gyroscope); the Cognitiv™ Suite, which evaluates a user's real-time brainwave activity to discern the user's conscious intent to perform distinct physical actions on a real or virtual object; and the Affectiv™ Suite, which reports real-time changes in the subjective emotions experienced by the user. The Affectiv™ Suite provides three distinct detections: Engagement, Instantaneous Excitement, and Long-Term Excitement.

Instantaneous Excitement is experienced as an awareness or feeling of physiological arousal with a positive value. In general, the greater the increase in physiological arousal, the greater the output score for the detection. The Instantaneous Excitement detection is tuned to provide output scores that more accurately reflect short-term changes in excitement, over time periods as short as several seconds.
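As a minimal sketch of how this score is read, assuming a connected headset and the same SDK setup as the full program listing later in this report, the short-term excitement value can be polled from each EmoState update (error handling omitted for brevity):

#include <iostream>
#include <conio.h>
#include <windows.h>

#include "EmoStateDLL.h"
#include "edk.h"
#include "edkErrorCode.h"

// Minimal polling sketch: print the Instantaneous (short-term)
// Excitement score for every EmoState update until a key is pressed.
int main()
{
    EmoEngineEventHandle eEvent = EE_EmoEngineEventCreate();
    EmoStateHandle       eState = EE_EmoStateCreate();

    if (EE_EngineConnect() == EDK_OK) {
        while (!_kbhit()) {
            if (EE_EngineGetNextEvent(eEvent) == EDK_OK &&
                EE_EmoEngineEventGetType(eEvent) == EE_EmoStateUpdated) {
                EE_EmoEngineEventGetEmoState(eEvent, eState);
                // Score lies in [0, 1]; larger values indicate
                // stronger instantaneous arousal.
                std::cout << ES_AffectivGetExcitementShortTermScore(eState)
                          << std::endl;
            }
            Sleep(1);
        }
        EE_EngineDisconnect();
    }

    EE_EmoStateFree(eState);
    EE_EmoEngineEventFree(eEvent);
    return 0;
}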

Long-Term Excitement is experienced and defined in the same way as Instantaneous Excitement, but the detection is designed and tuned to be more accurate when measuring changes in excitement over longer time periods, typically measured in minutes.

Engagement is experienced as alertness and the conscious direction of attention towards task-relevant stimuli. It is characterized by increased physiological arousal and beta waves (a well-known type of EEG waveform) along with attenuated alpha waves (another type of EEG waveform). The opposite pole of this detection is referred to as "Boredom" in the Emotiv™ Control Panel and the Emotiv™ API. The greater the attention, focus, and cognitive workload, the greater the output score reported by the detection. Since the project required real-time reporting of physiological changes and EEG signals, Instantaneous Excitement was the most apt choice among the detections offered by the Affectiv™ Suite.
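For intuition only, and not the SDK's algorithm (our system relied entirely on the built-in Affectiv™ scores), the beta/alpha relationship behind the Engagement detection can be sketched by estimating band power over a window of raw EEG samples. The EPOC samples at 128 Hz; the function names and window layout below are illustrative:

#include <cmath>
#include <vector>

// Illustrative sketch only: estimate the power in a frequency band by
// summing squared DFT magnitudes over a window of raw EEG samples.
double bandPower(const std::vector<double>& x, double fs,
                 double fLow, double fHigh)
{
    const double pi = 3.14159265358979323846;
    const int n = static_cast<int>(x.size());
    double power = 0.0;
    for (int k = 1; k < n / 2; ++k) {
        const double freq = k * fs / n;
        if (freq < fLow || freq > fHigh)
            continue;
        double re = 0.0, im = 0.0;
        for (int t = 0; t < n; ++t) {
            re += x[t] * std::cos(2.0 * pi * k * t / n);
            im -= x[t] * std::sin(2.0 * pi * k * t / n);
        }
        power += (re * re + im * im) / (n * n);
    }
    return power;
}

// Rising beta (13-30 Hz) power with attenuated alpha (8-12 Hz) power
// suggests engagement, so this ratio grows with attention.
double engagementProxy(const std::vector<double>& window)
{
    const double alpha = bandPower(window, 128.0, 8.0, 12.0);
    const double beta  = bandPower(window, 128.0, 13.0, 30.0);
    return beta / (alpha + 1e-9); // guard against division by zero
}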

Experimental Setup

A set of ten trivial questions was asked in sequence to each subject, and the real-time data was captured. The questions were designed along the lines of the "Relevant-Irrelevant Test" (RIT), which was originally developed from the research of Marston (1917) for use in law enforcement applications (Keeler, 1933; Larson, 1932). Although the RIT was the dominant polygraph technique for almost 20 years, it is now seldom used in criminal investigations (Raskin, 1989). In our test, the data varied significantly from previous records for this model because we tested our volunteers on trivial questions in a relaxed environment, with the freedom to lie at will.

The Reid CQT, also known as the "Modified General Question Test" (MGQT), was the first CQT. The concept of comparison questions was first described by Summers (1939: 341). Since our rules of conduct required that we respect each subject's personal privacy, the questions posed were based purely on commonplace knowledge. A sample questionnaire is shown below:

 

a. What is the name of your favorite band?

b. What is the color of your shirt?

c. What are the courses you are taking next semester?

d. What is your favorite color?

e. What is your favorite flavor of ice cream?

f. What do you think of Picasso?

Each subject was given 20 to 30 seconds to think about and answer each question. Because of the triviality of the questions, the subjects were asked beforehand to lie on any three to four questions chosen at random.
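Such a random assignment can be drawn, for example, with a small helper like the following (illustrative only, and not part of the logging program):

#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

// Illustrative helper: draw 3 or 4 of the 10 question numbers,
// uniformly at random, as the questions the subject should lie on.
int main()
{
    std::srand(static_cast<unsigned int>(std::time(0)));

    std::vector<int> questions;
    for (int q = 1; q <= 10; ++q)
        questions.push_back(q);

    std::random_shuffle(questions.begin(), questions.end());

    const int numLies = 3 + std::rand() % 2; // 3 or 4 lies
    std::cout << "Lie on questions:";
    for (int j = 0; j < numLies; ++j)
        std::cout << ' ' << questions[j];
    std::cout << std::endl;
    return 0;
}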

Results and Discussion

The following experimental observations were noted:

· The change in the Instantaneous Excitement values reflected the change in physiological responses and consequently indicated the probability of a lie.

· The time allotted for answering each question had a notable impact on the Instantaneous Excitement values.

· Some carry-over effect was noted on the values for the answer following a lie. To reduce this, the data from the first 5 seconds of each answer was discarded.

· The Instantaneous Excitement values gave more accurate results when the answers required more than a mere yes or no.

· Lastly, a graphical analysis was conducted on the data received. Well-defined local and global minima formed at the question numbers where the subject had lied (Graph 1). A sketch of this post-processing appears after Graph 2 below.


Graph 1

· Unfortunately, there were questions for which there was no significant change in the Instantaneous Excitement values, and hence the lie was not caught.

· A rough estimate of the accuracy of the Lie Detection System was around 60 to 65% (Graph 2).


Graph 2
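The post-processing described above can be sketched as follows. This is illustrative only: it assumes a hypothetical data layout in which scores[q] holds one Instantaneous Excitement sample per second for question q, and it flags local minima of the per-question averages as likely lies, mirroring Graph 1:

#include <cstddef>
#include <vector>

// Illustrative sketch (hypothetical data layout): scores[q] holds one
// Instantaneous Excitement sample per second for question q (0-based).
// Returns 1-based question numbers sitting at local minima of the
// per-question averages, which in our runs corresponded to lies.
std::vector<int> flagLikelyLies(const std::vector<std::vector<float> >& scores)
{
    std::vector<float> avg;
    for (size_t q = 0; q < scores.size(); ++q) {
        float sum = 0.0f;
        int count = 0;
        // Discard the first 5 seconds of each answer to reduce the
        // carry-over effect from the previous question.
        for (size_t t = 5; t < scores[q].size(); ++t) {
            sum += scores[q][t];
            ++count;
        }
        avg.push_back(count > 0 ? sum / count : 0.0f);
    }

    std::vector<int> flagged;
    for (size_t q = 1; q + 1 < avg.size(); ++q) {
        if (avg[q] < avg[q - 1] && avg[q] < avg[q + 1])
            flagged.push_back(static_cast<int>(q) + 1);
    }
    return flagged;
}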

Program Listing

Communicating with the Emotiv™ EPOC headset required a connection between the Emotiv™ control software and our own application. For this purpose, the SDK's library and headers, namely edk.dll, edkErrorCode.h, and EmoStateDLL.h, were built with VC++ 2005. For better compatibility, our program was also built with VC++ 2005. The program follows.

(This program will not work unless the environment in VC++ 2005 has been appropriately set up.)

#include <iostream>
#include <fstream>
#include <conio.h>
#include <sstream>
#include <windows.h>
#include <map>

#include "EmoStateDLL.h"
#include "edk.h"
#include "edkErrorCode.h"

#pragma comment(lib, "../lib/edk.lib")

void logEmoState(std::ostream& os, unsigned int questionNumber,
                 EmoStateHandle eState, bool withHeader = false);

int main(int argc, char** argv) {

    EmoEngineEventHandle eEvent = EE_EmoEngineEventCreate();
    EmoStateHandle       eState = EE_EmoStateCreate();
    unsigned int         userID = 0;
    int                  state  = 0;
    std::string          input;

    try {
        if (argc != 2) {
            throw std::exception("Please supply the log file name.\nUsage: EmoStateLogger [log_file_name].");
        }

        std::cout << "===================================================================" << std::endl;
        std::cout << "LIE DETECTION SYSTEM" << std::endl;
        std::cout << "===================================================================" << std::endl;
        std::cout << "Press '1' to start and connect to the EmoEngine" << std::endl;
        std::cout << ">> ";

        std::getline(std::cin, input, '\n');

        if (atoi(input.c_str()) == 1) {
            if (EE_EngineConnect() != EDK_OK) {
                throw std::exception("Emotiv Engine start up failed.");
            }
        }
        else {
            throw std::exception("Invalid option...");
        }

        std::ofstream ofs(argv[1]);
        bool writeHeader = true;
        int  num = 0;

        std::cout << "Enter Number of Questions = ";
        std::cin >> num;

        for (int i = 1; i <= num; i++) {

            // Wait for the examiner to ask the question before logging starts.
            std::cout << "\n\tPlease ask the question, then press any key..";
            _getch();
            std::cout << "\n\tQuestion Number " << i << " is running\n";

            // Log every EmoState update until a key is pressed to mark
            // the end of the subject's answer.
            while (!_kbhit()) {

                state = EE_EngineGetNextEvent(eEvent);

                if (state == EDK_OK) {
                    EE_Event_t eventType = EE_EmoEngineEventGetType(eEvent);
                    EE_EmoEngineEventGetUserId(eEvent, &userID);

                    // Log the EmoState if it has been updated
                    if (eventType == EE_EmoStateUpdated) {
                        EE_EmoEngineEventGetEmoState(eEvent, eState);
                        logEmoState(ofs, i, eState, writeHeader);
                        writeHeader = false;
                    }
                }
                else if (state != EDK_NO_EVENT) {
                    std::cout << "Internal error in Emotiv Engine!" << std::endl;
                    break;
                }

                Sleep(1);
            }

            // Consume the keypress so _kbhit() is false again for the
            // next question.
            _getch();
        }
        ofs.close();
    }
    catch (const std::exception& e) {
        std::cerr << e.what() << std::endl;
    }

    EE_EngineDisconnect();
    EE_EmoStateFree(eState);
    EE_EmoEngineEventFree(eEvent);

    return 0;
}

void logEmoState(std::ostream& os, unsigned int questionNumber,
                 EmoStateHandle eState, bool withHeader) {

    // Write the column header once, at the top of the log.
    if (withHeader) {
        os << "Question Number,";
        os << "Blink,";
        os << "Wink Left,";
        os << "Wink Right,";
        os << "Look Left,";
        os << "Look Right,";
        os << "Eyebrow,";
        os << "Furrow,";
        os << "Smile,";
        os << "Clench,";
        os << "Smirk Left,";
        os << "Smirk Right,";
        os << "Laugh,";
        os << "Short Term Excitement,";
        os << "Long Term Excitement,";
        os << "Engagement/Boredom,";
        os << "Cognitiv Action,";
        os << "Cognitiv Power";
        os << std::endl;
    }

    // Question number for this row
    os << questionNumber << ",";

    // Expressiv Suite results: eye detections
    os << ES_ExpressivIsBlink(eState) << ",";
    os << ES_ExpressivIsLeftWink(eState) << ",";
    os << ES_ExpressivIsRightWink(eState) << ",";
    os << ES_ExpressivIsLookingLeft(eState) << ",";
    os << ES_ExpressivIsLookingRight(eState) << ",";

    // Expressiv Suite results: upper- and lower-face actions with powers
    std::map<EE_ExpressivAlgo_t, float> expressivStates;

    EE_ExpressivAlgo_t upperFaceAction = ES_ExpressivGetUpperFaceAction(eState);
    float upperFacePower = ES_ExpressivGetUpperFaceActionPower(eState);

    EE_ExpressivAlgo_t lowerFaceAction = ES_ExpressivGetLowerFaceAction(eState);
    float lowerFacePower = ES_ExpressivGetLowerFaceActionPower(eState);

    expressivStates[upperFaceAction] = upperFacePower;
    expressivStates[lowerFaceAction] = lowerFacePower;

    os << expressivStates[EXP_EYEBROW]     << ","; // eyebrow
    os << expressivStates[EXP_FURROW]      << ","; // furrow
    os << expressivStates[EXP_SMILE]       << ","; // smile
    os << expressivStates[EXP_CLENCH]      << ","; // clench
    os << expressivStates[EXP_SMIRK_LEFT]  << ","; // smirk left
    os << expressivStates[EXP_SMIRK_RIGHT] << ","; // smirk right
    os << expressivStates[EXP_LAUGH]       << ","; // laugh

    // Affectiv Suite results; the Short Term (Instantaneous) Excitement
    // column is the one analyzed in this project.
    os << ES_AffectivGetExcitementShortTermScore(eState) << ",";
    os << ES_AffectivGetExcitementLongTermScore(eState) << ",";

    // Live console feedback while a question is running
    std::cout << ES_AffectivGetExcitementLongTermScore(eState) << "\t";

    os << ES_AffectivGetEngagementBoredomScore(eState) << ",";

    // Cognitiv Suite results
    os << static_cast<int>(ES_CognitivGetCurrentAction(eState)) << ",";
    os << ES_CognitivGetCurrentActionPower(eState);

    os << std::endl;
}
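The logger takes the target log file as its only command-line argument, for example: EmoStateLogger answers.log (the file name here is illustrative). Each row of the resulting comma-separated log is one EmoState update tagged with its question number, so the per-question Instantaneous Excitement traces used in the analysis above can be read directly from the Short Term Excitement column.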

References

1. Bobrov, P., Frolov, A., Cantor, C., Fedulova, I., Bakhnyan, M., and Zhavoronkov, A. (2011). Brain-computer interface based on generation of visual images. PLoS ONE 6, e20674. doi:10.1371/journal.pone.0020674.

2. Stytsenko, K., Jablonskis, E., and Prahm, C. (2011). Evaluation of consumer EEG device Emotiv EPOC. Paper presented at the MEi:CogSci Conference, Ljubljana, Slovenia.

3. Emotiv EPOC Research Edition SDK. https://emotiv.com/store/sdk/eeg-bci/research-edition-sdk/ [accessed 11.20.12].

4. Niedermeyer, E. and Lopes da Silva, F. (2004). Electroencephalography: Basic Principles, Clinical Applications, and Related Fields. Lippincott Williams & Wilkins. ISBN 0-7817-5126-8.

5. Bos, D.O. (2006). EEG-based emotion recognition: The influence of visual and auditory stimuli. Internal report, Department of Computer Science, University of Twente.

6. Partala, T., Jokiniemi, M., and Surakka, V. (2000). Pupillary responses to emotionally provocative stimuli. In Proceedings of the 2000 Symposium on Eye Tracking Research and Applications (pp. 123-129). New York, NY, USA: ACM Press.

7. Takahashi, K. (2004). Remarks on emotion recognition from bio-potential signals. In Proceedings of the Second International Conference on Autonomous Robots and Agents (pp. 186-191).

8. Marston, W.M. (1917). Systolic blood pressure symptoms of deception. Journal of Experimental Psychology, 2, 117-163.

9. Keeler, L. (1933). Scientific methods of criminal detection with the polygraph. Kansas Bar Association.

10. Larson, J.A. (1932). Lying and Its Detection. Chicago: University of Chicago Press.

11. Raskin, D.C., Kircher, J.C., Horowitz, S.W., and Honts, C.R. (1989). Recent laboratory and field research on polygraph techniques. In J.C. Yuille (Ed.), Credibility Assessment (pp. 1-24). Deventer, The Netherlands: Kluwer.

12. Raskin, D.C. and Honts, C.R. (2002). The comparison question test. In M. Kleiner (Ed.), Handbook of Polygraph Testing (pp. 1-48).

Further Work

Considering the constraint on the type of questions we were allowed to ask volunteers, a success rate of 60-65% is encouraging. Our experiment focused on mapping the brain's behavior only, whereas several other physiological factors also change when someone lies. Identifying and coupling methods that measure these changes could significantly increase lie detection capabilities. The use of other, better BCI devices could also significantly improve signal quality and reduce the connection lags of the Emotiv™ EPOC.