Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in gaming, in psychology, in cognitive linguistics, marketing, as an input device for human-computer interaction, and in product design.
Current security features in mobile phones, such as numeric passcodes or drawn unlock patterns, can be easily breached for the following reasons:
There are also many applications on the market that implement eye tracking for various uses, but none of them targets a mobile device. Applying eye tracking as a security feature for unlocking mobile phones therefore struck us as novel and interesting.
The goal of this project is to create a mobile application that performs eye tracking and eye-gaze estimation, and uses them to unlock the screen through a sequence of eye gazes.
More recently, there has been growing use of eye tracking to study how users interact with different computer interfaces. Researchers ask, for example, how easy different interfaces are for users, and the results of such studies can lead to changes in interface design. Another recent area of research focuses on web development: how users react to drop-down menus, or where they focus their attention on a website so that the developer knows where to place an advertisement.
According to Hoffman, the current consensus is that visual attention is always slightly (100 to 250 ms) ahead of the eye; as soon as attention moves to a new position, the eyes will want to follow.
We still cannot infer specific cognitive processes directly from a fixation on a particular object in a scene. For instance, a fixation on a face in a picture may indicate recognition, liking, dislike, or puzzlement. Therefore, eye tracking is often coupled with other methodologies, such as introspective verbal protocols.
We estimate the eye gaze by first localizing the pupil of the eye and detecting the eye corners.
In order to achieve this, we have implemented the following steps:
We are using the “lbpcascade_frontalface.xml” classifier for face detection and the Haar classifiers “haarcascade_lefteye_2splits.xml” and “haarcascade_righteye_2splits.xml” for eye detection. We followed a tutorial by Roman Hošek on Android eye detection and tracking with OpenCV.
Figure 1: Face Detection and Eye Region Tracking. The Green Rectangle Highlights the Face and the Red Rectangles Highlight the Eye Regions.
We have successfully implemented Eye Detection and Tracking on Android.
Figure 2: Eye-Only Region Cropping. The Inner Red Rectangles Highlight the Cropped Eye-Only Regions.
We crop out the eye regions to obtain better localization of the pupil and eye corners, which are used later to estimate the eye gaze.
To localize the pupil, we find the pixel with the lowest intensity in the cropped eye region, using OpenCV's "minMaxLoc" method.
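The darkest-pixel idea can be sketched in plain Python (a stdlib-only stand-in for the min-location half of OpenCV's minMaxLoc; the `eye` grid below is hypothetical sample data, not a real camera frame):

```python
def locate_pupil(eye):
    """Return (row, col) of the darkest pixel in a 2-D grayscale
    image, mimicking the minimum-location part of minMaxLoc."""
    best = (0, 0)
    best_val = eye[0][0]
    for r, row in enumerate(eye):
        for c, val in enumerate(row):
            if val < best_val:
                best_val = val
                best = (r, c)
    return best

# Toy 4x5 grayscale "eye" crop; the pupil is the darkest region (value 12).
eye = [
    [200, 190, 185, 190, 200],
    [180,  60,  40,  70, 185],
    [175,  45,  12,  50, 180],
    [195, 170, 160, 175, 198],
]
print(locate_pupil(eye))  # → (2, 2)
```

Note that, as the report observes, a single global minimum is brittle: any dark pixel from shadows or eyelashes can win, which is exactly the false-positive behaviour seen under challenging lighting.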
Note that pupil detection produces many false positives in challenging lighting conditions. During eye tracking the pupil does not remain static; its rapid movement leads to false gaze estimations. These factors are discussed in detail in the “Challenges Faced” section.
Figure 3: Pupil Tracking. The Green Dots Highlight the Eye Pupil Localization Points.
To estimate the gaze, we need to locate the pupil with respect to the eye region. Since we are already cropping out the eye-only regions, we take the corners of the cropped rectangle to be the corners of the eye.
Figure 4: Reference Points for Gaze Estimation. TL-Top Left, TR- Top Right, BL-Bottom Left, BR-Bottom Right
We have developed our algorithm to estimate the gaze. The algorithm is as follows:
1. Calculate the distance of the pupil from each of the 4 eye corners (Top Left, Top Right, Bottom Left and Bottom Right)
2. Calibrate the average distance of the pupil from each of the corners by taking 100 readings.
3. Assign weights to each of the 4 corners to make them equally likely.
4. Find the corner with the minimum weighted distance from the pupil.
5. Perform 10 readings and select the corner that occurs most often.
6. The resulting corner is the eye gaze estimate.
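The steps above can be sketched in stdlib-only Python. This is an illustrative reconstruction, not our Android code: the crop size, coordinates, and function names are all assumptions, and the per-corner weights are taken as the inverse of the calibrated average distances so that all four corners start out equally likely.

```python
from collections import Counter
from math import hypot

# Corner positions as fractions of the eye-crop width and height.
CORNERS = {"TL": (0, 0), "TR": (1, 0), "BL": (0, 1), "BR": (1, 1)}

def corner_distances(pupil, w, h):
    """Step 1: distance of the pupil (x, y) to each corner of the w x h crop."""
    return {name: hypot(fx * w - pupil[0], fy * h - pupil[1])
            for name, (fx, fy) in CORNERS.items()}

def calibrate(readings, w, h):
    """Steps 2-3: average the corner distances over the calibration
    readings, then use the inverse averages as equalizing weights."""
    sums = Counter()
    for p in readings:
        for name, d in corner_distances(p, w, h).items():
            sums[name] += d
    return {name: len(readings) / sums[name] for name in CORNERS}

def estimate_gaze(readings, weights, w, h):
    """Steps 4-6: per reading pick the corner with the smallest
    weighted distance, then majority-vote across the readings."""
    votes = Counter()
    for p in readings:
        dists = corner_distances(p, w, h)
        votes[min(dists, key=lambda n: dists[n] * weights[n])] += 1
    return votes.most_common(1)[0][0]

w, h = 100, 60                                  # hypothetical eye-crop size
weights = calibrate([(50, 30)] * 100, w, h)     # pupil centred while calibrating
print(estimate_gaze([(10, 8)] * 10, weights, w, h))  # → TL
```

Voting over 10 readings smooths out the rapid pupil jitter mentioned earlier, at the cost of a short decision delay per corner.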
Figure 5: Gaze Estimation. Highlights distance of the eye pupil from the reference points.
Now that we can estimate the eye gaze, we add a pattern-based unlock. We place 4 radio buttons, one in each corner of the screen; initially these buttons are disabled. Once eye tracking starts, a radio button turns “Green” when the corresponding corner is gazed at. When all four corners have been detected, all the buttons are “Green” and the screen unlocks.
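The unlock flow can be sketched as a small state machine (a stdlib-only illustration, not the actual Android UI code; class and method names are hypothetical):

```python
class GazeLock:
    """Tracks the four corner 'buttons': each activates once its
    corner is gazed at, and the screen unlocks when all four are on."""

    CORNERS = ("TL", "TR", "BL", "BR")

    def __init__(self):
        self.active = {c: False for c in self.CORNERS}

    def on_gaze(self, corner):
        """Record a detected gaze at a corner; return unlock status."""
        if corner in self.active:
            self.active[corner] = True
        return self.unlocked()

    def unlocked(self):
        return all(self.active.values())

lock = GazeLock()
for corner in ("TL", "TR", "BL"):
    lock.on_gaze(corner)
print(lock.unlocked())      # → False  (three buttons active, as in Figure 6)
print(lock.on_gaze("BR"))   # → True   (all four active, screen unlocked)
```

The error behaviour shown later (a button turning “Red” after an over-long gaze) could be layered on top with a per-corner dwell timer; we omit it here for brevity.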
Figure 6: Screen-Unlock in Progress. Three Radio Buttons are Active
Figure 7: Screen Unlocked
We have made the application more secure by turning a button “Red” when the user gazes in one direction for too long.
Figure 8: Gaze Error. The User Gazed in One Direction for Too Long.
We faced many challenges while implementing this project. Some of them are listed below:
Despite facing these challenges, we achieved the goal of our project.
We dreamed of creating an eye-gaze and eye-tracking application, and we made it a reality. The application works well but has certain limitations. The small size of a mobile phone screen is a major constraint, as it is difficult to differentiate between eye gazes within the screen. In our experiments we also observed false positives in OpenCV's face detection, which degrade our performance. For best results, camera shake must be minimized while capturing the video.
We have prepared a short video demonstration (~3 min) of our project and uploaded it on YouTube.