Recent independent research on sclera biometrics has created a need to benchmark progress in this area using a common dataset. To fulfil this aim and to attract the attention of researchers, the 1st and 2nd Sclera Segmentation Benchmarking Competitions (SSBC 2015 and 2016) were organized in conjunction with BTAS 2015 and 2016 respectively, and the 1st Sclera Segmentation and Recognition Benchmarking Competition (SSRBC 2016) was organized in conjunction with ICB 2016. Inspired by the successful completion of these competitions and by the remaining scope for development, we have planned this competition.

Another significant and growing area of research is ocular biometrics in the visible spectrum, and a considerable amount of research interest in it can be found in the recent literature. Three traits in particular are employed in these works: the iris, the sclera and the peri-ocular region, used individually or in combination. We therefore plan to run a competition in which participants can use any of these traits, or a combination of them, to report their highest recognition accuracy for ocular biometrics on a common dataset.

Schedule of the competition

Competition website opens: 10th December 2016

Registration starts: 10th December 2016

Test dataset available: 10th December 2016

Registration closes: 15th May 2017

Algorithm/system submission deadline: 15th May 2017

Results announcement: 31st May 2017


Competition registration can be done by email. If you would like to register and receive the test dataset, please send an email to abhijit.das@griffithuni.edu.au with the subject line SSERBC 2017 and the following information:




Phone number,

Mailing address

Details about the competition

The competition aims to benchmark the sclera segmentation task with a common dataset. The database consists of 2624 RGB images taken from 82 identities; images were collected from both eyes, giving 164 different eyes. For each eye, images were captured at four angles (looking straight, left, right and up), with 4 images per angle. The individuals comprise both males and females with different eye colours; a few of them were wearing contact lenses, and the images were taken at different times of the day. The database contains images with blinking eyes, closed eyes and blurred images. High-resolution images are provided (300 dpi, 7500 x 5000 pixels). All the images are in JPEG format. A Nikon D800 camera with a 28-300 mm lens was used for image capture. Ground truth in the form of manual sclera, iris and peri-ocular segmentations of this dataset has been prepared.

For development purposes, a subset of the database, comprising both the eye images and the ground truth (1 image per angle for the first 30 individuals, i.e. 120 images), will be provided to the participants. The participants' proposed algorithms will be evaluated by the organizers. The evaluation measures will be precision and recall, with recall considered the primary measure for ranking the algorithms. The algorithm that participants submit must take no more than 10 seconds to segment an image and generate its mask on an Intel Core i7 processor.
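As a rough illustration of the evaluation measures, pixel-level precision and recall of a predicted binary sclera mask against the manual ground truth can be computed as sketched below. This is only a sketch under common definitions; the organizers' exact evaluation protocol is not spelled out in this announcement.

```python
import numpy as np

def precision_recall(pred_mask, gt_mask):
    """Pixel-level precision and recall of a predicted binary sclera
    mask against the manually annotated ground-truth mask.
    (Illustrative only; the official protocol may differ.)"""
    pred = np.asarray(pred_mask, dtype=bool)
    gt = np.asarray(gt_mask, dtype=bool)
    tp = np.logical_and(pred, gt).sum()    # sclera pixels correctly labelled
    fp = np.logical_and(pred, ~gt).sum()   # background labelled as sclera
    fn = np.logical_and(~pred, gt).sum()   # sclera pixels missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

Since recall is the primary ranking measure, an algorithm is rewarded for covering as much of the true sclera region as possible.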

For the recognition task, a part of the dataset (10 users, 4 images per angle) will be available upon registration; it will contain segmented sclera, iris and peri-ocular images together with the corresponding eye images. Recognition percentage will be used as the measure for the recognition task.
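The recognition percentage referred to above is simply the share of test images assigned the correct identity, expressed as a percent. A minimal sketch:

```python
def recognition_rate(predicted, actual):
    """Recognition percentage: fraction of test images assigned the
    correct identity label, expressed as a percent.
    `predicted` and `actual` are equal-length sequences of labels."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return 100.0 * correct / len(actual)
```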

Method of participation

Participants can register for the competition by filling in a form; they may enter the segmentation task, the recognition task, or both. For the segmentation task, participants need to provide an executable that reads the images from a directory and writes the segmented masks to a specified directory following a naming convention.
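The segmentation submission interface described above might look like the following sketch: a driver that walks an input directory, applies a participant-supplied segmentation callable, and writes each mask using a naming convention. The `<stem>_mask.png` convention used here is an assumption for illustration; the convention actually specified by the organizers should be followed.

```python
import os

def run_segmentation(input_dir, output_dir, segment_fn):
    """Apply `segment_fn` (a participant's segmenter, returning the
    mask image as raw bytes) to every image in `input_dir` and write
    each mask to `output_dir` as <stem>_mask.png.
    The naming convention here is illustrative, not the official one."""
    os.makedirs(output_dir, exist_ok=True)
    written = []
    for name in sorted(os.listdir(input_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".jpg", ".jpeg", ".png"}:
            continue  # skip non-image files
        mask_bytes = segment_fn(os.path.join(input_dir, name))
        out_name = stem + "_mask.png"
        with open(os.path.join(output_dir, out_name), "wb") as f:
            f.write(mask_bytes)
        written.append(out_name)
    return written
```

Keeping the per-image work inside `segment_fn` makes it easy to check the 10-second-per-image budget in isolation.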

For the recognition task, participants need to provide an executable that reads the training images from a directory and generates the training model, and a separate executable that reads test images from a directory and reports which class each image belongs to.
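The train/predict split described above can be sketched as a pair of functions: one builds a model from labelled training data, the other assigns a class to a new sample. The nearest-centroid classifier and the feature-vector representation below are placeholders chosen for illustration; any feature extractor and classifier may stand in for them in a real submission.

```python
import math

def train(features_by_class):
    """Build a minimal nearest-centroid model: one mean feature vector
    per identity. `features_by_class` maps a class label to a list of
    equal-length feature vectors (the feature extractor is assumed)."""
    model = {}
    for label, vectors in features_by_class.items():
        n = len(vectors)
        model[label] = [sum(v[i] for v in vectors) / n
                        for i in range(len(vectors[0]))]
    return model

def predict(model, feature):
    """Return the class label whose centroid is closest in
    Euclidean distance to the given feature vector."""
    return min(model, key=lambda label: math.dist(feature, model[label]))
```

In a submission, the first executable would call something like `train` on the training directory and persist the model; the second would load it and call `predict` per test image.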



Segmentation task:

Recognition task: