
Keystroke Biometrics Ongoing Competition (KBOC)


The entire database is available. Download and sign the EULA.

Extended paper accepted for IEEE Access:

A. Morales, J. Fierrez, R. Tolosana, J. Ortega-Garcia, J. Galbally, M. Gomez-Barrero, A. Anjos and S. Marcel, "Keystroke Biometrics Ongoing Competition", IEEE Access, In press, 2016. [Link]

The paper with all the competition details and main results was presented at BTAS 2016:

A. Morales, J. Fierrez, M. Gomez-Barrero, J. Ortega-Garcia, R. Daza, J.V. Monaco, J. Montalvão, J. Canuto, A. George, "KBOC: Keystroke Biometrics OnGoing Competition", Proc. 8th IEEE International Conference on Biometrics: Theory, Applications, and Systems, Buffalo, USA, pp. 1-6, 2016.


KBOC is an official competition of the IEEE Eighth International Conference on Biometrics: Theory, Applications, and Systems (BTAS 2016), organized by the ATVS Biometric Research Group.



Background

Keystroke biometrics applications have been investigated over the past several decades, attracting both academics and practitioners. Keystroke dynamics is interesting for the pattern recognition research community because of the challenges associated with modeling and matching dynamic sequences with high intra-class variability (typing behavior is strongly user-dependent and varies significantly between subjects). In addition, the simple nature of the data (time sequences) makes keystroke biometrics a good field for introducing new researchers (without previous experience in biometric applications) to this challenging area.

The KBOC competition is the first ongoing keystroke competition, which overcomes the limitations of traditional competitions based on a static snapshot of the state of the art. In addition, the competition includes a public benchmark involving 3600 keystroke sequences from 300 users, simulating a realistic scenario in which each user types his or her own sequence (given name and family name), plus 3600 impostor attacks (users who try to spoof the identity of others).
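As an illustration of the kind of data involved, the timing features typically extracted from keystroke sequences (key hold times and inter-key latencies) can be sketched as follows. This is not the official KBOC file parser; the `(press, release)` event format and the `timing_features` helper are assumptions for the example.

```python
# Illustrative sketch (not the official KBOC parser): typical timing
# features extracted from a keystroke sequence. Each event is an assumed
# (press_time_ms, release_time_ms) tuple, one per key typed.

def timing_features(events):
    """Return hold times (release - press) and press-to-press latencies."""
    holds = [release - press for press, release in events]
    latencies = [events[i + 1][0] - events[i][0] for i in range(len(events) - 1)]
    return holds, latencies

# Three keystrokes typed 150 ms apart, each held 80-90 ms
holds, latencies = timing_features([(0, 80), (150, 240), (300, 390)])
# holds == [80, 90, 90], latencies == [150, 150]
```

Feature vectors built from such hold times and latencies are what a keystroke matcher compares against an enrolled template.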

 

Ongoing competition

Traditional biometric competitions give a static snapshot of the state of the art in a specific research area. The main problem is how to encourage researchers to invest their resources and time in competitions that are usually open only for a short window of time. Without the participation of the main players, the snapshot will be inaccurate. Ongoing competitions instead provide a dynamic view that is updated by the community. The ongoing competition will remain available permanently (beyond the BTAS 2016 conference), and results will be updated automatically on a fully reproducible platform.

 

Participation

There are two modes of participation (participants can freely choose either one or both):

  • ONGOING PARTICIPANT: The competition exploits the potential of the BEAT platform, which was created under the FP7 EU BEAT project to promote reproducible research in biometrics. The platform provides the necessary tools to implement biometric technology evaluations, including: i) free access for the research community; ii) high-performance parallel computation; iii) a web environment with no software installation requirements; iv) Python code support with the most popular pattern recognition and machine learning libraries available; and v) public or private submissions. The competition provides instructions and examples to facilitate the participation of researchers without previous experience with BEAT. There is no limit on the number of systems evaluated, and results are automatically provided to participants on the platform (you know the performance of your systems before the submission deadline). Participants may choose whether their results (as well as their identities) are revealed or remain anonymous (only a brief description of the system is mandatory).

Important: participation in the ongoing competition and use of the platform does not imply publication of your code. The organizers do not have access to the code evaluated by the platform (if a private submission is chosen), only to the results obtained.

 

  • OFFLINE PARTICIPANT: The participant has access to the training set (labeled files) and the test set (unlabeled data). Participants can make up to fifteen submissions in total before the competition deadline. Algorithms are expected to be executed at the participants' premises (only score files will be sent). Once the submission period ends (see important dates), the results (in the form of classification scores) will be used by the organizers to calculate the performances, which will be notified to each participant after the submission deadline. The final results will be publicly available (participants may choose whether their names are revealed or remain anonymous; only a brief description of the system is mandatory).

 

Important: the dataset and evaluation protocols are exactly the same for both types of participation. The results of the competition will be presented together (although the type of participation will be noted).
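Performance in the final ranking below is reported as Equal Error Rate (EER). As an illustrative sketch (not the organizers' evaluation code), an EER can be estimated from genuine and impostor score lists by sweeping a decision threshold, assuming higher scores indicate a better (more genuine) match:

```python
# Illustrative EER estimate from score lists (not the organizers' code).
# Assumes higher scores mean a more genuine match.

def eer(genuine, impostor):
    """Sweep every observed score as a threshold and return (FAR + FRR) / 2
    at the point where false accept and false reject rates are closest."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # false accept rate
        frr = sum(s < t for s in genuine) / len(genuine)     # false reject rate
        if best is None or abs(far - frr) < abs(best[0] - best[1]):
            best = (far, frr)
    return (best[0] + best[1]) / 2

# Perfectly separable scores give an EER of 0.0
print(eer([0.9, 0.8, 0.7, 0.6], [0.5, 0.4, 0.3, 0.2]))  # 0.0
```

With overlapping score distributions the same sweep yields the operating point where false accepts and false rejects balance, which is the single number reported per system in the table below.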

 

Final Ranking (Offline Participation)

P1 = Indian Institute of Technology Kharagpur (India)

P2 = Federal University of Sergipe (Brazil)

P3 = Anonymous

P4 = U.S. Army Research Laboratory (USA)


 

Equal Error Rate (%)

             P1       P2       P3       P4
System 1     19.33    12.90    17.90    7.82
System 2     16.82    11.85    -        6.46
System 3     16.52    12.12    -        7.32
System 4     16.47    13.48    -        7.35
System 5     15.73    12.25    -        8.02
System 6     -        13.92    -        5.32
System 7     -        11.82    -        7.95
System 8     -        13.03    -        8.08
System 9     -        13.03    -        5.68
System 10    -        14.66    -        5.91
System 11    -        -        -        10.35
System 12    -        -        -        10.89
System 13    -        -        -        11.20
System 14    -        -        -        11.23
System 15    -        -        -        6.26

(- = no submission for that system slot)

Important dates

  • Registration: open now

  • Availability of the development set (Offline Participation): available now

  • Availability of the test set (Offline Participation): available now

  • Availability of the ongoing tool (Ongoing Participation): available now

  • Submission of the test set results (Offline Participation): April 22, 2016 (extended)

  • End of ongoing participation*: April 22, 2016 (extended)

  • Notification of the results: available now

  * The ongoing competition will continue, but only results submitted before the deadline will be included in the BTAS competition results.

  

For more information on BEAT Platform

  • Ongoing evaluation: https://www.beat-eu.org/platform/experiments/robertodaza/robertodaza/competition_kboc16/2/Kboc16_Competition_Baseline_Modified_Scaled_Manhattan_Distance/

  • User guide of the BEAT platform: https://www.beat-eu.org/platform/static/guide/index.html

  • BEAT platform website: https://www.beat-eu.org/platform/

  • Tutorial on the BEAT platform: http://www.idiap.ch/~marcel/professional/FG_2015.html

  • EU BEAT project website: https://www.beat-eu.org/

  • Bob (signal processing and machine learning toolbox useful for BEAT developers): http://idiap.github.io/bob/
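The baseline experiment linked above is named after a scaled Manhattan distance matcher. A minimal sketch of that distance family is shown below; the `enroll`/`score` helpers and the use of mean absolute deviation as the scaling factor are assumptions for illustration, not the organizers' implementation.

```python
# Minimal sketch of a scaled Manhattan distance matcher (the family the
# baseline experiment is named after). Hypothetical helpers, not the
# organizers' code.

def enroll(samples):
    """Build a template from equal-length enrollment feature vectors:
    per-feature mean and mean absolute deviation (MAD)."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[i] for s in samples) / n for i in range(d)]
    # Fall back to 1.0 if a feature has zero deviation, to avoid dividing by 0
    mad = [sum(abs(s[i] - mean[i]) for s in samples) / n or 1.0 for i in range(d)]
    return mean, mad

def score(template, probe):
    """Negative scaled Manhattan distance: higher score = better match."""
    mean, mad = template
    return -sum(abs(p - m) / a for p, m, a in zip(probe, mean, mad))

template = enroll([[100, 150], [110, 170]])
print(score(template, [110, 170]))  # -2.0
```

Scaling each feature's absolute difference by its deviation keeps highly variable timing features from dominating the distance, which is why this family is a common keystroke dynamics baseline.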

 

For more information on reproducible keystroke biometrics on BEAT platform

Aythami Morales, Mario Falanga, Julian Fierrez, Carlo Sansone, Javier Ortega-Garcia, "Keystroke Dynamics Recognition based on Personal Data: A Comparative Experimental Evaluation Implementing Reproducible Research", Proc. IEEE Seventh International Conference on Biometrics: Theory, Applications and Systems (BTAS 2015), Arlington, Virginia, USA, pp. 1-6, 2015. [download]

 

Organizing committee

  • (Chair) Dr. Aythami Morales, Universidad Autonoma de Madrid (aythami.morales at uam.es)

  • Dr. Julian Fierrez, Universidad Autonoma de Madrid (julian.fierrez at uam.es)

  • Dr. Javier Ortega-Garcia, Universidad Autonoma de Madrid (javier.ortega at uam.es)

  • Marta Gomez-Barrero, Universidad Autonoma de Madrid (marta.barrero at uam.es)

  • Roberto Daza Garcia, Universidad Autonoma de Madrid (roberto.daza at estudiante.uam.es)

 

Competition co-chairs:

  • Zhenan Sun (znsun at nlpr.ia.ac.cn)

  • Ross Beveridge (ross at cs.colostate.edu)

 

Please report any errors or inconsistencies to kboc2016competition@gmail.com.