ICFHR2018 Competition on

Vietnamese Online Handwritten Text Recognition

using VNOnDB

Overview

The ICFHR2018 Competition on Vietnamese Online Handwritten Text Recognition using the HANDS-VNOnDB (VNOnDB for short) database is the first attempt to bring together researchers working on handwritten text recognition and to provide them with a proper benchmark for comparing their approaches to the task of transcribing Vietnamese online handwritten text. The goal of this competition is to encourage studies on Vietnamese online handwritten text recognition and to analyze the different approaches of the participants.

This competition (VNOnDB2018) is organized in the framework of the ICFHR 2018 competitions by the Nakagawa Laboratory, Department of Computer and Information Sciences, Tokyo University of Agriculture and Technology.

In order to share ideas and systems with other researchers, we encourage all participants to present their approaches in a conference paper at ICFHR 2018 and to publish their source code after the competition results have been announced.


Important dates, an overview of tasks in the competition, data formats, and other details are provided below.

Important Dates

  • Call for Participation & Registration: Jan. 20th ~ Feb. 28th, 2018
  • Extended registration deadline: Apr. 1st, 2018
  • Submission opens: Apr. 10th, 2018
  • Submission deadline: May 20th, 2018
  • Result notification: Jun. 1st, 2018

ICFHR2018 VOHTR Tasks

+ Task 1: Word level (VNOnDB-Word)

In Task 1, segmented handwritten words and their ground truth are provided. We verified the data and eliminated words containing long-distance delayed strokes, such as delayed strokes written after finishing other words or even a sentence. Thus, Task 1 evaluates the performance of recognizers on short-distance delayed strokes, since only short-distance delayed strokes remain in this task.

+ Task 2: Text line level (VNOnDB-Line)

In Task 2, text lines and their ground truth are provided. This task contains both long-distance and short-distance delayed strokes, which makes it appropriate for evaluating the robustness of systems against different kinds of delayed strokes.

+ Task 3: Paragraph level (VNOnDB-Paragraph)

In Task 3, handwritten paragraphs, which usually contain multiple text lines, and their paragraph-level ground truth, a long sequence of characters, are provided. Task 3 is suitable for measuring the limits of recognition systems on long sequences with many delayed strokes.

More information about the VNOnDB database.

Submission & Evaluation Protocol

Please read the Submission & Evaluation page for more details.

+ Submission

All participants are required to submit a description file of their approach for each task, together with source code (or executable files) and a manual for running the experiments. Each participant may submit several systems, and all results will be considered when presenting the competition results. In each submission, the participant must describe the submitted system, in particular the approach used in that system.

+ Evaluation Protocol

After the submission deadline, we will evaluate the performance of the submitted systems based on their transcription results. The evaluation metrics are the Character Error Rate (CER) and the Word Error Rate (WER). For each task, the submitted systems will be ranked according to CER and WER on the test set.
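The competition page does not include reference scoring code, but both metrics are conventionally defined as the Levenshtein edit distance between the recognized text and the ground truth, normalized by the reference length (over characters for CER, over whitespace-separated words for WER). The sketch below is an illustration of that standard definition, not the organizers' official scorer; all function names are our own.

```python
def levenshtein(ref, hyp):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn ref into hyp (classic dynamic-programming edit distance).
    Works on any sequence: a string for CER, a word list for WER."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution (free if equal)
        prev = curr
    return prev[-1]

def cer(ref, hyp):
    """Character Error Rate: character-level edit distance / reference length."""
    return levenshtein(ref, hyp) / max(len(ref), 1)

def wer(ref, hyp):
    """Word Error Rate: word-level edit distance / number of reference words."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return levenshtein(ref_words, hyp_words) / max(len(ref_words), 1)

print(cer("xin chào", "xin chao"))  # one substituted character out of 8 -> 0.125
print(wer("xin chào", "xin chao"))  # one wrong word out of 2 -> 0.5
```

Note that a Vietnamese word with a wrong diacritic costs a single character error but a whole word error, which is why the two metrics are ranked separately.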