CHallenge UP:

Multimodal Fall Detection

Overview

Submission Instructions

  1. First, register as a participant in this competition using the registration form found in section Registration.
  2. Using only the testing dataset, participants should send a single CSV file containing a column of timestamps and a column of class labels, with "timestamp" and "class" as the header row, representing the activity/fall detected each second, for all activities and attempts performed by the three subjects. See Submission File in section Evaluation.
  3. Participants should also include a detailed description of the proposed classification system, as a paper (6 to 8 pages) containing at least the following sections: methodology, results, and discussion.
  4. The CSV file and the paper should be sent to Hiram Ponce (hponce@up.edu.mx) before the deadline. See section Important Dates.
  5. Once all participants have sent their results and papers, the committee will compute the F1-score metric against the ground truth and assess the methodology provided. See section Evaluation.
  6. The winner will be the participant who obtains the best F1-score. If two or more participants tie for the best score, the committee will select the one with the best methodology (fewer sensors used for classification, and originality).
  7. The metric results, as well as the confusion matrix obtained, will be sent to the participants before the winners are announced.
  8. Three winners (the best scores) will be announced during the IJCNN 2019 conference.
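The submission file described in step 2 can be written with Python's standard csv module. A minimal sketch follows; the timestamp format and class-label names here are illustrative assumptions, not part of the official specification, so substitute the challenge's actual label set and timestamp convention.

```python
import csv

# Hypothetical per-second predictions: (timestamp, class label).
# Labels and timestamps below are placeholders for illustration only.
predictions = [
    ("2019-01-01 00:00:00", "walking"),
    ("2019-01-01 00:00:01", "walking"),
    ("2019-01-01 00:00:02", "fall"),
]

with open("submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "class"])  # required header row
    writer.writerows(predictions)
```

This produces one row per second of the testing dataset, with the two required columns in the stated order.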
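The F1-score evaluation in steps 5 and 6 can be sketched in plain Python. The challenge text does not specify how per-class scores are averaged; the macro average (equal weight per class) shown below is one common choice and is an assumption here, as are the toy labels.

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1, averaged with equal class weight.

    Assumption: the committee may use a different averaging scheme;
    this is only an illustrative sketch of the metric.
    """
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        scores.append(f1)
    return sum(scores) / len(scores)

# Toy ground truth vs. predictions (placeholder labels).
truth = ["walk", "walk", "fall", "walk", "fall"]
preds = ["walk", "fall", "fall", "walk", "fall"]
print(round(macro_f1(truth, preds), 3))  # → 0.8
```

Computing the metric locally this way lets participants sanity-check their CSV output against a held-out slice of the training data before submitting.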

Notes

  • Note that the potential winners will be contacted by the committee and asked for their source code, because the final results sent as CSV files will be compared with the results obtained by running that code directly. Only CSV files proved to be generated by the submitted source code will be used for the final judgement.
  • Authors of selected results and papers, at the judges' discretion, will be invited to extend their work into a book chapter in Springer's Cognitive Technologies series (see Prizes).