Start Date: February 14
End Date: May 24
Competition URL: Hugging Face
Motivation
Automatic recognition of fungi species assists mycologists, citizen scientists, and nature enthusiasts in identifying species in the wild, and its availability supports the collection of valuable biodiversity data. In practice, species identification typically does not depend solely on visual observation of the specimen but also on other information available to the observer, such as habitat, substrate, location, and time. Thanks to rich metadata, precise annotations, and baselines available to all competitors, the challenge provides a benchmark for image recognition with the use of additional information. Moreover, the toxicity of a mushroom can be crucial for a mushroom picker's decision; within the competition, we will explore the decision process beyond the commonly assumed 0/1 cost function.
Task Description
Given a set of real fungi species observations and corresponding metadata, the goal of the task is to create a classification model that, for each observation (multiple photographs of the same individual + geographical location), returns a ranked list of predicted species. The classification model must fit within a memory footprint limit (ONNX model with a maximum size of 1 GB) and a prediction time limit (to be announced later), both measured on the submission server. The model should also consider and minimize the danger to human life, i.e., the confusion between poisonous and edible species.
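The 1 GB ONNX size limit above can be checked locally before submission. A minimal sketch, assuming only that the submitted artifact is a single ONNX file on disk (the function name and threshold constant are illustrative, not part of the official tooling):

```python
import os

MAX_MODEL_BYTES = 1 * 1024 ** 3  # 1 GB limit stated in the task description

def check_model_size(path):
    """Return True if the serialized ONNX model fits the 1 GB limit."""
    return os.path.getsize(path) <= MAX_MODEL_BYTES
```

Note that ONNX models can store large tensors in external data files; if you use that feature, the sizes of all associated files would need to be summed.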
Note: Since the test set contains multiple out-of-scope classes, solutions have to handle such classes.
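The asymmetric cost described above (confusing a poisonous species for an edible one is far worse than the reverse) can be sketched as a simple cost matrix over predictions. The cost values and function names below are hypothetical illustrations, not the official competition metric:

```python
# Hypothetical sketch of an asymmetric misclassification cost for the
# poisonous/edible decision problem; the actual evaluation metric may differ.

def confusion_cost(true_species, pred_species, poisonous,
                   cost_danger=100, cost_waste=5, cost_other=1):
    """Cost of predicting `pred_species` when the truth is `true_species`.

    poisonous   -- dict mapping species name -> bool
    cost_danger -- poisonous species predicted as edible (dangerous)
    cost_waste  -- edible species predicted as poisonous (harmless but wasteful)
    cost_other  -- any other misclassification
    """
    if true_species == pred_species:
        return 0
    if poisonous[true_species] and not poisonous[pred_species]:
        return cost_danger
    if not poisonous[true_species] and poisonous[pred_species]:
        return cost_waste
    return cost_other

def mean_cost(pairs, poisonous):
    """Average cost over (true, predicted) species pairs."""
    return sum(confusion_cost(t, p, poisonous) for t, p in pairs) / len(pairs)
```

A model that minimizes this kind of cost may deliberately prefer "poisonous" predictions under uncertainty, which is exactly the departure from the 0/1 cost function the competition aims to explore.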
Evaluation Process
This competition provides an evaluation ground for the development of methods suitable not just for fungi species recognition. We want you to evaluate new, bright ideas rather than merely finish first on the leaderboard. Thus, this year we will award authorship / co-authorship of a journal publication and payment of the Open Access fee.
The whole evaluation process will be divided into two parts:
(i) CSV-based evaluation, and
(ii) provided model evaluation on our private data.
The final performance will be reported only on the LifeCLEF website.
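For the CSV-based part of the evaluation, a submission file with ranked predictions per observation might be produced as follows. The column names and the semicolon-joined ranking are assumptions for illustration; the official submission format on the competition page takes precedence:

```python
import csv

def write_submission(path, predictions):
    """Write a ranked-prediction CSV.

    predictions -- dict mapping observation id -> list of species names,
                   ordered from most to least likely.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["observation_id", "species_ranked"])
        for obs_id, ranked in predictions.items():
            writer.writerow([obs_id, ";".join(ranked)])
```

The second, model-based part of the evaluation instead runs the submitted ONNX model directly on the organizers' private data, so the model itself must be fully self-contained.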
Metrics
Context
This competition is held jointly as part of:
the LifeCLEF 2023 lab of the CLEF 2023 conference, and of
the FGVC10 workshop, organized in conjunction with CVPR 2023 conference.
In order to participate in the LifeCLEF lab, participants are required to register using this form (and check the FungiCLEF task of LifeCLEF).
Only registered participants can submit a working-note paper to peer-reviewed LifeCLEF proceedings (CEUR-WS) after the competition ends.
This paper should provide sufficient information to reproduce the final submitted runs. Only participants who submitted a working-note paper will be part of the officially published ranking used for scientific communication.
Tentative Timeline
December 2022: Registration opens
14 February 2023: training data release
24 May 2023: deadline for submission of runs by participants
27 May 2023: release of processed results by the task organizers
7 June 2023: deadline for submission of working note papers by participants [CEUR-WS proceedings]
30 June 2023: notification of acceptance of working note papers [CEUR-WS proceedings]
7 July 2023: camera-ready copy of participants' working note papers and extended lab overviews by organizers
All deadlines are at 11:59 PM UTC on a corresponding day unless otherwise noted. The competition organizers reserve the right to update the contest timeline if they deem it necessary.
Organizers
Lukas Picek [PiVa AI / University of West Bohemia, Czechia]
Milan Sulc [Rossum.ai]
Jiri Matas [Czech Technical University in Prague, Czechia]
Jacob Heilmann-Clausen [University of Copenhagen, Denmark]
Rail Chamidullin [PiVa AI / University of West Bohemia, Czechia]