The 1st Workshop On Multimodal Product Identification in Livestreaming and WAB Challenge 2021
Call For Papers
Important Dates
Submission deadline for workshop papers: August 10th, 2021, AoE (extended from July 30th, 2021)
Notification of acceptance: August 26th, 2021
Camera-ready: September 2nd, 2021
Scope
Submissions are expected to address visual and multimodal perception tasks, including but not limited to:
Applications of computer vision to instance-level recognition in video
Object localization and open-set identification
Cross-modal retrieval
Video object detection and tracking techniques
Retrieval and ranking techniques
Fine-grained object recognition
Real-time deep learning inference
Multimodal analysis techniques
Paper submission
CMT Submission Website is available now at https://cmt3.research.microsoft.com/ACMMM2021
Please choose "Track: 1st Workshop on Multimodal Product Identification in Livestreaming and WAB Challenge" to submit your workshop paper.
Authors are invited to submit papers electronically in two-column format, 3 to 8 pages in length, not including references, using the ACM Article Template (https://www.acm.org/publications/proceedings-template). There is no distinction between full and short papers; authors may decide on the appropriate length for their work, and all papers will undergo the same review process and review period. Accepted papers will be published in the ACM Digital Library alongside the ACM Multimedia main conference papers.
Paper submissions follow a "single-blind" review policy (author names and affiliations may be listed). All papers will be peer-reviewed by experts in the field, and acceptance will be based on relevance to the workshop, scientific novelty, and technical quality. Depending on the number, maturity, and topics of the accepted submissions, the work will be presented in oral or poster sessions.