Recent advances in Artificial Intelligence Generated Content (AIGC) have transformed synthetic image creation. Diffusion- and transformer-based generative models such as DALL·E, Stable Diffusion, and Imagen can now generate visually compelling, semantically rich, and diverse images across a wide range of applications. Despite this progress, AIGC images often contain non-traditional distortions, such as semantic inconsistencies, unnatural object structures, and perceptual artifacts, that conventional Image Quality Assessment (IQA) methods handle poorly.
Most existing IQA models are designed for traditional degradations such as blur, noise, or compression artifacts and rely heavily on large pretrained backbones and external data. This makes them unsuitable for evaluating AIGC content and impractical for resource-constrained and edge deployment scenarios.
The ICME 2026 Grand Challenge on Lightweight Neural Networks for AIGC Image Quality Assessment addresses these limitations by encouraging the development of compact, no-reference IQA models that are trained strictly from scratch and optimized for extreme efficiency while maintaining high correlation with human perceptual judgments.
Participants are expected to:
Design lightweight neural network architectures for no-reference AIGC IQA
Train models from scratch without using pretrained weights or external data
Achieve strong perceptual correlation under strict parameter and efficiency constraints
Explore perceptually-informed feature design, efficient learning strategies, and optimization techniques
The challenge promotes fair benchmarking, deployable solutions, and innovation aligned with ICME’s focus on efficient multimedia processing and edge intelligence.
The challenge leverages a diverse set of publicly available AIGC IQA datasets, covering multiple styles, scales, and content types:
AGIQA-1K: 1,080 AI-generated images with Mean Opinion Scores (MOS)
AIGCIQA2023: 2,400 images evaluated across quality, authenticity, and correspondence
AGIQA-3K: 3,000 images with fine-grained subjective annotations
AIGCOIQA2024: 300 omnidirectional (360°) AIGC images scored for quality, comfort, and correspondence
AIGIQA-20K: 20,000 images annotated for perceptual quality and prompt alignment
Standardized dataset splits will be provided:
Training set (labeled)
Public validation set (hidden labels)
Private test set (used for final ranking)
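The call does not specify the exact label-file format for these splits. As a minimal sketch, assuming a hypothetical per-split CSV with `image` and `mos` columns, the training labels could be parsed like this:

```python
import csv
import io

def load_mos_labels(csv_text):
    """Parse (image_name, MOS) pairs from CSV text with 'image' and 'mos' columns.

    The column names are an assumption for illustration; the released splits
    may use a different layout.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row["image"], float(row["mos"])) for row in reader]

# Hypothetical two-row label file.
sample = "image,mos\nimg_0001.png,3.42\nimg_0002.png,1.87\n"
labels = load_mos_labels(sample)
```

Validation labels are hidden, so participants would only submit predicted scores for those images.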
Task: No-reference perceptual quality prediction for AIGC images
Input: Single AI-generated image
Output: Predicted perceptual quality score
❌ No pretrained models
❌ No external datasets
✅ Training strictly from scratch
📦 Model size ≤ 10 million parameters
👥 Teams of up to 5 members
🔁 Full reproducibility required (code + executable submission)
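The 10-million-parameter cap can be verified before submission by summing the sizes of all learnable tensors. A framework-agnostic sketch, using a hypothetical small CNN's layer shapes for illustration:

```python
def count_params(layer_shapes):
    """Total learnable parameters, given one shape tuple per weight/bias tensor."""
    total = 0
    for shape in layer_shapes:
        n = 1
        for dim in shape:
            n *= dim
        total += n
    return total

# Hypothetical compact CNN: conv weights as (out, in, kh, kw), biases,
# and a single-output quality-score head.
layers = [
    (32, 3, 3, 3), (32,),      # conv1
    (64, 32, 3, 3), (64,),     # conv2
    (128, 64, 3, 3), (128,),   # conv3
    (1, 128), (1,),            # regression head
]
total = count_params(layers)
```

In PyTorch the same check is typically `sum(p.numel() for p in model.parameters())`; the limit applies to the full model, not just the backbone.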
Submissions are evaluated on both prediction accuracy and model efficiency, using the following metrics:
Pearson Linear Correlation Coefficient (PLCC)
Spearman Rank-Order Correlation Coefficient (SRCC)
Kendall Rank Correlation Coefficient (KRCC)
Root Mean Square Error (RMSE)
Score = (PLCC + SRCC + KRCC) / (3 × log₂(Model Size))
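These metrics can be reproduced locally for sanity-checking. Below is a pure-Python reference sketch: Spearman is Pearson on (average) ranks, and Kendall is computed here as the tau-a variant, which matches the standard definition only when scores are tie-free. Interpreting "Model Size" in the composite formula as the parameter count is an assumption; the organizers' evaluation script is authoritative.

```python
import math

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def _ranks(x):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    ranks = [0.0] * len(x)
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def plcc(pred, mos):
    return _pearson(pred, mos)

def srcc(pred, mos):
    return _pearson(_ranks(pred), _ranks(mos))

def krcc(pred, mos):
    # Kendall tau-a: (concordant - discordant) / all pairs; assumes no ties.
    n, conc, disc = len(pred), 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (pred[i] - pred[j]) * (mos[i] - mos[j])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

def rmse(pred, mos):
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(pred, mos)) / len(pred))

def composite_score(p, s, k, n_params):
    # Score = (PLCC + SRCC + KRCC) / (3 * log2(model size)),
    # with model size taken as the parameter count (assumption).
    return (p + s + k) / (3 * math.log2(n_params))
```

For example, a perfectly monotonic prediction gives SRCC = KRCC = 1 regardless of the scale of the scores, while PLCC and RMSE also reward linear agreement with the MOS values.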
Private test composite score: 50%
Public validation composite score: 30%
Novelty & technical quality (4-page short paper): 20%
Ties are broken using RMSE and model size.
Participants must submit:
Model predictions on the test sets
Full source code and trained weights
Participants in the Grand Challenge are encouraged to submit short papers (up to 4 pages); accepted papers will be published in the conference workshop proceedings.
Submission Deadline: March 25, 2026
Feb 10, 2026: Challenge website launch, registration opens, training data released
Feb 15, 2026: Baseline code & evaluation scripts released
Mar 25, 2026: Challenge closes & paper submission deadline
Apr 5, 2026: Paper acceptance notification
Apr 10, 2026: Camera-ready submission
Jul 5-9, 2026: ICME 2026 Grand Challenge session & winner announcement
Patrick Le Callet
Professor
Nantes Université, France
Kumar Rahul
Senior Research Scientist
Amazon Prime Video, USA
Hitika Tiwari
Assistant Professor
IIT Madras Zanzibar, Tanzania
Industry partners are welcome to sponsor prizes, awards, or travel grants. Please contact the organizers for collaboration opportunities.
If you have any questions or concerns, please contact us: icme26-aigciqa-gc@googlegroups.com