The few-shot image denoising track starts now! We have released the validation and training data. Check out the "Get data" page and prepare your submission!
2024.01.10 Challenge site online
2024.01.19 Validation server online
2024.03.01 Release of test data (input only), test server online
2024.03.06 Test results submission deadline, test server closed
2024.03.06 Fact sheets and code/executable submission deadline
Few-shot raw image denoising targets training neural networks for raw image denoising in scenarios where paired data is limited. Initially, the prevailing practice was to train denoising networks on paired data. However, as the difficulty of acquiring extensive paired data became apparent and noise modeling grounded in physical principles advanced, an increasing number of methods adopted synthetic-noise strategies. Nonetheless, these approaches require a multi-step process: 1) gathering calibration data, 2) computing noise model parameters, and 3) training networks from scratch. This process can be quite cumbersome, particularly when dealing with various sensor types, each requiring its own specialized denoising network. Consequently, leveraging few-shot paired data, which bypasses calibration data collection and noise parameter estimation, can effectively alleviate the limitations associated with calibration algorithms.
More details are found on the data section of the competition.
A starting-kit can be found on Github.
The training data is already made available to the registered participants.
Please check the terms and conditions for further rules and details.
Jin, Xin, et al. "Lighting every darkness in two pairs: A calibration-free pipeline for raw denoising." Proceedings of the IEEE/CVF International Conference on Computer Vision. 2023.
Wei, Kaixuan, et al. "Physics-based noise modeling for extreme low-light photography." IEEE Transactions on Pattern Analysis and Machine Intelligence 44.11 (2021): 8520-8537.
If you have any questions, please feel free to post threads on the 'Forum' tab and discuss related topics. You can also contact the organizers by sending an email with the subject line 'Few-shot RAW Image Denoising Inquiry'.
Because RAW data is too large for direct upload, our evaluation metrics are calculated in the sRGB color space. We assess performance by measuring the discrepancy between the results and the ground-truth images.
We employ the standard Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM) in grayscale, as is commonly used in the literature. The final evaluation metric is calculated with the following formula:
$$Score=\log_k(SSIM \cdot k^{PSNR})=PSNR+\log_k(SSIM)$$
In our implementation, $k=1.2$.
For the final ranking, we will use the average Score as the primary measure. The complexity of the algorithm will only serve as a reference and will not be included in the final metric. Please refer to the evaluation function in the 'evaluate.py' of the scoring program.
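The score combination above can be sketched as follows. This is a minimal illustration, not the official scoring program: the function names are hypothetical, and the official 'evaluate.py' may differ in details such as the PSNR peak value and the grayscale SSIM implementation.

```python
import numpy as np

def psnr(pred, gt, max_val=255.0):
    # Peak Signal-to-Noise Ratio in dB between two images.
    mse = np.mean((pred.astype(np.float64) - gt.astype(np.float64)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def challenge_score(psnr_val, ssim_val, k=1.2):
    # Score = log_k(SSIM * k^PSNR) = PSNR + log_k(SSIM), with k = 1.2.
    # Since SSIM <= 1, log_k(SSIM) <= 0, so the score is at most the PSNR.
    return psnr_val + np.log(ssim_val) / np.log(k)
```

Note that a perfect SSIM of 1.0 contributes nothing (log_k(1) = 0), while lower SSIM values subtract from the PSNR.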
The Few-shot RAW Image Denoising Challenge is one track of the MIPI Challenge, Mobile Intelligent Photography & Imaging Workshop 2024, held in conjunction with CVPR 2024. Participants are not restricted to training their algorithms only on the provided dataset; other PUBLIC datasets may be used as well. Participants are expected to develop more robust and generalized methods for few-shot raw image denoising in real-world scenarios.
When participating in the competition, please be reminded that:
Before downloading and using the dataset, please agree to the following terms of use. You, your employer, and your affiliations are referred to as "User". The organizers and their affiliations are referred to as "Producer".
@inproceedings{jiniccv23led,
  title={Lighting Every Darkness in Two Pairs: A Calibration-Free Pipeline for RAW Denoising},
  author={Jin, Xin and Xiao, Jia-Wen and Han, Ling-Hao and Guo, Chunle and Zhang, Ruixun and Liu, Xialei and Li, Chongyi},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2023}
}

@article{jin2023make,
  title={Make Explicit Calibration Implicit: "Calibrate" Denoiser Instead of The Noise Model},
  author={Jin, Xin and Xiao, Jia-Wen and Han, Ling-Hao and Guo, Chunle and Liu, Xialei and Li, Chongyi and Cheng, Ming-Ming},
  journal={arXiv preprint},
  year={2023}
}
Industry and research labs are allowed to submit entries and to compete in both the validation phase and the final test phase. However, in order to be officially ranked on the final test leaderboard and to be eligible for awards, reproducibility of the results is a must; therefore, participants need to make their code or executables available and submit them. All top entries will be checked for reproducibility and marked accordingly.
Start: Jan. 19, 2024, 7:59 a.m.
Description: The online evaluation results must be submitted through this CodaLab competition site of the Challenge.
Start: March 1, 2024, 7:59 a.m.
Description: The online evaluation results must be submitted through this CodaLab competition site of the Challenge.
March 6, 2024, 7:59 a.m.