Night Photography Rendering Challenge 2025


NTIRE Workshop and Challenges @ CVPR 2025

 

Night Photography Rendering Challenge

 


 

Important dates

 

  • 07.02.2025: Release of the training data and the first validation data
  • 18.02.2025: Release of the second validation data
  • 21.02.2025: Deadline for submission in the second validation checkpoint
  • 23.02.2025: The people's choice results for the second validation dataset are published
  • 02.03.2025: Release of the third validation data
  • 04.03.2025: Deadline for submission in the third validation checkpoint
  • 06.03.2025: The people's choice results for the third validation dataset are published
  • 11.03.2025: Release of the fourth validation data
  • 13.03.2025: Deadline for submission in the fourth validation checkpoint
  • 15.03.2025: The people's choice results for the fourth validation dataset are published
  • 19.03.2025: Release of the test data
  • 21.03.2025: Deadline for submission in the test checkpoint. Please note: we require Docker images and .tex reports of your solutions
  • 24.03.2025: Final results are released and the winners of the challenge are declared
  • 17.06.2025: NTIRE workshop and challenges, results and award ceremony (CVPR 2025, Online)

 


 

News

 

Final Results Released!

Dear participants,
We are excited to share the final results of the 2025 Night Photography Rendering Challenge.

A total of 10 submissions were received by the deadline. All of the submissions outperformed the baseline solution in both objective metrics and perceptual quality and thus will be included in the report.

Objective metric evaluation

The winners of the objective metric evaluation category are:

  • 1st place: NJUST_KMG, Nanjing University of Science and Technology, China
  • 2nd place: Mialgo, Xiaomi Inc., China
  • 3rd place: PSU Team, Prince Sultan University, Saudi Arabia

The detailed results are presented below.

Team         SSIM    SSIM Rank   PSNR    PSNR Rank   Final Score
NJUST_KMG    0.810   1           23.82   1           1.0
Mialgo       0.799   2           23.06   2           2.0
PSU Team     0.774   3           22.21   3           3.0
psykhexx     0.771   4           21.69   4           4.0
POLYU-AISP   0.758   6           21.23   5           5.6
sorange      0.760   5           20.91   7           5.8
NoTeam       0.722   7           21.22   6           6.6
AITH_ITMO    0.718   8           20.58   8           8.0
colab        0.656   9           19.15   9           9.0
OzU-VGL      0.640   10          17.18   10          10.0
Baseline     0.386   -           9.55    -           -

Perceptual quality evaluation

The winners of the perceptual quality evaluation category are:

  • 1st place: Mialgo, Xiaomi Inc., China
  • 2nd place: NJUST_KMG, Nanjing University of Science and Technology, China
  • 3rd place: PSU Team, Prince Sultan University, Saudi Arabia

The detailed results are presented below.

Team         Score
Mialgo       128.49
NJUST_KMG    123.73
PSU Team     116.60
psykhexx     106.43
POLYU-AISP   101.44
sorange      95.65
AITH_ITMO    73.68
colab        64.14
NoTeam       58.58
OzU-VGL      35.56
Baseline     6.53

The winners will be contacted in the coming days with details on how to receive their prizes.

We sincerely congratulate the winners and thank all teams for participating in our challenge. We wish you the best of luck in all your future endeavors.

Best regards,
The NTIRE Night Photography Rendering Challenge Team

 

Final Testing Phase Begins

 

Dear participants,

We announce the start of the final Testing phase of our challenge. To participate in the final checkpoint, two conditions must be met:

  • you should fill out the registration form if you have not done so already. The registration form can be accessed via this link;
  • you should submit your solution as a Docker container and a report in TeX format via this form; you will find further guidelines inside. Your solutions are due 21.03.2025 23:59 UTC+0. You are only allowed one submission and cannot edit your answer.

Note the following:

  • For this checkpoint, you are not supposed to submit any images; instead, we will run your solutions on 200 RAW images that match the format of the data you used during the entire competition.
  • Both a Docker solution and a TeX report are required to be eligible for participation. Additionally, to be included in the final report, your solution is required to outperform the baseline solution we provided, as evaluated by the objective metrics.
  • We ultimately reserve the right to reject a solution if it does not follow the submission guidelines or fails to generate output properly.

We also impose an additional tie-breaking rule for evaluation. If two teams tie for a podium place (1st to 3rd), the tie is resolved in favor of the team with the higher SSIM; if the SSIM scores are also equal, the teams will share the tied place.

The results will be available on the CodaLab page and on the official webpage of the challenge after evaluation.

If you have any additional questions, please let us know. We wish you the best of luck!

 

Validation 4 Results Released!

 

Dear participants,

We are excited to share the intermediate results of our competition based on the Validation 4 checkpoint.

For this evaluation, we considered all relevant solutions submitted before 14.03.2025, 00:00 UTC+0. A total of 11 submissions were received by the deadline.

Objective metric evaluation

We remind you that for objective metric evaluation, your final score is given by 0.4 * PSNRRank + 0.6 * SSIMRank. The standings are as follows:

Participant        PSNR    PSNR Rank   SSIM    SSIM Rank   Final Score
Daniil_S           21.60   1           0.731   1           1.0
Gideon             21.35   2           0.730   2           2.0
Stangeriness       20.44   3           0.728   3           3.0
ps                 20.41   4           0.703   4           4.0
AnasM              20.12   5           0.702   5           5.0
MaXiaoyang         19.77   7           0.697   6           6.4
zongqihe           19.89   6           0.695   8           7.2
gwxysyhljt         19.44   8           0.696   7           7.4
camera_mi          17.95   9           0.668   9           9.0
gwan               17.28   10          0.630   10          10.0
azwaad.mohiuddin   7.71    11          0.120   11          11.0

Perceptual quality evaluation

We remind you that for the perceptual quality evaluation, we conducted a pairwise comparison of images generated by the solutions against the original DSLR images. Survey participants were asked to select which image in each pair better matched the original DSLR image. The score represents the average win percentage across all evaluated images. The standings are as follows:

Participant        Score
Gideon             116.51
Daniil_S           106.90
Stangeriness       99.25
zongqihe           88.62
ps                 80.50
gwxysyhljt         75.57
MaXiaoyang         71.37
gwan               54.34
AnasM              47.18
camera_mi          43.50
azwaad.mohiuddin   0.56

Submissions for the final Testing phase will open soon. Please be ready to submit your solutions as Docker containers, along with reports describing your solution.

Thank you for participating and best of luck in the final checkpoint!

 

Validation 4 and Alternative Submission Options

 

Dear participants,

We are happy to announce the start of the Validation 4 checkpoint. You can find the data in the Participate/Get Data section of the CodaLab page. Please submit your solutions by 13.03.2025 23:59 UTC+0.

Due to certain issues with the CodaLab website, and in preparation for the final Testing phase, we provide an alternative way to submit your solution. You can fill out the form below and attach the resulting images as well as your Docker solutions. Docker containers are not required for this checkpoint; however, we encourage you to provide them, as doing so will streamline the process of running your final solutions. Instructions can be found in the form.

Link to the submission form

Responses submitted through the form will be run manually. Because of this, participants are limited to two submissions per day. Please only use this method of submission when submitting through CodaLab is impossible. The results will be maintained in a table on the CodaLab page. Please note that it will not be updated in real time.

Solutions sent through both CodaLab and Google Forms will be eligible for the Validation 4 checkpoint MOS evaluation. The most recent submission (whether via CodaLab or the form) will be used for evaluation.

We remind you that submitting your docker solutions as well as reports will be necessary for participating in the Testing phase.

We wish you the best of luck.

Google Forms Leaderboard (last updated on 13.03.2025 17:30 UTC+0)

Participant   PSNR    SSIM
NJUST-KMG     20.33   0.72
CajunDADL     9.41    0.13

 

Validation 3 Results Released!

 

Dear participants,

We are excited to share the intermediate results of our competition based on the Validation 3 checkpoint.

For this evaluation, we considered all relevant solutions submitted before 05.03.2025, 00:00 UTC+0. A total of 8 submissions were received by the deadline. Participants who were unable to submit on time are encouraged to take part in the upcoming Validation 4 checkpoint.

Objective metric evaluation

We remind you that for the objective metric evaluation, your final score is given by 0.4 * PSNRRank + 0.6 * SSIMRank (for example, Hazzy below scores 0.4 * 2 + 0.6 * 1 = 1.4). The standings are as follows:

Participant    PSNR    PSNR Rank   SSIM   SSIM Rank   Final Score
Hazzy          21.34   2           0.73   1           1.4
Daniil_S       22.12   1           0.72   3           2.2
Stangeriness   21.24   3           0.73   2           2.4
Gideon         21.20   4           0.72   4           4.0
AnasM          18.79   6           0.68   5           5.4
mialgo_ls      18.84   5           0.64   7           6.2
camera_mi      18.47   8           0.66   6           6.8
gwxysyhljt     18.69   7           0.64   8           7.6

Perceptual quality evaluation

For the perceptual quality evaluation, we conducted a pairwise comparison of images generated by the solutions against the original DSLR images. Participants were asked to select which image in each pair better matched the original DSLR image. The score represents the average win percentage across all evaluated images. The standings are as follows:

Participant    Score
Gideon         52.29
Hazzy          45.66
Daniil_S       42.23
Stangeriness   40.42
gwxysyhljt     23.51
mialgo_ls      23.15
camera_mi      17.39
AnasM          16.26

We remind you that the data for the next Validation 4 checkpoint will be released soon. Thank you for participating in our challenge!

 

MOS Delay and Additional Training Data

Dear participants,

We notify you that, due to certain technical difficulties we have experienced with collecting the MOS scores, the results are delayed. We deeply apologize for this inconvenience. The results will be made available as soon as possible.

To accommodate this delay, the release and the deadline for the fourth validation checkpoint will be adjusted. The checkpoint data will be released on 11.03.2025, and the solutions will be due 13.03.2025. Please see the updated timeline.

To stimulate the development of your solutions, we decided to release additional data: the camera images for the Validation 2 checkpoint are now available to you. Together with the raw images from the Validation 2 checkpoint, they can be used as additional training data for your models. The download link is in the "Get Data" section of the CodaLab page.

We will soon provide details on how to submit your final docker solutions and reports. Thank you for your interest in our challenge.

 

Additional Validation Checkpoint, Changes to the Timeline and Final Ranking Process Info

Dear participants,

We want to notify you about important changes coming to our challenge. Because we were unable to collect MOS scores during the Validation 2 checkpoint, we decided to add an additional Validation 4 checkpoint that will also include MOS calculation, so that you can receive additional feedback on your solutions.

This also implies changes to the timeline. The testing phase will be postponed. Please see the updated timeline.

Finally, we announce how your final ranking on the objective metric leaderboard will be evaluated. First, all participants will be ranked by their PSNR and SSIM scores; each participant will receive a PSNRRank and an SSIMRank. The final score is then calculated as Score = 0.4 * PSNRRank + 0.6 * SSIMRank, and the participants with the lowest scores will be considered the winners. A minimal sketch of this scheme is shown below.
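
As an illustration, here is a minimal Python sketch of the ranking scheme. The exact tie-handling of the official evaluation is not specified, so the use of scipy's rankdata (with its default average-rank tie behavior) and the example values are assumptions, not the official scoring code:

```python
# Hypothetical sketch of the objective ranking scheme; values are made up.
from scipy.stats import rankdata

def final_scores(psnr: dict, ssim: dict) -> dict:
    """psnr, ssim map team name -> metric value (higher is better)."""
    teams = list(psnr)
    # rankdata ranks ascending, so negate to give rank 1 to the best score
    psnr_rank = rankdata([-psnr[t] for t in teams])
    ssim_rank = rankdata([-ssim[t] for t in teams])
    return {t: float(0.4 * pr + 0.6 * sr)
            for t, pr, sr in zip(teams, psnr_rank, ssim_rank)}

# Hypothetical teams: B has the best SSIM, A the best PSNR; B wins overall.
print(final_scores(psnr={"A": 22.1, "B": 21.3, "C": 20.4},
                   ssim={"A": 0.72, "B": 0.73, "C": 0.70}))
# {'A': 1.6, 'B': 1.4, 'C': 3.0} -> the lowest score wins
```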

If you have any questions, please let us know.

 


 

Challenge overview

 

Welcome to the "Night Photography" challenge, part of the NTIRE workshop at CVPR 2025. The challenge seeks to advance image processing methods for night photography by addressing the complexities of rendering images captured at night (this year, raw mobile phone captures) and evaluating the results based on perceptual quality and computational efficiency.

Cameras rely on onboard processing to transform raw sensor data into polished photographs, typically encoded in a standard color space such as sRGB. Night photography, however, presents unique challenges not encountered in daylight scenes. Night images often feature complex lighting conditions, including multiple visible illuminants, and are characterized by distinctive noise patterns that make conventional photo-finishing techniques for daytime photography unsuitable. Additionally, widely used image quality metrics like SSIM and LPIPS are often ineffective for evaluating the nuances of night photography. In previous editions of this challenge, participants were tasked with processing raw night scene images into visually pleasing renders, assessed subjectively using mean opinion scores. These efforts have driven significant advancements in the field of night image processing.

This year's challenge introduces a new and fundamentally different approach while retaining the use of the mean opinion score as an evaluation metric. Moving away from purely subjective aesthetics, the 2025 challenge will adopt a potentially more objective framework by leveraging paired datasets captured with a Huawei smartphone and a high-end Sony camera using a beam splitter to ensure identical perspectives. Participants will be provided with raw Huawei smartphone images as input and corresponding processed Sony camera images as the expected output. The goal is to develop algorithms that transform the raw Huawei images—characterized by higher noise levels, vignetting, and reduced detail—into outputs that convincingly resemble the high-quality processed Sony images.
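
To make the task concrete, the following is a deliberately naive Python sketch of a raw-to-sRGB rendering pipeline. It assumes an RGGB Bayer mosaic normalized to [0, 1]; the actual challenge data format, the provided baseline, and any competitive solution are far more involved (denoising, devignetting, tone mapping, learned components, and so on):

```python
import numpy as np

def naive_render(bayer: np.ndarray) -> np.ndarray:
    """Naive RAW -> sRGB rendering of an RGGB Bayer mosaic in [0, 1].

    Half-resolution demosaicing + gray-world white balance + gamma.
    A conceptual illustration of the task only, not the challenge baseline.
    """
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    b = bayer[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1)

    # Gray-world white balance: scale each channel toward the global mean
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    rgb = rgb * (channel_means.mean() / channel_means)

    # Simple gamma in place of the exact sRGB transfer curve
    rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / 2.2)
    return (rgb * 255.0).astype(np.uint8)
```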

Mean opinion score will be employed to evaluate submissions, ensuring that human viewers, rather than automated metrics that might be exploited, determine how convincingly the processed Huawei images align with the processed Sony images. This focus on human perception underscores the importance of developing rendering algorithms that produce results that are not only technically accurate but also perceptually close to the target images.

By addressing the unique limitations of mobile phone raw images and the inherent complexities of night photography, the 2025 challenge seeks to advance the state of the art in image processing. The combination of objective input-output alignment and perceptual validation through mean opinion scores provides a robust framework for fostering innovation in mobile-focused night photography rendering, where objectivity plays a more pronounced role but is not expected to be abused in any way, thanks to the final subjective oversight.

 


 

Challenge Goal and Uniqueness

 

This challenge tackles the complexities of nighttime photography by leveraging paired datasets of raw Huawei smartphone images and processed Sony camera images, providing a clear ground truth for evaluation. The goal is to develop algorithms that process raw Huawei images to convincingly resemble the high-quality Sony outputs, addressing a long-standing challenge in computer vision.

Night photography is vital for applications like surveillance and security and also has artistic significance in creating stunning images. By combining objective ground-truth comparisons with human perception through mean opinion scores, the challenge ensures that results are both technically accurate and visually convincing.

This unique framework bridges mobile device constraints, low-light conditions, and human-centric evaluation, advancing the state of the art in night image processing.

 


 

Reporting

 

In order to be eligible for the prizes, participants are required to submit code and a report about their solution, in the form of a short paper, at submission time. If this report is not submitted, participants who would otherwise win a prize will be passed over.

 


 

Prizes

 

There will be two prize categories:

  1. Objective metrics (best score based on PSNR and SSIM)
  2. Subjective metrics (best score based on MOS)

The top three places in each category will be awarded 1000 USD, 650 USD, and 350 USD for first, second, and third place, respectively.

Winners will receive a winner certificate and will have the opportunity to submit a paper to NTIRE 2025 and to participate in the common challenge report, which will also be submitted to the CVPR workshop.

 


 

Contacts and Questions

 

Please direct any and all questions to nightphotochallenge [at] gmail.com.


Performance Evaluation

 

In this challenge, each submission will be evaluated based on:

1. Objective metrics: the standard Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM);
2. Perceptual quality, measured by the Mean Opinion Score (MOS).


PSNR and SSIM are used as commonly defined in the literature; implementations can be found in most image processing toolboxes. For each dataset, we report the average results over all the processed images belonging to it. The evaluation of the reconstructed results consists of comparing the rendered smartphone images with the ground-truth camera images. A sketch of a typical metric computation is given below.
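
For reference, here is a hedged sketch of how these metrics are typically computed with scikit-image; the organizers' exact evaluation script may differ (for instance, in color space handling or data range):

```python
# A typical metric computation with scikit-image; not the official
# evaluation script, whose exact settings are not published here.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """pred, gt: uint8 RGB images of identical shape (H, W, 3)."""
    psnr = peak_signal_noise_ratio(gt, pred, data_range=255)
    ssim = structural_similarity(gt, pred, channel_axis=-1, data_range=255)
    return psnr, ssim

# Dataset score = average over all processed images:
# psnrs, ssims = zip(*(evaluate(p, g) for p, g in image_pairs))
# print(np.mean(psnrs), np.mean(ssims))
```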

Terms and Conditions

These are the official rules (terms and conditions) that govern how the NTIRE Workshop 2025 challenge on Night Photography Rendering will operate. This challenge will be simply referred to as the "challenge" or the "contest" throughout the remaining part of these rules and may be named as "NTIRE" or "Night Photography Rendering" benchmark, challenge, or contest, elsewhere (our webpage, our documentation, other publications).

In these rules, "we", "our", and "us" refer to the organizers (nightphotochallenge [at] gmail.com) of NTIRE challenge and "you" and "yourself" refer to an eligible contest participant.

Note that these official rules can change during the contest, up until the start of the final phase. If at any point during the contest a registered participant considers that they can no longer meet the eligibility criteria, or does not agree with changes to the official terms and conditions, then it is the participant's responsibility to email the organizers to be removed from all records. Once the contest is over, no change is possible in the status of the registered participants and their entries.


1. Contest description

This is a skill-based contest and chance plays no part in the determination of the winner(s).

The goal of the contest is to render an sRGB image from the provided RAW input; the challenge is called Night Photography Rendering.

Focus of the contest: a dataset adapted to the specific needs of the challenge will be made available. The images have a large diversity of content. We will refer to this dataset, its partitions, and related materials as the Night Photography Rendering Dataset. The dataset is divided into training, validation, and testing data. We focus on the perceptual quality of the results; the aim is to achieve RGB images with the best fidelity (PSNR+SSIM or MOS) to the reference ground truth from the DSLR camera. The participants will not have access to the ground-truth images for the test data. The ranking of the participants is according to the performance of their methods on the test data. The winners will be determined according to their entries, the reproducibility of the results and uploaded code, and the above-mentioned criteria, as judged by the organizers.


2. Tentative contest schedule

The registered participants will be notified by email if any changes are made to the schedule. The schedule is available on the Night Photography Rendering web page and on the Overview of the Codalab competition.


3. Eligibility

You are eligible to register and compete in this contest only if you meet all the following requirements:

  • you are an individual or a team of people willing to contribute to the open tasks, who accepts to follow the rules of this contest
  • you are not an NTIRE challenge organizer or an employee of NTIRE challenge organizers
  • you are not involved in any part of the administration and execution of this contest
  • you are not a first-degree relative, partner, household member of an employee or of an organizer of NTIRE challenge or of a person involved in any part of the administration and execution of this contest

This contest is void wherever it is prohibited by law.

Entries submitted but not qualified to enter the contest are considered voluntary, and for any entry you submit, NTIRE reserves the right to evaluate it for scientific purposes; however, under no circumstances will such entries qualify for sponsored prizes. If you are an employee of, affiliated with, or a representative of any of the NTIRE challenge sponsors, then you are allowed to enter the contest and be ranked; however, if you rank among the winners with eligible entries, you will receive only a diploma award and none of the sponsored money, products, or travel grants.

NOTE: industry and research labs are allowed to submit entries and to compete in both the validation phase and the final test phase. However, in order to be officially ranked on the final test leaderboard and to be eligible for awards, the reproducibility of the results is a must; therefore, the participants need to make their code or executables available and submit them. All the top entries will be checked for reproducibility and marked accordingly.

 

4. Entry

In order to be eligible for judging, an entry must meet all the following requirements:

Entry contents: the participants are required to submit image results and code. Please note that the code of the top-ranking participants (the top three in each category) will be publicly released and made accessible online. All the participants are also invited (not mandatory) to submit a paper for peer review and publication at the NTIRE Workshop and Challenges (to be held online in 2025).

Use of data provided: all data provided by NTIRE are freely available to the participants from the website of the challenge. The data are available only for open research and educational purposes, within the scope of the challenge. NTIRE and the organizers make no warranties regarding the database, including but not limited to warranties of non-infringement or fitness for a particular purpose. The copyright of the images remains the property of their respective owners. By downloading and making use of the data, you accept full responsibility for using the data. You shall defend and indemnify NTIRE and the organizers, including their employees, Trustees, officers, and agents, against any and all claims arising from your use of the data. You agree not to redistribute the data without this notice.

  • Test data: The organizers will use the test data for the final evaluation and ranking of the entries. The ground truth test data will not be made available to the participants during the contest.
  • Training and validation data: The organizers will make available to the participants a training dataset with ground truth images and four validation datasets without ground truth images. At the start of the final phase, the test data without ground truth images will be made available to the registered participants.
  • Submission: the entries will be submitted online via the CodaLab web platform. During the development phase, while the validation server is online, the participants will receive immediate feedback on validation data based on the objective metrics. The assessments based on Mean Opinion Scores will be published according to the challenge timeline. The final perceptual evaluation will be computed on the test data submissions; the final scores will be released after the challenge is over.
  • Original work, permissions: In addition, by submitting your entry into this contest you confirm that to the best of your knowledge: - your entry is your own original work; and - your entry only includes material that you own, or that you have permission to use.


5. Potential use of entry

Other than what is set forth below, we are not claiming any ownership rights to your entry. However, by submitting your entry, you:

Are granting us an irrevocable, worldwide right and license, in exchange for your opportunity to participate in the contest and potential prize awards, for the duration of the protection of the copyrights to:

  1. Use, review, assess, test and analyze the submission, the results produced by your code or executables or other material submitted by you in this challenge in connection with this contest or any future work done by the organizers; and
  2. Feature your entry and all its content in connection with the promotion of this contest or related in all media (now known or later developed);

Agree to sign any necessary documentation that may be required for us and our designees to make use of the rights you granted above;

Understand and acknowledge that we and other entrants may have developed or commissioned materials similar or identical to your submission and you waive any claims you may have resulting from any similarities to your entry;

Understand that we cannot control the incoming information you will disclose to our representatives or our co-sponsor’s representatives in the course of entering, or what our representatives will remember about your entry. You also understand that we will not restrict work assignments of representatives or our co-sponsor’s representatives who have had access to your entry. By entering this contest, you agree that use of information in our representatives’ or our co-sponsor’s representatives’ unaided memories in the development or deployment of our products or services does not create liability for us under this agreement or copyright or trade secret law;

Understand that you will not receive any compensation or credit for use of your entry, other than what is described in these official rules.

If you do not want to grant us these rights to your entry, please do not enter this contest.


6. Submission of entries

The participants will follow the instructions on the CodaLab website to submit entries.

The participants will be registered as mutually exclusive teams. Each team is allowed to submit only a single final entry. We are not responsible for entries that we do not receive for any reason, or for entries that we receive but do not work properly.

The participants must follow the instructions and the rules. We will automatically disqualify incomplete or invalid entries.


7. Prizes and Awards

The financial sponsors of this contest are listed on the Night Photography Rendering Challenge web page.


8. Other Sponsored Events

Publishing papers is optional and will not be a condition for entering the challenge or winning prizes. The top-ranking participants are invited to submit a paper following CVPR 2025 author rules, for peer-reviewing to the NTIRE workshop.

The results of the challenge will be published together with NTIRE 2025 workshop papers in the 2025 CVPR Workshops proceedings.

The top-ranked participants, and participants contributing interesting and novel methods to the challenge, will be invited to be co-authors of the challenge report paper, which will be published in the 2025 CVPR Workshops proceedings. A detailed description of the ranked solution as well as the reproducibility of the results are a must for eligible co-authorship.


9. Notifications

If there is any change to data, schedule, instructions of participation, or these rules, the registered participants will be notified on the competition page and/or at the email they provided with the registration.

Within seven days following the determination of winners, we will send a notification to the potential winners. If the notification that we send is returned as undeliverable, or you are otherwise unreachable for any reason, we may award the prize to an alternate winner, unless forbidden by applicable law.

The prize will be delivered to the registered team leader, given that the team is not affiliated with any of the sponsors. It is up to the team to share the prize. If this person becomes unavailable for any reason, the prize will be delivered to the authorized account holder of the e-mail address used to make the winning entry.

If you are a potential winner, we may require you to sign a declaration of eligibility, use, indemnity, and liability/publicity release, and applicable tax forms. If you are a potential winner and are a minor in your place of residence, we require that your parent or legal guardian be designated as the winner, and we may require that they sign a declaration of eligibility, use, indemnity, and liability/publicity release on your behalf. If you (or your parent/legal guardian, if applicable) do not sign and return these required forms within the time period listed in the winner notification message, we may disqualify you (or the designated parent/legal guardian) and select an alternate winner.

 


The terms and conditions are inspired by, and use verbatim text from, the terms and conditions of the ChaLearn Looking at People challenges and of the NTIRE challenges.


 

Organizers

 


 

The Night Photography Rendering challenge is organized jointly with the NTIRE 2025 workshop. The results of the challenge will be published at the CVPR 2025 workshops proceedings.

Please direct any and all questions to nightphotochallenge [at] gmail.com.

More information about NTIRE workshop and challenge organizers is available here: https://cvlai.net/ntire/2025/.

 

This challenge is powered by Artificial Intelligence Research Institute (AIRI Institute).

List of organizers:

  • Russian Academy of Sciences - IITP (Moscow, Russia)
    • Egor Ershov
    • Sergey Korchagin
    • Artyom Panshin
    • Arseniy Terekhin
    • Ekaterina Zaychenkova
    • Georgiy Lobarev
    • Aleksei Khalin
    • Vsevolod Plokhotnyuk
    • Denis Abramov
    • Elisey Zhdanov
    • Sofia Dorogova

 

  • Gideon Brothers (Croatia)
    • Nikola Banić

     

  • University of Würzburg (Germany)
    • Radu Timofte
    • Georgii Perevozchikov
