360-Degree Video Super Resolution and Quality Enhancement Challenge - Track 1 (2x) @ ICIP 2024

Organized by Ahmed_Telili
Reward $1,500

First phase: Development
Start: Feb. 1, 2024, midnight UTC

Competition ends: March 9, 2024, 11 p.m. UTC

360-Degree Video Super Resolution and Quality Enhancement Challenge @ ICIP 2024

Real-Time Video Super-Resolution

Track #1: ICIP challenge (2x super resolution)


 

Important Dates

  • 2024.02.05 Release of train data (input and output videos) and validation data (inputs only)
  • 2024.02.05 Validation server online
  • 2024.02.21 Test phase beginning 
  • 2024.03.07 Code submission deadline
  • 2024.03.13 Final test results release to the participants and winner announcement
  • 2024.04.03 Challenge paper submission deadline
  • 2024.10.27 Workshop days, results and award ceremony (ICIP 2024, Abu Dhabi, UAE)

 

Challenge overview

The 2024 IEEE International Conference on Image Processing (ICIP 2024) is scheduled to be held in Abu Dhabi from October 27th to 30th. The conference will offer a comprehensive program focusing on image and video processing, as well as computer vision.

Omnidirectional visual content, commonly referred to as 360-degree images and videos, has garnered significant interest in both academia and industry, establishing itself as the primary media modality for VR/XR applications. 360-degree videos allow users to view a scene in all directions, providing an immersive quality of experience with up to three degrees of freedom (3DoF). When integrated into embedded devices with remote control, they offer additional degrees of freedom, enabling movement within the space (6DoF).

However, 360-degree videos come with specific requirements, such as high-resolution content of up to 16K to ensure a high-quality representation of the scene. Moreover, limited bandwidth in wireless communication, especially under mobility conditions, imposes strict constraints on the available throughput to prevent packet loss and maintain low end-to-end latency. Adaptive resolution and efficient compression of 360-degree video content can address these challenges by adapting to the available throughput while maintaining high video quality at the decoder. Nevertheless, downscaling and coding the original content before transmission introduce visible distortions and a loss of detail that cannot be recovered at the decoder side.

In this context, machine learning techniques have demonstrated outstanding performance in alleviating coding artifacts and recovering lost details, particularly for 2D video. Compared to 2D video, 360-degree video suffers from lower angular resolution, requiring augmentation of both the resolution and the quality of the video. This challenge offers the scientific research and industrial communities an opportunity to propose quality enhancement and super-resolution solutions for 360-degree videos.

The objective of this challenge is to reconstruct high-resolution (HR) 360-degree videos from low-resolution (LR) compressed versions. In the model development phase, the training set and the LR-only validation set will be provided. Participants can train their models using the training set and subsequently assess them on the validation set by submitting their enhanced, super-resolved results to the validation server, along with the runtime.  During the test phase, both the validation set with ground truth and an LR-only test set will be accessible. At the end of the test phase, participants are required to submit a Docker file (for which a public tutorial will be provided) and their code script to the organizers for fair evaluation.


 

Awards Summary

The top three models in this track will be eligible for awards. The prizes for the top three models are as follows:

First place award: Certificate & $1,000

Second place award: Certificate & $500

Third place award: Certificate

 

Judges will prioritize novelty in the solutions.


 

Paper Submission

Contributors to the challenge are encouraged to submit a challenge paper to ICIP 2024 by the deadline of March 27, 2024.


 

Winner Announcement

We are thrilled to announce the winners of the ICIP 2024 360-degree Video Super-Resolution Challenge. Congratulations to the top performers.

  1. First Place: try1try8
  2. Second Place: Wing 
  3. Third Place: Dz360

We extend our heartfelt thanks to all participants for their hard work and contributions.

Please note that Track 2 of our challenge is still ongoing and will conclude on April 28, 2024.


  

Provided Resources

  • Scripts: Along with the dataset, the organizers will provide scripts to assist participants in developing their models and replicating baseline results. However, participants are completely free to use their own code in place of our provided resources. For more information and access to these scripts, please visit the official GitHub repository: https://github.com/Omnidirectional-video-group/360VISTA
  • Contact: Use the forum page (highly recommended!) or contact the challenge organizers directly by email (ahmed.telili@tii.ae, brahim.farhat@tii.ae, wassim.hamidouche@tii.ae and adi.amirpour@aau.a) if you have any doubts or questions.

 

Organizers

Ahmed Telili @ TII

Ibrahim Farhat @ TII

Wassim Hamidouche @ TII

Hadi Amirpour @AAU

More information about challenge organizers is available here: https://www.icip24-video360sr.ae/home


 

Sponsors:

We are proud to announce that the Technology Innovation Institute (TII) is the sponsor of this challenge. TII is a leading global research center dedicated to pushing the frontiers of knowledge. Its teams of scientists, researchers and engineers work in an open, flexible and agile environment to deliver discovery science and transformative technologies. TII is part of the Abu Dhabi Government's Advanced Technology Research Council (ATRC), which oversees technology research in the emirate.

 



 

Evaluation Criteria

Overview

The evaluation of submitted solutions involves comparing their 2x super-resolved frames with the corresponding ground-truth frames and reporting the inference time of the SR model. The primary goal is to enhance low-resolution videos to high resolution, prioritizing quality while ensuring minimal runtime for live-broadcast scenarios.

Quality Metrics

For video quality assessment, we primarily use the Weighted-to-Spherically-uniform PSNR (WS-PSNR), while also reporting the Weighted-to-Spherically-uniform SSIM (WS-SSIM). Both metrics weight each pixel's error by the area it covers on the sphere, compensating for the oversampling of polar regions in the equirectangular projection. WS-PSNR serves as our main metric, offering a comprehensive measure of super-resolved frame quality relative to the original high-resolution frames. The average WS-PSNR and WS-SSIM values are computed across all processed frames.
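WS-PSNR weights each pixel's squared error by its spherical area in the equirectangular (ERP) projection, down-weighting the stretched polar rows. The sketch below illustrates the metric for single-channel frames; it is an illustration only, and the official evaluation scripts in the challenge GitHub repository are authoritative.

```python
import numpy as np

def erp_weights(height: int, width: int) -> np.ndarray:
    """Per-pixel spherical weights for an equirectangular (ERP) frame.

    Rows near the poles cover less area on the sphere and are
    down-weighted by the cosine of their latitude."""
    rows = np.arange(height)
    w = np.cos((rows + 0.5 - height / 2) * np.pi / height)
    return np.tile(w[:, None], (1, width))

def ws_psnr(ref: np.ndarray, dist: np.ndarray, max_val: float = 255.0) -> float:
    """WS-PSNR between a ground-truth frame and a super-resolved frame (H x W)."""
    ref = np.asarray(ref, dtype=np.float64)
    dist = np.asarray(dist, dtype=np.float64)
    w = erp_weights(*ref.shape)
    wmse = np.sum(w * (ref - dist) ** 2) / np.sum(w)  # weighted MSE
    if wmse == 0.0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / wmse)
```

For color frames, the per-channel WS-PSNR values (or the weighted MSE pooled across channels) are typically averaged; check the official scripts for the exact pooling used in this challenge.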

Scoring Formula

The final score combines quality and runtime. It is computed as follows:

  • Quality score calculation:

The quality score is calculated and normalized using the WS-PSNR results of the bicubic baseline. We define two fixed values, WS-PSNR_Bicubic and WS-PSNR_max, which establish the lower and upper limits for model performance, respectively. Using these values, we penalize models whose WS-PSNR falls below the bicubic baseline and reward those that exceed the defined maximum.

  • Runtime score calculation:

The runtime metric assigns a full score to models that process a 2K frame in 0.016 seconds or less (roughly 1/60 s), as this speed enables smooth 60 frames per second (fps) playback, essential for high-quality live streaming. To reinforce this standard, runtimes exceeding 0.016 seconds are penalized: the slower the model, the larger the penalty.

  • Final Score calculation:

The final score is obtained by weighting the quality and runtime scores with fixed coefficients and multiplying the result by 100.
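The exact scoring formula (shown as an image on the original challenge page) is not reproduced here, so the sketch below only illustrates the structure described above. The coefficients `alpha` and `beta`, the value of `ws_psnr_max`, and the penalty shape for slow runtimes are placeholders, not the official values.

```python
def quality_score(ws_psnr: float, ws_psnr_bicubic: float, ws_psnr_max: float) -> float:
    # Linear normalization between the bicubic baseline (score 0) and the
    # fixed upper limit (score 1). Values below the baseline go negative
    # (penalty); values above the maximum exceed 1 (reward).
    return (ws_psnr - ws_psnr_bicubic) / (ws_psnr_max - ws_psnr_bicubic)

def runtime_score(seconds_per_2k_frame: float, target: float = 1 / 60) -> float:
    # Full score at or below ~0.016 s per 2K frame (60 fps). The penalty
    # shape for slower models is a placeholder, not the official rule.
    if seconds_per_2k_frame <= target:
        return 1.0
    return target / seconds_per_2k_frame

def final_score(quality: float, runtime: float,
                alpha: float = 0.7, beta: float = 0.3) -> float:
    # alpha and beta are placeholder coefficients; the official values were
    # defined in the challenge formula, which is not reproduced on this page.
    return 100.0 * (alpha * quality + beta * runtime)
```

For example, a model reaching the midpoint between the bicubic baseline and the upper limit while meeting the 60 fps target would score `final_score(0.5, 1.0)` under these placeholder coefficients.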


Submission

Please note that during the development phase, we will base our evaluation on the runtime reported by participants. However, for the final test phase and rankings, the challenge organizers will independently verify and execute the provided solutions to compute the measurements.

Development phase:

For submitting the results, you need to follow these steps:

  • For each input frame, process it and save the output frame with the same file name as the input. For instance, if the input file is "081.png", the corresponding output file should also be named "081.png".
  • Ensure that output images are saved with lossless compression and contain four times as many pixels as the input images (twice the width and twice the height, for 2x super resolution).
  • During the development phase, to reduce data transfer time and upload size, only 5 frames per video will be evaluated. These should be every 20th frame, starting from "001.png" to "081.png". Therefore, please only submit the following frames: "001.png", "021.png", "041.png", "061.png", and "081.png".
  • Organize your submission as follows:

    • Create a ZIP archive named submission.zip
    • Inside this archive, create a folder named 1.
    • Within the 1 folder, include 20 subfolders (each representing a video). Each subfolder should contain the 5 specified frames.
    • Also, include a readme.txt file in the root of the 1 folder.

 

  submission.zip
  ├─ 1
  │  ├─ 001
  │  │  ├─ 001.png
  │  │  ├─ 021.png
  │  │  ├─ 041.png
  │  │  ├─ 061.png
  │  │  └─ 081.png
  │  ├─ 002
  │  │  ├─ 001.png
  │  │  ├─ 021.png
  │  │  ├─ 041.png
  │  │  ├─ 061.png
  │  │  └─ 081.png
  │  ├─ ...
  │  └─ readme.txt
  • The readme.txt file must include the following details: runtime (in seconds), the platform used for running the model, and an indication (1 for Yes, 0 for No) of whether extra data was used for training the model. Note that the validation set contains both 2K and 4K frames; however, the specified runtime should be for 2K frames. An 'Other' section can also be included for additional information or remarks. The format for the readme.txt should be as follows:

 

  Runtime: 0.102
  CPU/GPU: GPU
  Data: 0
  Other:

 

  • Please ensure all required information is accurately filled in, particularly the runtime for 2K frames. 
  • Additionally, a test submission example has been made publicly available for reference. Participants are encouraged to review this example to better understand the expected submission format and content. Link

Test phase

Submit your code, your pretrained model, and a README with instructions for running it in a .zip file to ahmed.telili@tii.ae by March 2, 2024. Testing on the test set will follow, with results and the winner announcement by March 6.

 

 


Terms and Conditions

 

  • Eligibility

You are eligible to register and compete in this contest only if you meet all the following requirements:

  1. you are an individual or a team of people willing to contribute to the open tasks, who accept to follow the rules of this contest;
  2. you are not a challenge organizer or an employee of TII;
  3. you are not involved in any part of the administration and execution of this contest; and
  4. you are not a first-degree relative, partner, or household member of an employee of TII or of a person involved in any part of the administration and execution of this contest.
  • Prizes and Awards

The financial sponsor of this contest is the Technology Innovation Institute (TII). There will be economic incentive prizes for the winners to boost contest participation; these prizes will not require participants to enter into an IP agreement with any of the sponsors, to disclose algorithms, or to deliver source code to them. Participants affiliated with the industry sponsors agree not to receive any sponsored money, product, or travel grant if they are among the winners.

Incentive prizes, available only for the Track #1 competition (ICIP challenge), are as follows:

1st place: Certificate & $1,000
2nd place: Certificate & $500
3rd place: Certificate

  • Multi-submission policy 

We accept multiple submissions from a single team if they are significantly different. However, solutions with minor differences can be rejected by the organizers. For example, if solution A and B are submitted where B is an ensemble method of A, we do not consider them to be essentially different.

If you are preparing multiple submissions, please contact the organizers in advance for clarification.

  • Notifications

If there is any change to data, schedule, instructions of participation, or these rules, the registered participants will be notified at the email they provided with the registration.
If you are a potential winner, we will notify you by sending a message to the e-mail address listed on your final entry within seven days following the determination of winners. If the notification that we send is returned as undeliverable, or you are otherwise unreachable for any reason, we may award the prize to an alternate winner, unless forbidden by applicable law.

If you are a potential winner, we may require you to sign a declaration of eligibility, use, indemnity and liability/publicity release and applicable tax forms. If you are a potential winner and are a minor in your place of residence, and we require that your parent or legal guardian will be designated as the winner, and we may require that they sign a declaration of eligibility, use, indemnity and liability/publicity release on your behalf. If you, (or your parent/legal guardian if applicable), do not sign and return these required forms within the time period listed on the winner notification message, we may disqualify you (or the designated parent/legal guardian) and select an alternate selected winner.

  • Potential use of entry

Other than what is set forth below, we are not claiming any ownership rights to your entry. However, by submitting your entry, you:
(a)    Are granting us an irrevocable, worldwide right and license, in exchange for your opportunity to participate in the contest and potential prize awards, for the duration of the protection of the copyrights to:

  1. Use, review, assess, test and otherwise analyze results submitted or produced by your code (if code submission is a requirement) and other material submitted by you in connection with this contest and any future research or contests sponsored by the organizers and other challenge sponsors; and
  2. Feature your entry and all its content in connection with the promotion of this contest in all media (now known or later developed);
  3. This license does not extend to methods, algorithms, source code used to generate your entry.

(b)    Agree to sign any necessary documentation that may be required for us and our designees to make use of the rights you granted above;
(c)    Understand and acknowledge that sponsors and other entrants may have developed or commissioned materials similar or identical to your submission and you waive any claims you may have resulting from any similarities to your entry;
(d)    Understand that we cannot control the incoming information you will disclose to our representatives or our co-sponsor's representatives in the course of entering, or what our representatives will remember about your entry. You also understand that we will not restrict work assignments of representatives or our co-sponsor's representatives who have had access to your entry. By entering this contest, you agree that use of information in our representatives' or our co-sponsor's  representatives unaided memories in the development or deployment of our products or services does not create liability for us under this agreement or copyright or trade secret law;
(e)    Understand that you will not receive any compensation or credit for use of your entry, other than what is described in these official rules.
If you do not want to grant us these rights to your entry, please do not enter this contest.

  • Submission of entries

(a)    Follow the instructions on the challenge website to submit entries. The organizers may use a third party platform to submit and score entries ("submission platform").
(b)    Unless otherwise specified on the website of the challenge, the participants will be registered as mutually exclusive teams. Each team may submit only one single final entry. We are not responsible for entries that we do not receive for any reason, or for entries that we receive but are not functioning properly.
(c)    The participants are also subject to accepting the terms and conditions of the submission platform chosen to submit and score entries.
(d)    The participants must follow the instructions. We will automatically disqualify incomplete or invalid entries.

  • Judging the entries

The organizers will judge the entries; all judges will be forbidden to enter the contest and will be experts in machine learning, a related field, or challenge organization. A list of the judges will be made available upon request. The judges will review all eligible entries received and select the winners based upon the prediction score on test data. The judges will verify that the winners complied with the rules.
The decisions of these judges are final and binding. If we do not receive a sufficient number of entries meeting the entry requirements, we may, at our discretion based on the above criteria, not award any or all of the contest prizes below.  In the event of a tie between any eligible entries, the tie will be broken by giving preference to the earliest submission, using the time stamp of the submission platform.

  • Sponsor

The contest is proudly sponsored by the Technology Innovation Institute (TII), a leading center for technological advancements based in Abu Dhabi, UAE.

  • Privacy

During the development phase of the contest and when they submit their final entries, contest participants do not need to disclose their real identity, but must provide a valid email address where we can deliver notifications to them regarding the contest. To be eligible for prizes, however, contest participants will need to disclose their real identity to contest organizers, informing them by email of their name, professional affiliation, and address. To enter the contest, the participants will need to become users of the web-based submission platform. Any profile information stored on this platform can be viewed and edited by the users. After the contest, the participants may cancel their account with the submission platform and cease to be users of that platform. All personal information will then be destroyed. The submission platform privacy policy will apply to contest information submitted by participants on that platform.

Development

Start: Feb. 1, 2024, midnight

Description: Development phase - train your solution and submit results on the validation server. Please note that the submission process takes time; do not refresh the website while your submission is uploading.
