The 2024 IEEE International Conference on Image Processing (ICIP 2024) is scheduled to be held in Abu Dhabi from October 27th to 30th. The conference will offer a comprehensive program focusing on image and video processing, as well as computer vision.
Omnidirectional visual content, commonly referred to as 360-degree images and videos, has garnered significant interest in both academia and industry, establishing itself as the primary media modality for VR/XR applications. 360-degree videos offer numerous features and advantages, allowing users to view scenes in all directions and providing an immersive quality of experience with up to three degrees of freedom (3DoF). When deployed on embedded devices with remote control, 360-degree videos offer additional degrees of freedom, enabling movement within the space (6DoF). However, 360-degree videos come with specific requirements, such as high-resolution content of up to 16K to ensure a high-quality representation of the scene. Moreover, limited bandwidth in wireless communication, especially under mobility conditions, imposes strict constraints on the available throughput needed to prevent packet loss and maintain low end-to-end latency. Adaptive resolution and efficient compression of 360-degree video content can address these challenges by adapting to the available throughput while maintaining high video quality at the decoder. Nevertheless, downscaling and coding the original content before transmission introduce visible distortions and a loss of detail that cannot be recovered at the decoder side. In this context, machine learning techniques have demonstrated outstanding performance in alleviating coding artifacts and recovering lost details, particularly for 2D video. Compared to 2D video, 360-degree video additionally suffers from lower angular resolution, requiring augmentation of both the resolution and the quality of the video. This challenge presents an opportunity for the scientific research and industrial communities to propose solutions for quality enhancement and super-resolution of 360-degree videos.
The objective of this challenge is to reconstruct high-resolution (HR) 360-degree videos from low-resolution (LR) compressed versions. In the model development phase, the training set and the LR-only validation set will be provided. Participants can train their models using the training set and subsequently assess them on the validation set by submitting their enhanced, super-resolved results to the validation server, along with the runtime. During the test phase, both the validation set with ground truth and an LR-only test set will be accessible. At the end of the test phase, participants are required to submit a Docker file (for which a public tutorial will be provided) and their code script to the organizers for fair evaluation.
The top three models in this track will be eligible for awards. The prizes for the top three models are as follows:
First place awards: Certificate & $1,000
Second place awards: Certificate & $500
Third place awards: Certificate
Judges will prioritize novelty in the solutions.
Contributors to the challenge are encouraged to submit a challenge paper to ICIP 2024 by the deadline of March 27, 2024.
We are thrilled to announce the winners of the ICIP 2024 360-degree Video Super-Resolution Challenge. Congratulations to the top performers.
We extend our heartfelt thanks to all participants for their hard work and contributions.
Please note that Track 2 of our challenge is still ongoing and will conclude on April 28, 2024.
Ahmed Telili @ TII
Ibrahim Farhat @ TII
Wassim Hamidouche @ TII
Hadi Amirpour @AAU
More information about challenge organizers is available here: https://www.icip24-video360sr.ae/home
We are proud to announce that the Technology Innovation Institute (TII) is the sponsor of this challenge. TII is a leading global research center dedicated to pushing the frontiers of knowledge. Our teams of scientists, researchers and engineers work in an open, flexible and agile environment to deliver discovery science and transformative technologies. TII is part of the Abu Dhabi Government’s Advanced Technology Research Council (ATRC), which oversees technology research in the emirate.

The evaluation of submitted solutions involves comparing their 2× super-resolved frames with the corresponding ground-truth frames and reporting the inference time of the SR model. The primary goal is to upscale low-resolution videos to high resolution, prioritizing high quality while ensuring minimal runtime for live-broadcast scenarios.
For video quality assessment, we primarily utilize the Weighted-to-Spherically-uniform Peak Signal-to-Noise Ratio (WS-PSNR) while also reporting the Weighted-to-Spherically-uniform Structural Similarity Index (WS-SSIM). WS-PSNR serves as our main metric, offering a comprehensive measure of the super-resolved frame quality relative to the original high-resolution frames. The average WS-PSNR and WS-SSIM values are computed across all processed frames.
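For reference, a minimal WS-PSNR sketch for equirectangular frames using the standard per-row cosine latitude weighting (the organizers' exact implementation may differ in details such as color handling):

```python
import numpy as np

def ws_psnr(ref: np.ndarray, dist: np.ndarray, max_val: float = 255.0) -> float:
    """WS-PSNR between two equirectangular frames (H x W [x C]).

    Each pixel row j is weighted by the cosine of its latitude, so the
    over-sampled polar regions of the projection count less.
    """
    h = ref.shape[0]
    j = np.arange(h)
    w = np.cos((j + 0.5 - h / 2) * np.pi / h)       # per-row weights
    w = np.broadcast_to(w[:, None], ref.shape[:2])
    if ref.ndim == 3:                               # extend to channel axis
        w = w[..., None]
    err = (ref.astype(np.float64) - dist.astype(np.float64)) ** 2
    wmse = np.sum(w * err) / np.sum(np.broadcast_to(w, err.shape))
    return 10.0 * np.log10(max_val ** 2 / wmse)
```

Note that identical frames yield a weighted MSE of zero, so a practical implementation should guard against division by zero.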
The final score combines quality and runtime. It is computed as follows:
The quality score is calculated from the WS-PSNR results and normalized against the bicubic baseline. We define two fixed values, WS-PSNRBicubic and WS-PSNRmax, which establish the lower and upper limits for model performance, respectively. Using these values, we penalize models that produce WS-PSNR values below the bicubic baseline, while emphasizing those that exceed the defined maximum.
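The normalization formula itself is not reproduced on this page; a minimal sketch consistent with the description above, assuming a simple linear mapping between the two fixed limits (the function name and the linear form are illustrative, not the official definition):

```python
def quality_score(ws_psnr: float, ws_psnr_bicubic: float, ws_psnr_max: float) -> float:
    """Map a WS-PSNR value to a score: 0 at the bicubic baseline, 1 at the cap.

    Results below the baseline go negative (a penalty); results above
    the cap exceed 1, emphasizing models that beat the defined maximum.
    """
    return (ws_psnr - ws_psnr_bicubic) / (ws_psnr_max - ws_psnr_bicubic)
```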
Runtime score calculation:
The runtime evaluation metric assigns a full score to models that achieve a processing time of 0.016 seconds or less per 2K frame, as this speed enables a smooth 60 frames per second (fps), essential for high-quality live streaming. To reinforce this standard, we apply penalties in our evaluation criteria for runtimes that exceed 0.016 seconds. The slower the model, the larger the penalties.
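Only the 0.016 s per 2K frame target (the 60 fps budget) and the direction of the penalty are stated; a sketch under the assumption of an inverse-proportional decay (the decay shape is a guess, not the official rule):

```python
def runtime_score(t_seconds: float, t_target: float = 0.016) -> float:
    """Full score at or below the 60 fps frame budget; slower models
    are penalized increasingly (inverse-proportional decay, illustrative)."""
    if t_seconds <= t_target:
        return 1.0
    return t_target / t_seconds   # twice as slow -> half the score
```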
The final score is calculated by assigning fixed coefficients to the quality and runtime scores and multiplying the weighted sum by 100.
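Putting the two parts together, with placeholder coefficients (the official weighting values are set by the organizers and are not stated on this page):

```python
def final_score(quality: float, runtime: float,
                alpha: float = 0.5, beta: float = 0.5) -> float:
    """Weighted combination of the quality and runtime scores, scaled by
    100. alpha and beta are placeholders, not the official coefficients."""
    return 100.0 * (alpha * quality + beta * runtime)
```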
Please note that during the development phase, we will base our evaluation on the runtime reported by participants. However, for the final test phase and rankings, the challenge organizers will independently verify and execute the provided solutions to compute the measurements.
Development phase:
For submitting the results, you need to follow these steps:
Organize your submission as follows:
1. A submission.zip archive containing 20 subfolders (each representing a video). Each subfolder should contain the 5 specified frames.
2. A readme.txt file in the root of the archive.
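For instance, the layout might look like the following (folder and frame names are illustrative; use the naming specified in the challenge instructions):

```
submission.zip
├── video_01/            # one subfolder per video (20 in total)
│   ├── frame_001.png    # the 5 specified frames
│   └── ...
├── ...
├── video_20/
└── readme.txt           # runtime, platform, extra-data flag
```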
The readme.txt file must include the following details: the runtime (in seconds), the platform used for running the model, and an indication (1 for yes, 0 for no) of whether extra data was used to train the model. Note that the validation set contains both 2K and 4K frames; however, the reported runtime should be for 2K frames. An 'Other' section can also be included for additional information or remarks. The format of the readme.txt should be as follows:
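A possible readme.txt along these lines (the exact field layout is not shown on this page; all values below are placeholders):

```
runtime per frame (seconds): 0.015
platform: NVIDIA RTX 4090, PyTorch
extra data [1 / 0]: 0
other: (additional remarks here)
```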
Test phase
Submit your code, your pretrained model, and a README with instructions for running it in a .zip file to ahmed.telili@tii.ae by 2nd March 2024. Testing on the test set will follow, with results and the winner announcement by 6th March.
You are eligible to register and compete in this contest only if you meet all the following requirements:
The financial sponsor of this contest is the Technology Innovation Institute (TII). There will be economic incentive prizes for the winners to boost contest participation; these prizes will not require participants to enter into an IP agreement with any of the sponsors, to disclose algorithms, or to deliver source code to them. Participants affiliated with the industry sponsors agree not to receive any sponsored money, product, or travel grant in case they are among the winners.
Incentive prizes, for the track #1 competition (ICIP challenge) only, are as follows:
1st place: Certificate & $1,000
2nd place: Certificate & $500
3rd place: Certificate
We accept multiple submissions from a single team if they are significantly different. However, solutions with minor differences can be rejected by the organizers. For example, if solution A and B are submitted where B is an ensemble method of A, we do not consider them to be essentially different.
If you are preparing multiple submissions, please contact the organizers in advance for clarification.
If there is any change to data, schedule, instructions of participation, or these rules, the registered participants will be notified at the email they provided with the registration.
If you are a potential winner, we will notify you by sending a message to the e-mail address listed on your final entry within seven days following the determination of winners. If the notification that we send is returned as undeliverable, or you are otherwise unreachable for any reason, we may award the prize to an alternate winner, unless forbidden by applicable law.
If you are a potential winner, we may require you to sign a declaration of eligibility, use, indemnity and liability/publicity release and applicable tax forms. If you are a potential winner and a minor in your place of residence, we require that your parent or legal guardian be designated as the winner, and we may require that they sign a declaration of eligibility, use, indemnity and liability/publicity release on your behalf. If you (or your parent/legal guardian, if applicable) do not sign and return these required forms within the time period listed in the winner notification message, we may disqualify you (or the designated parent/legal guardian) and select an alternate winner.
Other than what is set forth below, we are not claiming any ownership rights to your entry. However, by submitting your entry, you:
(a) Are granting us an irrevocable, worldwide right and license, in exchange for your opportunity to participate in the contest and potential prize awards, for the duration of the protection of the copyrights to:
(b) Agree to sign any necessary documentation that may be required for us and our designees to make use of the rights you granted above;
(c) Understand and acknowledge that sponsors and other entrants may have developed or commissioned materials similar or identical to your submission and you waive any claims you may have resulting from any similarities to your entry;
(d) Understand that we cannot control the incoming information you will disclose to our representatives or our co-sponsor's representatives in the course of entering, or what our representatives will remember about your entry. You also understand that we will not restrict work assignments of representatives or our co-sponsor's representatives who have had access to your entry. By entering this contest, you agree that use of information in our representatives' or our co-sponsor's representatives' unaided memories in the development or deployment of our products or services does not create liability for us under this agreement or copyright or trade secret law;
(e) Understand that you will not receive any compensation or credit for use of your entry, other than what is described in these official rules.
If you do not want to grant us these rights to your entry, please do not enter this contest.
(a) Follow the instructions on the challenge website to submit entries. The organizers may use a third party platform to submit and score entries ("submission platform").
(b) Unless otherwise specified on the website of the challenge, the participants will be registered as mutually exclusive teams. Each team may submit only one single final entry. We are not responsible for entries that we do not receive for any reason, or for entries that we receive but are not functioning properly.
(c) The participants are also subject to accepting the terms and conditions of the submission platform chosen to submit and score entries.
(d) The participants must follow the instructions. We will automatically disqualify incomplete or invalid entries.
The organizers will judge the entries; all judges will be forbidden to enter the contest and will be experts in machine learning, or a related field, or experts in challenge organization. A list of the judges will be made available upon request. The judges will review all eligible entries received and select the winners based upon the prediction score on test data. The judges will verify that the winners complied with the rules.
The decisions of these judges are final and binding. If we do not receive a sufficient number of entries meeting the entry requirements, we may, at our discretion based on the above criteria, not award any or all of the contest prizes below. In the event of a tie between any eligible entries, the tie will be broken by giving preference to the earliest submission, using the time stamp of the submission platform.
The contest is proudly sponsored by the Technology Innovation Institute (TII), a leading center for technological advancements based in Abu Dhabi, UAE.
During the development phase of the contest and when they submit their final entries, contest participants do not need to disclose their real identity, but must provide a valid email address where we can deliver notifications to them regarding the contest. To be eligible for prizes, however, contest participants will need to disclose their real identity to contest organizers, informing them by email of their name, professional affiliation, and address. To enter the contest, the participants will need to become users of the web-based submission platform. Any profile information stored on this platform can be viewed and edited by the users. After the contest, the participants may cancel their account with the submission platform and cease to be users of that platform. All personal information will then be destroyed. The submission platform privacy policy will apply to contest information submitted by participants on that platform.
Start: Feb. 1, 2024, midnight
Description: Development phase - train your solution and submit results on validation server. Please note that the submission process takes time. Do not refresh the website during this period.
End: March 9, 2024, 11 p.m.