Woodscape Fisheye Object Detection Challenge for Autonomous Driving | CVPR 2022 OmniCV Workshop

Organized by saravanabalagi
Reward: 1,000 EUR


Welcome!


Welcome to the Woodscape 2D Object Detection Challenge 2022, run in conjunction with the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) OmniCV 2022 Workshop. Full details on both the workshop and the competition are available on the OmniCV 2022 webpage.

Description of Challenge

Woodscape is a multi-task, multi-camera fisheye dataset for autonomous driving. The dataset provides ground-truth labels for nine different tasks, including 2D object detection labels for 5 classes: vehicles, person, bicycle, traffic_light, traffic_sign. The objective of this challenge is to generate 2D bounding boxes for fisheye images and, in particular, to advance the state of the art and benchmark techniques for 2D object detection on fisheye images.

Data can be downloaded from the Participate > Get Data section, and information regarding the submission format and evaluation methods is provided in the Submission and Evaluation sections, respectively.

Prize

  • 1,000 EUR reward through sponsorship from Lero to the team/individual at the top of the leaderboard. 

  • The winning individual/team is expected to present their technical solution in a speaking slot at the OmniCV workshop event. There is no associated paper or poster required.

  • The prize will be awarded through a single payment to the Team Lead. Distribution of the prize amongst team members is the responsibility of the Team Lead.

  • In the case of a tie, the prize will be split.

  • The final award remains at the discretion of the organizing committee. 

  • The final leaderboard will be published on the competition website. We encourage participants to share details of their solutions by sharing links to associated publications and code.

Competition Results

The competition ended on June 5, 2022, and we congratulate all participants on their efforts. The top 3 teams, GroundTruth, heboyong, and IPIU-XDU, were announced, and the winning team, GroundTruth, presented their method at the CVPR OmniCV 2022 Workshop on June 20, 2022.

We have added the post-challenge paper on arXiv providing a summary of the competition. If you publish the technique you used in the challenge, we encourage you to cite the competition using:

@misc{ramachandran2022woodscape,
 author = {Ramachandran, Saravanabalagi and Sistu, Ganesh and Kumar, Varun Ravi and McDonald, John and Yogamani, Senthil},
 title = {{Woodscape Fisheye Object Detection for Autonomous Driving -- CVPR 2022 OmniCV Workshop Challenge}},
 url = {https://arxiv.org/abs/2206.12912},
 publisher = {arXiv},
 year = {2022},
 eprint = {2206.12912},
 archivePrefix = {arXiv},
 primaryClass = {cs.CV},
 copyright = {arXiv.org perpetual, non-exclusive license}
}

We have started accepting submissions again, keeping the challenge open to everyone without any deadline, to encourage research on fisheye images and to advance the state of the art.


Forum and Contact

We have set up a forum for public queries and discussion; for private queries, please send an email to saravanabalagi [at] gmail [dot] com

Evaluation

Competition entries will be evaluated and ranked based on the mean Average Precision (mAP) score, calculated by taking the arithmetic mean of the Average Precision (AP) scores of all 5 classes. The leaderboard will display the mAP value as the score, along with the individual class AP scores.
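The official evaluation kit defines the exact computation; purely as an illustration, here is a minimal sketch of a VOC-style all-point-interpolated AP and the resulting mAP. The IoU matching threshold, the AP interpolation variant, and the function names below are assumptions, not taken from the kit.

```python
import numpy as np

CLASSES = ["vehicles", "person", "bicycle", "traffic_light", "traffic_sign"]

def average_precision(scores, is_tp, num_gt):
    """All-point interpolated AP for a single class.

    scores : per-detection confidence values (any order)
    is_tp  : same-length booleans; True where the detection matched a
             previously unmatched ground-truth box (e.g. IoU >= 0.5,
             a threshold assumed here, not taken from the kit)
    num_gt : number of ground-truth boxes of this class
    """
    order = np.argsort(-np.asarray(scores, dtype=float))  # high confidence first
    tp = np.asarray(is_tp, dtype=float)[order]
    tp_cum = np.cumsum(tp)
    fp_cum = np.cumsum(1.0 - tp)
    recall = tp_cum / max(num_gt, 1)
    precision = tp_cum / np.maximum(tp_cum + fp_cum, 1e-9)
    # Precision envelope: make the curve monotonically non-increasing.
    precision = np.maximum.accumulate(precision[::-1])[::-1]
    r = np.concatenate(([0.0], recall))
    p = np.concatenate(([0.0], precision))
    return float(np.sum((r[1:] - r[:-1]) * p[1:]))  # area under the PR curve

def mean_average_precision(ap_per_class):
    """mAP: arithmetic mean of the AP scores of all 5 classes."""
    return sum(ap_per_class[c] for c in CLASSES) / len(CLASSES)
```

For example, `mean_average_precision({"vehicles": 0.60, "person": 0.55, "bicycle": 0.45, "traffic_light": 0.50, "traffic_sign": 0.45})` returns 0.51, the leaderboard score.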

Note that evaluation for Phase 1 will be done on a subset of 1,000 images to prevent participants from overfitting their models on the test data. The subset is the same for all participants; however, the subset will change every week. Phase 2 will be evaluated on all of the test images.

Participants shall use the provided evaluation kit to generate and check the score and other metrics on their validation data. Instructions for setting up the environment, placing the data, and running the evaluation are included in the kit.

The evaluation kit is available here.

Terms and Conditions

  • Individuals and teams (unlimited size) can enter the competition. 

  • Limit of 10 submissions per day and 100 submissions in total per person/team. 

  • Freely and publicly available external data is allowed. This includes pre-trained models.

  • There are no limits on training time or network capacity.

  • No Valeo employees may take part in the challenge.

  • Employees of third party companies, universities or institutions that contributed to the creation or have access to the full WoodScape dataset shall NOT take part in the challenge.

  • The associated WoodScape terms of use continue to apply for the data usage within the challenge.

Submission

  • Object detection bounding boxes should be saved as TXT files, with each line containing class_name,class_index,x1,y1,x2,y2 for one bounding box, where class_name and class_index are taken from vehicles, person, bicycle, traffic_light, traffic_sign, and x1,y1 and x2,y2 are the top-left and bottom-right pixel coordinates respectively. If your detector also produces confidence scores for the bounding boxes, sort the bounding boxes for each image in descending order of confidence, but do not add the confidence scores to the TXT file.
  • To complete a submission, competitors should generate and submit a single zip file containing an `object_detection` directory. This task directory should contain output `.txt` files for all the test files, and each output file should have the same name as the corresponding Test RGB image (a minimal sketch of writing and packaging these files follows the structure below).
  • Partial submissions will result in an error and will not be evaluated.
  • Please check the output log (available at Participate > Submit / View Files) for evaluation logs such as additional metrics, errors, and warnings.

Zip file Structure

  • object_detection
    • 00001.txt
    • 00002.txt
    • 00003.txt
    • ...
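Purely as an illustration, the sketch below writes one confidence-sorted TXT per test image and packages them into the zip layout shown above. It assumes class_index is the 0-based position of class_name in the class list, and the function names and the detections layout are hypothetical, not part of the official tooling.

```python
import zipfile
from pathlib import Path

CLASSES = ["vehicles", "person", "bicycle", "traffic_light", "traffic_sign"]

def write_submission(detections, out_dir="object_detection"):
    """detections maps an image file name to a list of
    (class_index, confidence, x1, y1, x2, y2) tuples (layout assumed)."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for image_name, boxes in detections.items():
        # Descending confidence order; the score itself is not written.
        boxes = sorted(boxes, key=lambda b: b[1], reverse=True)
        lines = [
            f"{CLASSES[ci]},{ci},{x1},{y1},{x2},{y2}"
            for ci, _conf, x1, y1, x2, y2 in boxes
        ]
        # One TXT per test image, named after the RGB image.
        (out / f"{Path(image_name).stem}.txt").write_text("\n".join(lines))

def zip_submission(out_dir="object_detection", zip_name="submission.zip"):
    """Package the task directory into the single zip expected above."""
    with zipfile.ZipFile(zip_name, "w", zipfile.ZIP_DEFLATED) as zf:
        for txt in sorted(Path(out_dir).glob("*.txt")):
            zf.write(txt, arcname=f"{out_dir}/{txt.name}")
```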

Dev Phase

Start: April 15, 2022, midnight UTC

Test Phase

Start: June 4, 2022, 12:01 a.m. UTC

Competition Ends

June 5, 2022, 11:59 p.m. UTC

Leaderboard

  #  Username     Score (mAP)
  1  GroundTruth  0.51
  2  heboyong     0.50
  3  IPIU-XDU     0.49