2022 NICO Common Context Generalization Challenge (ECCV 2022 Workshop) Forum


> Is it acceptable to use multi-model ensemble?

Hello, I am a team member participating in this competition. I noticed that the competition rules state that "the model should be able to be trained with 8 TITAN X GPUs with 12288M memory", which raises a few questions:
How are the training resources of an ensemble counted? If multiple models are each trained within 8 GPUs, does combining them into an ensemble at prediction time still meet the training requirements?
Is multi-model weight averaging allowed? Similarly, does averaging the parameters of multiple models still meet the computing-resource requirements?

Posted by: megvii_is_fp @ June 2, 2022, 2:29 a.m.

Multi-model ensembling is allowed as long as the whole model can be trained with 8 GPUs, since only an 8-GPU server will be available to each team for training the submitted model in phase 2. If your models can be trained one after another on the 8 GPUs, or all together on the 8 GPUs, this meets the rule.
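For reference, the weight averaging asked about above can be sketched as follows. This is a minimal illustration, not code from the competition: models are represented here as plain dicts mapping parameter names to lists of floats, whereas with a framework like PyTorch you would average the tensors in each model's state_dict in the same element-wise way.

```python
def average_weights(state_dicts):
    """Element-wise average of parameter dicts with identical keys and shapes."""
    if not state_dicts:
        raise ValueError("need at least one model")
    averaged = {}
    for key in state_dicts[0]:
        # Collect this parameter from every model and average element-wise.
        params = [sd[key] for sd in state_dicts]
        averaged[key] = [sum(vals) / len(vals) for vals in zip(*params)]
    return averaged

# Example: three hypothetical "models" sharing one 2-parameter layer.
models = [
    {"layer.weight": [1.0, 2.0]},
    {"layer.weight": [3.0, 4.0]},
    {"layer.weight": [5.0, 6.0]},
]
avg = average_weights(models)
# avg["layer.weight"] -> [3.0, 4.0]
```

Since each source model is trained independently before averaging, this fits the "trained one after another on the 8 GPUs" case described above.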

Posted by: NICO-official @ June 3, 2022, 5 p.m.