I am submitting a zip file containing a JSON file, but the submission fails with this error:
WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
/opt/conda/lib/python3.7/site-packages/mmcv/__init__.py:21: UserWarning: On January 1, 2023, MMCV will release v2.0.0, in which it will remove components related to the training process and add a data transformation module. In addition, it will rename the package names mmcv to mmcv-lite and mmcv-full to mmcv. See https://github.com/open-mmlab/mmcv/blob/master/docs/en/compatibility.md for more details.
'On January 1, 2023, MMCV will release v2.0.0, in which it will remove '
Traceback (most recent call last):
  File "/tmp/codalab/tmpCV6mvS/run/program/demo_eval.py", line 57, in <module>
    evaluate_track_1(gt_file_fine, segm_json_file, output_file)
  File "/tmp/codalab/tmpCV6mvS/run/program/demo_eval.py", line 43, in evaluate_track_1
    results_summary_fine, results_classwise_fine = evaluate_all(gt_file_fine, results_file, CLASSES_track_1_fine)
  File "/tmp/codalab/tmpCV6mvS/run/program/demo_eval.py", line 39, in evaluate_all
    metrics, None, True)
  File "/tmp/codalab/tmpCV6mvS/run/program/coco.py", line 437, in evaluate_det_segm
    predictions = mmcv.load(result_files[metric])
  File "/opt/conda/lib/python3.7/site-packages/mmcv/fileio/io.py", line 64, in load
    with StringIO(file_client.get_text(file)) as f:
  File "/opt/conda/lib/python3.7/site-packages/mmcv/fileio/file_client.py", line 1027, in get_text
    return self.client.get_text(filepath, encoding)
  File "/opt/conda/lib/python3.7/site-packages/mmcv/fileio/file_client.py", line 552, in get_text
    with open(filepath, encoding=encoding) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/codalab/tmpCV6mvS/run/input/res/results.json'
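The traceback shows the scorer calling mmcv.load on a results.json it cannot find. Before zipping, it can help to confirm locally that the file exists and parses as JSON. A minimal sketch (the placeholder file and its empty contents are hypothetical; point it at your real results.json):

```python
import json
from pathlib import Path

# Hypothetical placeholder so this sketch runs end to end;
# in practice, skip this line and use your actual results.json.
Path("results.json").write_text("[]")

results_path = Path("results.json")  # filename the scorer expects
assert results_path.is_file(), f"{results_path} not found"

with results_path.open() as f:
    predictions = json.load(f)  # raises json.JSONDecodeError if the file is malformed

print(f"{results_path} parsed successfully: {len(predictions)} entries")
```

This only checks that the file is well-formed JSON, not that its contents match the competition's expected annotation schema.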
@FaryalJaved Hello, we checked the zip file you submitted. The submission fails because unzipping it produces a folder rather than results.json at the top level of the archive. Please check your compression method and submit again.
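In other words, results.json must sit at the root of the zip, not inside a subfolder. A sketch of the difference (the folder name my_results_folder and the placeholder contents are hypothetical):

```shell
# Set up a hypothetical results folder for demonstration.
mkdir -p my_results_folder
echo '[]' > my_results_folder/results.json  # placeholder predictions

# Wrong: zipping the folder itself nests the file as my_results_folder/results.json.
# zip -r submission.zip my_results_folder

# Right: zip from *inside* the folder so results.json lands at the archive root.
(cd my_results_folder && zip ../submission.zip results.json)

# Verify the layout: the listing should show "results.json" with no folder prefix.
unzip -l submission.zip
```

Equivalently, `zip -j submission.zip my_results_folder/results.json` junks the directory path while archiving.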
Posted by: dfc2023.iecas @ Feb. 24, 2023, 6:35 a.m.