There is no response when uploading the inference results, and the upload keeps failing.
I think it is done now. It seems the server is using the CPU instead of the GPU, which requires further inspection.
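One quick way to inspect this, assuming the server runs PyTorch (an assumption; adapt for other frameworks), is a small check of whether CUDA is actually visible to the process:

```python
def describe_inference_device():
    """Report whether inference would run on the GPU or fall back to the CPU."""
    try:
        import torch  # assumed framework; may not be what the server uses
    except ImportError:
        return "torch not installed: PyTorch inference cannot use the GPU"
    if torch.cuda.is_available():
        # CUDA is visible; report the first device's name
        return "GPU: " + torch.cuda.get_device_name(0)
    return "CPU fallback: CUDA is not available to PyTorch"

if __name__ == "__main__":
    print(describe_inference_device())
```

If this reports a CPU fallback on the server, the usual suspects are a CPU-only PyTorch build, a driver/CUDA version mismatch, or `CUDA_VISIBLE_DEVICES` being unset or empty for the service process.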