This is the page for submitting results to Track 2, Open Condition, of ASVspoof 5: the spoofing-robust automatic speaker verification (SASV) task. SASV systems must compare an unlabeled probe (test) utterance to the enrollment utterance(s) of a known target speaker. SASV systems developed by participants will be benchmarked using a mix of three trial types: target bonafide (target), non-target bonafide (non-target), and spoofing attack (spoof). SASV systems should accept target trials only. In the Open Condition, the use of external data and pre-trained models is permitted, provided that their use is declared and described clearly in the system description accompanying any submission. Please check the rules and data usage conditions of the Open Condition in the ASVspoof 5 evaluation plan.
You must be logged in to participate. The registration procedure is described in the ASVspoof 5 evaluation plan.
Submissions must be packaged and uploaded as a zip archive containing the score file, e.g.:
$ zip submission.zip score.tsv
Considering the possible scenarios for submitting CM, ASV, or End2End SASV system scores, we define two approaches and corresponding score formats. The score file has a five-column structure, <spk> <filename> <cm-score> <asv-score> <sasv-score>, with columns separated by a tab ('\t') delimiter.
Example score.tsv
Approach A: Participant End2End SASV system
spk      filename   cm-score  asv-score  sasv-score
E_2040   E_000001   -         -          0.001
E_2040   E_000002   -         -          0.002
...
Approach B: Participant SASV system by combined participant CM and ASV systems
spk      filename   cm-score  asv-score  sasv-score
E_2040   E_000001   0.001     0.003      0.005
E_2040   E_000002   0.002     0.004      0.006
...
Whichever approach is used, participants should submit a single five-column file. Each participant decides which approach suits their scenario and fills out the corresponding columns; fill the column(s) of any missing system(s) with a dash ('-').
For scores, please submit finite floating-point values only (i.e., no NaN or inf values).
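As an illustration, the following minimal Python sketch writes a score.tsv in the required format for Approach B. The trial list and score values are hypothetical placeholders; the dictionaries cm_scores, asv_scores and sasv_scores are assumed to hold your own systems' outputs, keyed by (speaker, filename).

# Hypothetical per-trial scores; replace with your systems' outputs.
trials = [("E_2040", "E_000001"), ("E_2040", "E_000002")]
cm_scores = {t: 0.001 for t in trials}
asv_scores = {t: 0.003 for t in trials}
sasv_scores = {t: 0.005 for t in trials}

with open("score.tsv", "w") as f:
    # Header row followed by one tab-separated line per trial.
    f.write("spk\tfilename\tcm-score\tasv-score\tsasv-score\n")
    for spk, fname in trials:
        # For Approach A, write '-' in the cm-score and asv-score columns instead.
        f.write(f"{spk}\t{fname}\t{cm_scores[(spk, fname)]}\t"
                f"{asv_scores[(spk, fname)]}\t{sasv_scores[(spk, fname)]}\n")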
You can find an example submission at: Participate → Files → Download: Public Data
It is a zip archive of score files produced by our baseline systems.
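Before uploading, a quick sanity check of the score file along the following lines may help catch formatting problems. This is a rough sketch, not an official validator; it only checks the column count and the score values.

import math

with open("score.tsv") as f:
    for i, line in enumerate(f):
        cols = line.rstrip("\n").split("\t")
        # Every line must have exactly five tab-separated columns.
        assert len(cols) == 5, f"line {i}: expected 5 columns, got {len(cols)}"
        if i == 0:
            continue  # header row
        for value in cols[2:]:
            if value == "-":
                continue  # allowed placeholder for a missing system
            x = float(value)  # raises ValueError if not a valid float
            assert math.isfinite(x), f"line {i}: NaN/inf scores are not allowed"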
Please be patient while uploading your submission zip file. CodaLab does not show a progress bar, and it may take a few minutes until the submission is uploaded.
After a successful submission, the status change from Submitted to Finished may take more than 5 minutes.
Please check the log files for warnings and error messages.
Please avoid submissions in the last minutes/hours of a phase.
Detailed results can be found by downloading the CodaLab outputs created during the scoring of your submission(s).
Participants should refer to the ASVspoof 5 challenge evaluation plan for further details regarding the participation rules.
Participants are tasked with the design of systems which must assign a real-valued detection score to each unlabeled audio segment. Track 2 evaluation metrics include the minimum agnostic detection cost function (min a-DCF), the minimum tandem detection cost function (min t-DCF), and the tandem equal error rate (t-EER).
The primary evaluation metric for Track 2 is the min a-DCF, which assigns a cost to a system, taking into account the miss rate and the two false alarm rates (for non-target and spoof trials, respectively).
Costs and priors for calculation of the min a-DCF will be specified prior to the release of evaluation data (see schedule in Section 10).
Note: when participants submit scores only for an End2End SASV system (Approach A), only the min a-DCF metric is computed. The min t-DCF and t-EER cannot be computed in this case, so default values of 100 will be shown on the leaderboard for those metrics.
Evaluation results consist of the evaluation metrics listed above. Note the class labeling (score polarity) assumptions: bonafide (human) speech should receive higher scores; spoofed speech should receive lower scores.
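For intuition only, the sketch below shows how a minimum a-DCF of this form can be computed from SASV scores and trial labels. The cost and prior values are illustrative placeholders, not the official ones (those are specified in the evaluation plan), and the official metric implementation should be used for any reported results.

import numpy as np

def min_a_dcf(scores, labels, c_miss=1.0, c_fa_non=10.0, c_fa_spf=10.0,
              p_tar=0.9, p_non=0.05, p_spf=0.05):
    # scores: SASV scores (higher means more likely a bonafide target trial).
    # labels: "target", "nontarget" or "spoof" for each trial.
    # Cost/prior defaults above are placeholders, not the official values.
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    tar = scores[labels == "target"]
    non = scores[labels == "nontarget"]
    spf = scores[labels == "spoof"]
    best = np.inf
    for t in np.unique(scores):
        p_miss = np.mean(tar < t)      # target trials rejected
        p_fa_non = np.mean(non >= t)   # non-target trials accepted
        p_fa_spf = np.mean(spf >= t)   # spoof trials accepted
        a_dcf = (c_miss * p_tar * p_miss
                 + c_fa_non * p_non * p_fa_non
                 + c_fa_spf * p_spf * p_fa_spf)
        best = min(best, a_dcf)
    return best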
Rules and schedule are described in the ASVspoof 5 evaluation plan.
Start: June 12, 2024, midnight
Description: During the progress phase, each team is allowed to make up to 4 submissions per day. Results determined from a subset of trials will be made available for each submission.
Start: July 21, 2024, noon
Description: During the evaluation phase, participants will be allowed only a single submission for which results will be determined from the remaining evaluation trials.
Start: July 24, 2024, noon
Description: No submissions possible.
Start: Sept. 5, 2024, midnight
Description: Same as the evaluation phase, but with up to 2 submissions per day.
Competition end: Dec. 31, 2024, 11:59 p.m.