Good afternoon,
I was trying to extract the COCO metadata with the code suggested in the Colab tutorial (https://github.com/styvesg/nsd_gnet8x/blob/main/data_preparation.ipynb).
In that code, some .mat and .hdf5 files are loaded; however, these files are not present in the provided challenge data.
It would be easy to download them from the NSD dataset, but I'm not sure whether to proceed: the challenge rules say that getting access to the holdout data will result in disqualification, and I'm not sure whether the NSD dataset includes it.
Thanks in advance!
Posted by: Riccardo-Chimisso @ April 24, 2023, 3:13 p.m.

Hi,
I'm also working on finding the mapping from NSD IDs to COCO image IDs. After checking the website referenced by the example notebook, I think the link below may be helpful for both of us to get the metadata:
https://cvnlab.slite.page/p/dC~rBTjqjb/How-to-get-the-data
Best,
Catosine
Posted by: catosine @ April 25, 2023, 8:42 a.m.

Hi,
Thank you for your reply. However, my doubt was whether downloading data directly from NSD would result in disqualification because it contains the holdout set of the challenge, or whether it is allowed to access the NSD dataset and download whichever files we need.
Thanks!
Posted by: Riccardo-Chimisso @ April 25, 2023, 9:39 a.m.

Hi Riccardo-Chimisso!
That's a good point. I actually hadn't thought about that. My plan is to use COCO's annotations (segmentation masks, categories, etc.) as additional training data. I think these annotations are safe to use, since no test fMRI data are involved and the organizers allow using any data you can find to train your model.
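For instance, once you have the cocoId for an image, pycocotools can pull its annotations. A rough sketch, assuming the COCO 2017 annotation files have been downloaded locally and using a made-up cocoId:

from pycocotools.coco import COCO

# Assumes annotations/instances_train2017.json was downloaded from the COCO
# website; use instances_val2017.json instead for images from the val split.
coco = COCO("annotations/instances_train2017.json")

coco_id = 262145  # hypothetical cocoId; in practice taken from the NSD metadata
ann_ids = coco.getAnnIds(imgIds=[coco_id])
anns = coco.loadAnns(ann_ids)

# Each annotation carries a category label and a segmentation mask.
for ann in anns:
    category = coco.loadCats([ann["category_id"]])[0]["name"]
    mask = coco.annToMask(ann)  # binary mask, same height/width as the image
    print(category, mask.sum())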
Best,
Catosine
Posted by: catosine @ April 25, 2023, 11:09 a.m.

Hi everyone,
The notebook for extracting the COCO metadata uses the 73k images contained in the .hdf5 file from the NSD release. We do not provide this file as part of the Algonauts Challenge data. However, the file names of the Challenge images that we provide include an index that you can use to map back to the 73k images in the .hdf5 file of the NSD release (and from there to the COCO metadata using the notebook). Therefore, if you are interested in using the COCO metadata to train your models, you will first have to download the .hdf5 images file from the NSD data release. For more information on the indices in the Challenge image file names, please see the README.txt accompanying the Algonauts Challenge data.
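For illustration, here is a minimal sketch of this mapping in Python. It uses a hypothetical Challenge image file name (the exact naming pattern is documented in the README.txt) and assumes you have downloaded nsd_stimuli.hdf5 and the nsd_stim_info_merged.csv metadata file from the NSD release, whose rows are ordered by the 73k image index:

import re
import h5py
import pandas as pd

# Hypothetical Challenge image file name embedding the 73k index.
fname = "train-0001_nsd-00013.png"
nsd_idx = int(re.search(r"nsd-(\d+)", fname).group(1))

# Load the corresponding image from the NSD stimuli file.
with h5py.File("nsd_stimuli.hdf5", "r") as f:
    img = f["imgBrick"][nsd_idx]  # 425 x 425 x 3 uint8 array

# Look up the COCO metadata for that image in the NSD stimulus info table.
stim_info = pd.read_csv("nsd_stim_info_merged.csv")
row = stim_info.iloc[nsd_idx]
print(nsd_idx, row["cocoId"], row["cocoSplit"])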
Downloading and using any data from the NSD release will NOT result in disqualification, as the Algonauts Challenge test split comes entirely from the NSD partition that is withheld and therefore not available to the public (i.e., the last 3 scanning sessions of each subject). Please refer to the website and the challenge paper for the actions that would result in disqualification from the Algonauts Challenge.
The Algonauts Challenge Organizers
Posted by: giffordale95 @ April 26, 2023, 7:31 a.m.

Hi Algonauts Team!
Thank you for the clarification! I'm very glad to see that the NSD database is okay to use.
Best,
Catosine
Posted by: catosine @ April 26, 2023, 8:03 a.m.

Hi Challenge Organizers,
Thank you for clearing up my doubt.
I tried downloading the following files:
1. nsddata_stimuli/stimuli/nsd/nsd_stimuli.hdf5
2. nsddata_stimuli/stimuli/nsdsynthetic/nsdsynthetic_stimuli.hdf5
3. nsddata/experiments/nsd/nsd_expdesign.mat
4. nsddata/experiments/nsdsynthetic/nsdsynthetic_expdesign.mat
But I get access denied for #2 and #4. Actually, I get the same error when I try to download anything under nsddata_stimuli/stimuli/nsdsynthetic/ or nsddata/experiments/nsdsynthetic/.
Files #1 and #3 I can download without problems.
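In case it helps others, here is how I'm loading #1 and #3 (a quick sketch; the variable names are the ones I found in the NSD data manual, and note that the indices stored in nsd_expdesign.mat are 1-based MATLAB indices):

import h5py
from scipy.io import loadmat

# File #1: the 73k stimuli, stored in the 'imgBrick' dataset.
with h5py.File("nsd_stimuli.hdf5", "r") as f:
    print(f["imgBrick"].shape)  # (73000, 425, 425, 3)

# File #3: the experiment design. 'subjectim' maps each subject's 10k image
# slots to 1-based indices into the 73k images; 'masterordering' gives the
# trial sequence over those slots.
design = loadmat("nsd_expdesign.mat")
print(design["subjectim"].shape, design["masterordering"].shape)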
Thanks also to Catosine for his/her contribution to this thread.
Best,
Riccardo
Hi Riccardo,
You get those errors when downloading #2 and #4 because NSDsynthetic is a part of NSD that has not yet been publicly released (you can read more about it in the NSD paper or data manual). Furthermore, the Algonauts Challenge doesn't use NSDsynthetic data in either the train or the test split, so you can simply ignore it.
The Algonauts Challenge Organizers
Posted by: giffordale95 @ April 26, 2023, 3:01 p.m.

Hi Challenge Organizers,
Thank you again!
Best,
Riccardo