I’m trying to process the HCP-Aging dataset, which I’ve converted to BIDS format after downloading it from NDA.
For my first attempt I tried the HCPPipelines BIDS app, which was recently updated to use HCP Pipelines version 4.1.3, the version being developed for processing the HCP Lifespan data. I built a Singularity image from the Docker container to run it on my HPC. However, when I run it I get this message:
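For reference, I built the image roughly like this (the tag is a placeholder; substitute whichever release you’re using):

```shell
# Build a Singularity image from the published Docker container.
# <tag> is a placeholder -- use the release that bundles HCP Pipelines 4.1.3.
singularity build hcppipelines.sif docker://bids/hcppipelines:<tag>
```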
bids-validator /cifs/hariri-long/Studies/HCP-Aging/BIDS/sourcedata
1: Files with such naming scheme are not part of BIDS specification. This error is most commonly caused by typos in file names that make them not BIDS compatible. Please consult the specification and make sure your files are named correctly. If this is not a file naming issue (for example when including files not yet covered by the BIDS specification) you should include a ".bidsignore" file in your dataset. Please note that derived (processed) data should be placed in /derivatives folder and source data (such as DICOMS or behavioural logs in proprietary formats) should be placed in the /sourcedata folder. (code: 1 - NOT_INCLUDED)
./sub-6005242/func/sub-6005242_task-carit_dir-PA_bold.json
./sub-6005242/func/sub-6005242_task-facename_dir-PA_bold.nii.gz
./sub-6005242/func/sub-6005242_task-vismotor_dir-PA_bold.nii.gz
[plus a jillion more files]
However, the dataset validates just fine when I run the standalone BIDS validator on it.
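The error message itself suggests a `.bidsignore` file as a workaround. I’m wary of that route here, since ignoring actual functional files would presumably hide them from the pipeline too, but for diagnosing whether a single file pattern is the culprit it would look something like this (patterns are just examples matching the files flagged above):

```
# .bidsignore at the dataset root; same glob syntax as .gitignore
sub-*/func/*_dir-PA_bold.json
sub-*/func/*_dir-PA_bold.nii.gz
```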
My second attempt has been to run the PreFreeSurfer.sh script manually within my Singularity container, passing all the arguments to it directly. With that I’m getting the error:
Spin echo fieldmap has different dimensions than scout image, this requires a manual fix
I’m passing it the fieldmap designated for the T1 (i.e., the one stored in the T1 folder of the raw download), so I haven’t yet been able to figure out what’s going on here. I’m still working on it; I don’t yet fully understand exactly which fieldmaps I need to pass to this script, or how.
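In the meantime, to see exactly how the dimensions differ, I’ve been comparing the image headers directly. A minimal sketch in plain Python, reading the NIfTI-1 header by hand so it needs neither nibabel nor FSL (the file names in the commented-out usage are hypothetical, and it assumes little-endian files):

```python
import gzip
import struct

def nifti_dims(path):
    """Return the (nx, ny, nz) voxel grid of a .nii or .nii.gz file."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rb") as f:
        header = f.read(348)  # the NIfTI-1 header is 348 bytes
    # dim[0..7] are eight little-endian int16s starting at byte offset 40;
    # dim[1:4] are the spatial dimensions.
    dim = struct.unpack_from("<8h", header, 40)
    return dim[1:4]

# Hypothetical file names -- substitute your own fieldmap and scout (SBRef):
# fmap_dims = nifti_dims("sub-6005242_dir-AP_epi.nii.gz")
# scout_dims = nifti_dims("sub-6005242_task-rest_dir-PA_sbref.nii.gz")
# if fmap_dims != scout_dims:
#     print("mismatch:", fmap_dims, "vs", scout_dims)
```

If the two grids really differ, that would at least tell me whether I’m handing the script the wrong fieldmap or whether the images genuinely don’t match.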
I think my biggest question is whether I’m missing any tools or documentation that are available for processing the HCP lifespan data, since I would expect that there’d be something out there by now!
Any guidance anyone has is much appreciated, thanks!!!