Processing HCP-Aging dataset in BIDS format

Hi all,

I’m trying to process the HCP-Aging dataset, which I’ve converted to BIDS format after downloading it from NDA.

For my first attempt I tried using the HCPPipelines BIDS app, which was recently updated to use HCP pipeline version 4.1.3, the version being developed for processing the HCP lifespan data. I built a Singularity image from the Docker container to run on my HPC. However, when I run it I get this message:

bids-validator /cifs/hariri-long/Studies/HCP-Aging/BIDS/sourcedata
1: Files with such naming scheme are not part of BIDS specification. This error is most commonly caused by typos in file names that make them not BIDS compatible. Please consult the specification and make sure your files are named correctly. If this is not a file naming issue (for example when including files not yet covered by the BIDS specification) you should include a ".bidsignore" file in your dataset. Please note that derived (processed) data should be placed in /derivatives folder and source data (such as DICOMS or behavioural logs in proprietary formats) should be placed in the /sourcedata folder. (code: 1 - NOT_INCLUDED)

            ./sub-6005242/func/sub-6005242_task-carit_dir-PA_bold.json
            ./sub-6005242/func/sub-6005242_task-facename_dir-PA_bold.nii.gz
            ./sub-6005242/func/sub-6005242_task-vismotor_dir-PA_bold.nii.gz


[plus a jillion more files]

However, I am able to validate the dataset just fine when running the standalone bids-validator app.
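
For reference, this is roughly how I built the image and ran the standalone validator (the image tag and exact paths here are just how my setup looks):

    # build a Singularity image from the Docker Hub container
    singularity build hcppipelines.simg docker://bids/hcppipelines
    # the standalone validator passes on the BIDS root
    bids-validator /cifs/hariri-long/Studies/HCP-Aging/BIDS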

My second attempt has been to run the PreFreeSurferPipeline.sh script manually within my Singularity container, passing all the arguments to it. With that, I’m getting the error:

Spin echo fieldmap has different dimensions than scout image, this requires a manual fix

I’m passing it the fieldmap that’s designated for the T1 (i.e., it’s stored in the T1 folder in the raw download), so I haven’t yet been able to figure out what’s going on here, but I’m still working on it (I don’t yet fully understand exactly which fieldmaps I need to pass to this script, or how).
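
In case it’s useful for debugging, I’ve been comparing image grids with fslinfo (assuming FSL is on the path; the angle-bracket names are placeholders for my actual files):

    # compare matrix size and voxel dimensions of the two images
    fslinfo <spin_echo_fieldmap>.nii.gz
    fslinfo <scout_image>.nii.gz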

I think my biggest question is whether I’m missing any tools or documentation that are available for processing the HCP lifespan data, since I would expect that there’d be something out there by now!

Any guidance anyone has is much appreciated, thanks!!!

Hi @aknodt

Thank you for your message and welcome to NeuroStars! This image was recently updated (about an hour ago). Could you please try pulling the latest image and processing the dataset again?

Thank you,
Franklin

Thanks for your response and for your work on this, @franklin! Per Roeland’s advice on the HCPpipelines GitHub page, I have now built my Singularity image from rhancock/hcpbids. I get past the BIDS validation (yay!), but then pretty quickly hit this error:

Tue Jul 21 11:05:08 EDT 2020:FreeSurferPipeline.sh: Thresholding T1w image to eliminate negative voxel values
Tue Jul 21 11:05:08 EDT 2020:FreeSurferPipeline.sh: …This produces a new file named: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_zero_threshold.nii.gz
Image Exception : #63 :: No image files match: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore
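
Listing the output directory confirms that nothing matches the file the script is looking for (path copied from the error):

    ls /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore*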

I think this image was supposed to be generated by the pipeline, right? Will keep troubleshooting. Thanks!

Hi @aknodt,
The T1w_acpc_dc_restore image is supposed to be generated at the PreFreeSurfer stage, so there is likely an earlier error. Could you upload the complete output log from running the container?

Roeland

Ah, makes sense. I don’t see anything particularly suspect yet, but hopefully it’s obvious to you:

VERSION: v4.1.3
========================================
Tue Jul 21 11:05:04 EDT 2020:FreeSurferPipeline.sh: Showing recon-all.v6.hires version
Tue Jul 21 11:05:04 EDT 2020:FreeSurferPipeline.sh: /opt/HCP-Pipelines/FreeSurfer/custom/recon-all.v6.hires
freesurfer-Linux-centos7_x86_64-dev-20171219-d5d1e74
Tue Jul 21 11:05:04 EDT 2020:FreeSurferPipeline.sh: Showing tkregister version
/opt/freesurfer/tktools/tkregister
dev build (use --all-info flag for full version info)
Tue Jul 21 11:05:06 EDT 2020:FreeSurferPipeline.sh: Showing mri_concatenate_lta version
/opt/freesurfer/bin/mri_concatenate_lta
stable6
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Showing mri_surf2surf version
/opt/freesurfer/bin/mri_surf2surf
stable6
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Showing fslmaths location
/usr/local/fsl/bin/fslmaths
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: INFO: Determined that FreeSurfer full version string is: freesurfer-Linux-centos7_x86_64-dev-20171219-d5d1e74
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: INFO: Determined that FreeSurfer version is: d5d1e74
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Using named parameters
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Subject Directory: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Subject: sub-6005242
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: T1w Image: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore.nii.gz
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: T1w Brain: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_brain.nii.gz
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Include -conf2hires flag in recon-all: TRUE
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: ProcessingMode: HCPStyleData
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Starting main functionality
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Retrieve positional parameters
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: SubjectDIR: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: SubjectID: sub-6005242
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: T1wImage: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore.nii.gz
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: T1wImageBrain: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_brain.nii.gz
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: T2wImage: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T2w_acpc_dc_restore.nii.gz
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: recon_all_seed:
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: flair: FALSE
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: existing_subject: FALSE
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: extra_reconall_args:
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: conf2hires: TRUE
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: Figure out the number of cores to use.
Tue Jul 21 11:05:07 EDT 2020:FreeSurferPipeline.sh: num_cores: 1
Tue Jul 21 11:05:08 EDT 2020:FreeSurferPipeline.sh: Thresholding T1w image to eliminate negative voxel values
Tue Jul 21 11:05:08 EDT 2020:FreeSurferPipeline.sh: ...This produces a new file named: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_zero_threshold.nii.gz
Image Exception : #63 :: No image files match: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore
terminate called after throwing an instance of 'std::runtime_error'
  what():  No image files match: /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore
/opt/HCP-Pipelines/FreeSurfer/FreeSurferPipeline.sh: line 565: 62001 Aborted                 fslmaths ${T1wImage} -thr 0 ${zero_threshold_T1wImage}
Tue Jul 21 11:05:09 EDT 2020:FreeSurferPipeline.sh: While running '/opt/HCP-Pipelines/FreeSurfer/FreeSurferPipeline.sh --subject=sub-6005242 --subjectDIR=/work/long/HCP_MPP/HCP-A/sub-6005242/T1w --t1=/work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore.nii.gz --t1brain=/work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_brain.nii.gz --processing-mode=HCPStyleData --t2=/work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T2w_acpc_dc_restore.nii.gz':
Tue Jul 21 11:05:09 EDT 2020:FreeSurferPipeline.sh: ERROR: fslmaths command failed with return_code: 134
Tue Jul 21 11:05:09 EDT 2020:FreeSurferPipeline.sh: ERROR: fslmaths command failed with return_code: 134
Tue Jul 21 11:05:09 EDT 2020:FreeSurferPipeline.sh: ABORTING

Traceback (most recent call last):
  File "/run.py", line 402, in <module>
    stage_func()
  File "/run.py", line 103, in run_freesurfer
    "OMP_NUM_THREADS": str(args["n_cpus"])})
  File "/run.py", line 31, in run
    raise Exception("Non zero return code: %d"%process.returncode)
Exception: Non zero return code: 1

Thank you!!

Thanks! The log you provided starts from the FreeSurfer stage (the second processing stage), though, and the prior PreFreeSurfer stage is likely where the issue is. Do you have some earlier output messages, starting from the beginning of the messages generated by the container? If you didn’t get a chance to capture those messages, would you be able to run the container again with a new output directory (or after deleting the existing output) while redirecting the messages to a file? You can add > output.log 2> errors.log to your singularity command line to save the output.
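
For example (argument list abbreviated):

    singularity run hcpbids.simg [your usual arguments] > output.log 2> errors.log
    # or, to keep everything in one file:
    singularity run hcpbids.simg [your usual arguments] &> combined.log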

Oh snap! That was a typo in my --stages argument. Trying again now - fingers crossed! Thank you!!
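
(For posterity: the stages are passed as space-separated names, something like the line below if I have the app’s interface right, and mine had a stray character in one of them.)

    --stages PreFreeSurfer FreeSurfer PostFreeSurfer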

Hi @aknodt,
Ah, hope that solves it! Sounds like I should add some better input validation. @ me if you still have trouble.

Thanks @rhancockn!! I was able to make it through the PreFreeSurfer stage. It doesn’t seem to have worked properly, though (the final T1w_acpc_dc_restore_brain.nii.gz image is essentially empty), but the issue could easily be with the way I set up the data, etc.
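
To put a number on “essentially empty”, I checked the nonzero volume and mean with fslstats:

    fslstats /work/long/HCP_MPP/HCP-A/sub-6005242/T1w/T1w_acpc_dc_restore_brain.nii.gz -V -M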

I was pretty careful about converting the dataset to BIDS format (it comes as *.nii.gz files and *.json sidecars, so mainly I just did some rearranging and renaming), but it seems like there could easily be an issue with the JSONs or something. I did get this message between Gradient Unwarping and FAST that seems to indicate a problem:


Tue Jul 21 18:33:30 EDT 2020:TopupPreprocessingAll.sh: END: Topup Field Map Generation and Gradient Unwarping
Cannot interpret shift direction = NONE
Cannot interpret shift direction = NONE
Cannot interpret shift direction = NONE
Cannot interpret shift direction = NONE
Running FAST segmentation

I think I’m going to try running the PreFreeSurferPipeline.sh script inside the container, passing it all the arguments manually, since we have had success doing that with some other datasets (sketched below). I will also keep trying to learn more about how to properly configure and run the pipeline. Thought I’d follow up with this in the meantime in case there are any obvious solutions I’m missing.
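
Roughly, the plan is something like this (the script path inside the container is taken from the logs above; the real call needs the full flag set, which I’m still working out):

    singularity exec hcpbids.simg \
        /opt/HCP-Pipelines/PreFreeSurfer/PreFreeSurferPipeline.sh \
        --path=/work/long/HCP_MPP/HCP-A \
        --subject=sub-6005242 \
        [remaining flags]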

Thanks!

Hi @aknodt,
Possibly the gradient unwarp direction is not getting passed correctly. This should be specified with the flag --anat_unwarpdir when running the container; e.g., --anat_unwarpdir z might work for HCP data.

The command line options that PreFreeSurferPipeline.sh is called with should be logged near the beginning of the output. If you post that bit, I can see if anything looks odd there.
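
For example, something like this should pull the relevant line out of a saved log:

    grep -m1 "PreFreeSurferPipeline.sh" output.log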

Piggybacking off this post:

Does anyone know of an hcp2bids script that works for HCP-Aging 2.0 data?

Thanks,
Bram

Hi @bramdiamond,

If the HCP-Aging dataset is organized similarly to HCP1200 (not sure if it is), could you just clone the hcp2bids repo and make a few changes so the parameters correspond to the aging dataset? Looking at the code, not much of it is strictly hard-coded for HCP1200.
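
A minimal sketch, assuming the repo I’m thinking of (double-check the name):

    git clone https://github.com/suyashdb/hcp2bids
    # then adjust the hard-coded filename/parameter mappings for the HCP-Aging naming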

Best,
Steven