Singularity and heudiconv

Hi, I am trying to follow the http://reproducibility.stanford.edu/bids-tutorial-series-part-2a/ (which is great) using singularity rather than docker. Each time I try to run the command using singularity, I receive the error “no match”. I am not very familiar with singularity, so I assume this is due to where singularity is stored vs the data?

singularity run /usr/pubsw/packages/heudiconv/heudiconv.simg --bind /data:/base nipy/heudiconv:latest -d /base/dicom/sub-{subject}/ses-{session}/scans/*/*.dcm -o /base/nifti/ -f convertall -s 01 -ss 001 -c none --overwrite

Any help to get started would be very appreciated.

Hi @laurenbreithaupt,

Your command line reflects some confusion about how singularity works. You are pointing at a singularity image stored at /usr/pubsw/packages/heudiconv/heudiconv.simg; if that path is correct and the image was built correctly (by any of the several possible ways of building one), then you do not need the part of your command line that looks like a docker image identifier (nipy/heudiconv:latest).

You have also placed the --bind argument in the wrong position: options for singularity itself must come before the image path, while everything after the image is passed to heudiconv.

The appropriate command line is:

singularity run -B /data:/base /usr/pubsw/packages/heudiconv/heudiconv.simg -d /base/dicom/sub-{subject}/ses-{session}/scans/*/*.dcm -o /base/nifti/ -f convertall -s 01 -ss 001 -c none --overwrite
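
One more detail that can trip people up: heudiconv expands the {subject}/{session} placeholders and the * wildcards itself, so it is usually safer to quote the -d template so your login shell does not try to glob it first. For example (same paths as above):

singularity run -B /data:/base /usr/pubsw/packages/heudiconv/heudiconv.simg -d '/base/dicom/sub-{subject}/ses-{session}/scans/*/*.dcm' -o /base/nifti/ -f convertall -s 01 -ss 001 -c none --overwrite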

Hi @oesteban,
Appreciate your help. When I try the command suggested, I receive a new error: heudiconv: error: unrecognized arguments: Desktop/lauren Documents/archives_below_308 matlab/startup.m .dcm

What is the exact command line you are running? It seems different from the earlier post. Did you switch to docker?

I am also trying to get heudiconv to work using Singularity. I have it working using Docker, and the following command lines:

docker run --rm -it -v /media/danella/DATA1/BEAM:/base nipy/heudiconv:latest -d /base/Dicom/sub-{subject}/ses-{session}/*/*.dcm -o /base/Nifti_temp/ -f convertall -s 103 -ss pre -c none

docker run --rm -it -v /media/danella/DATA1/BEAM:/base nipy/heudiconv:latest -d /base/Dicom/sub-{subject}/ses-{session}/*/*.dcm -o /base/Nifti_temp/ -f /base/Nifti/code/convert_BEAM.py -s 103 -ss pre -c dcm2niix -b

I attempted to convert this to singularity using the following code (where DATA_RAW has been defined as the "base" directory above):

singularity run -B ${DATA_RAW}:${HOME}/data ${HOME}/heudiconv-unstable.simg -d ${HOME}/data/Dicom/sub-{subject}/ses-{session}/*/*.dcm -o ${HOME}/data/Nifti_temp/ -f convertall -s 103 -ss pre -c none

singularity run -B ${DATA_RAW}:${HOME}/data ${HOME}/heudiconv-unstable.simg -d ${HOME}/data/Dicom/sub-{subject}/ses-{session}/*/*.dcm -o ${HOME}/data/Nifti_temp/ -f ${HOME}/data/Nifti/code/convert_BEAM.py -s 103 -ss pre -c dcm2niix -b

The first command runs, and the .heudiconv directory is created. However, the second one crashes after converting the first run, with the following error:

ERROR: Embedding failed: [Errno 1] Operation not permitted: '/home/danella/data/Nifti_temp/sub-103/ses-pre/func/sub-103_ses-pre_task-cogert1_run-001_bold.json'
INFO: Post-treating /home/danella/data/Nifti_temp/sub-103/ses-pre/func/sub-103_ses-pre_task-cogert1_run-001_bold.json file
Traceback (most recent call last):
  File "/opt/miniconda-latest/bin/heudiconv", line 11, in <module>
    load_entry_point('heudiconv', 'console_scripts', 'heudiconv')()
  File "/src/heudiconv/heudiconv/cli/run.py", line 127, in main
    process_args(args)
  File "/src/heudiconv/heudiconv/cli/run.py", line 323, in process_args
    dcmconfig=args.dcmconfig,)
  File "/src/heudiconv/heudiconv/convert.py", line 199, in prep_conversion
    dcmconfig=dcmconfig,)
  File "/src/heudiconv/heudiconv/convert.py", line 318, in convert
    treat_infofile(scaninfo)
  File "/src/heudiconv/heudiconv/utils.py", line 256, in treat_infofile
    set_readonly(filename, False)
  File "/src/heudiconv/heudiconv/utils.py", line 389, in set_readonly
    os.chmod(path, new_perms)
PermissionError: [Errno 1] Operation not permitted: '/home/danella/data/Nifti_temp/sub-103/ses-pre/func/sub-103_ses-pre_task-cogert1_run-001_bold.json'

I am a complete newbie when it comes to Singularity, so any assistance would be much appreciated!

This may be unhelpful, but have you tried pulling a different version of heudiconv?

I created my singularity image using singularity pull docker://nipy/heudiconv:0.5.4 (we have singularity 3 installed on our cluster), but also have it working with the debian heudiconv release: https://hub.docker.com/r/nipy/heudiconv/tags
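
In case it helps, roughly what that looks like end to end (the local filename singularity generates may differ on your system, and /path/to/your/data is a placeholder for wherever your BEAM folder lives):

singularity pull docker://nipy/heudiconv:0.5.4
singularity run -B /path/to/your/data:/base heudiconv_0.5.4.sif -d '/base/Dicom/sub-{subject}/ses-{session}/*/*.dcm' -o /base/Nifti_temp/ -f convertall -s 103 -ss pre -c none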

See however my issue at: Overwriting heudiconv sub.auto.txt and sub.edit.txt
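
It could also be worth checking whether the drive your output lands on actually allows chmod, since the traceback dies inside os.chmod: Errno 1 (Operation not permitted) there usually means either the filesystem does not support Unix permissions (e.g. an NTFS/exFAT external disk) or the file is owned by a different user. Two quick checks, using the host path from your docker command:

ls -ld /media/danella/DATA1/BEAM/Nifti_temp
mount | grep DATA1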

I am also at the Martinos center, and got the same "singularity: No match." error. The center uses tcsh as the default shell. When I typed the same command in bash, it worked fine:

tcsh$ bash
bash$ singularity run -B /autofs/space/piper_003/users/path/to/my/files/:/base /usr/pubsw/packages/heudiconv/heudiconv.simg -d /base/dicom/{subject}/* -o /base/nifti/ -f convertall -s mysubject080 -c none

I also discovered that just passing the absolute host path is not enough; the -B bind is essential, because that is what makes the files visible (and readable) to Singularity inside the container. If you leave out the -B host:container mapping, heudiconv finds nothing and only prints:
INFO: Need to process 0 study sessions
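
A quick sanity check for the bind (adjust the host path to yours; this assumes the image has ls available, which the heudiconv container does as far as I can tell) is to list the data from inside the container:

singularity exec -B /autofs/space/piper_003/users/path/to/my/files/:/base /usr/pubsw/packages/heudiconv/heudiconv.simg ls /base/dicom

If that doesn't show your DICOM folders, heudiconv won't see them either.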


Apologies for reviving an old thread, but I followed the above commands and I've noticed two things that have completely flummoxed me:

singularity run -B /common/projects/[study]/:/base /common/tools/containers/heudiconv/heudiconv_0.5.4.sif -d /base/dicom/01/01-{subject}/*/* -o /base/subjects/01/01-{subject}/nii -f reproin -c dcm2niix -b -s 001 003 004 005 006 007 008 009 010 011 --minmeta -l .

  1. When it runs, it only creates a subjects/01/01-{subject} directory and never substitutes the actual subject ID. I was expecting subjects/01/01-001, subjects/01/01-003, etc., each with a nii directory containing the .nii files; instead only the literal {subject} directory is created, even when I pass multiple subjects.

  2. It doesn't actually create any .nii files; instead it creates the .json and .tsv files, CHANGES, README, and a .heudiconv directory which seems to just contain more files I have no use for.

I'm not sure what to do from here: it does appear to be reading the directories and seeing the files, but it doesn't create the .nii files for reasons I don't understand, and it doesn't create the directory structure either (see the sketch below). Of note, the subjects' data have already been de-identified, and I only have access to the de-identified dcm files.
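
Roughly, the layout I was expecting versus what I actually get (names abbreviated, so treat this as an illustration rather than exact output):

expected: subjects/01/01-001/nii/*.nii.gz, subjects/01/01-003/nii/*.nii.gz, ...
actual:   subjects/01/01-{subject}/ holding .json and .tsv files, CHANGES, README, and .heudiconv/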

Any ideas anyone?