Using Nipype under Docker



Dear Neuroinfo practitioners,

I would like to deploy a Docker image for using Nipype with ANTS to perform a VBM analysis.

It would be great to have a little tutorial on using Nipype under Docker. How do I build an image that contains
both Nipype and the other software packages (FSL, ANTs, SPM, etc.) together?

Any help regarding this would be highly appreciated, and if I get it working I would be happy to share it with others
in the community.



The Nipype docker image includes many of those tools.

details here:

images here:


Thank you Satra for the pointer.

But I would love to have an example of how to use this Docker image.

I guess I git clone the nipype repo and then build it using
docker build -t nipype/nipype -f docker/Dockerfile_py35 .

And then I run it, but how do I get data in and out?

Thanks again for your help.


Hi @rajatthomas,

Since you are not planning on extending/modifying the current nipype docker image (it already contains AFNI, ANTs, FSL, FreeSurfer, etc), you are probably better off just pulling the docker image:

 docker pull nipype/nipype

I am currently pushing a fresh new image for you (this will be automated as of next release).

In order to run your code, I suggest adding it as a mounted directory:

docker run --rm -it -v <path-to-your-code>:/root/workspace/code -v <path-to-your-data>:/data -v <path-to-a-workdir>:/scratch -w /scratch nipype/nipype

In that command, three mount points are specified (the -v arguments); you can adjust them as you wish.
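Putting the pull and run steps together, here is a minimal end-to-end sketch; the host paths and the script name run_vbm.py are placeholders I am assuming, not anything fixed by the image:

```shell
# Pull the prebuilt image once
docker pull nipype/nipype

# Run a script mounted from the host. Paths before each colon are
# placeholder locations on your machine; paths after the colon are
# where they appear inside the container.
docker run --rm -it \
    -v /home/me/code:/root/workspace/code \
    -v /home/me/data:/data \
    -v /home/me/scratch:/scratch \
    -w /scratch \
    nipype/nipype \
    python /root/workspace/code/run_vbm.py
```

Anything the script writes under /data or /scratch lands in the corresponding mounted host directory, which is how results get back out of the container.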

If you need any further help, please let us know.


Is there any mechanism to combine multiple dockerized NiPype nodes and run commands in parallel?

Something akin to the 'PBS' plugin, but instead a 'MultiDockerTargets' plugin or something similar?


@dagutman - not right now, but hopefully in the new overhaul we are working on.


I am confused about the MATLAB licenses for Nipype and Docker.

Do we need to point to a MATLAB license on the host? Or is MATLAB not needed for SPM when it runs inside Docker?

This is bugging me!


Nipype uses a compiled (standalone) version of SPM inside docker, so a MATLAB license is not needed. This has now been superseded by the Neurodocker project, which allows you to build arbitrary neuroimaging containers with specific versions of software.
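As a rough sketch of the Neurodocker workflow: you generate a Dockerfile with pinned tool versions, then build it. The exact flag names vary between Neurodocker releases, so treat the options below as an assumption and confirm them with `neurodocker generate docker --help`:

```shell
# Generate a Dockerfile with pinned tool versions (flag names taken
# from a recent Neurodocker release -- verify against --help locally).
neurodocker generate docker \
    --pkg-manager apt \
    --base-image debian:bullseye \
    --ants version=2.4.3 \
    --fsl version=6.0.5 > Dockerfile

# Then build the container as usual:
docker build -t my-neuro-image .
```

The image tag my-neuro-image is arbitrary; SPM standalone can be added the same way via the corresponding Neurodocker option for your release.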


Wanted to follow up on this… has anyone played around with SGE + Docker/Singularity (or other similar grid technology)?

Now that I am starting to use the docker container more regularly for my computation, it would be nice to be able to distribute it and run things at scale…


At the University of Iowa, our Argon cluster uses SGE and has Singularity installed. I’ve been using that setup to run MRIQC and FMRIPREP successfully on our cluster. I didn’t do any of the HPC admin work myself, but I’ve asked them a lot of annoying questions. If you have any specific questions not covered in the linked documentation, I can try pinging our HPC admins to get some answers.


Would you mind sending me a sample bash script and/or submit script that you’re running, to dagutman —at—

I imagine I can just modify my PBS submit script and append singularity run nipype or equivalent… but would like to have a known working script so I can troubleshoot.


This is a template I used for FMRIPREP; I just used sed to replace SUBJECT with the subject label in the dataset:


#$ -pe smp 16
#$ -q UI
#$ -m bea
#$ -M
#$ -o /Shared/vosslabhpc/Projects/PACR-AD/Imaging/BIDS/code/fmriprep/out
#$ -e /Shared/vosslabhpc/Projects/PACR-AD/Imaging/BIDS/code/fmriprep/err
singularity run -H ${HOME}/singularity_home -B /Shared/vosslabhpc:/mnt \
/Shared/vosslabhpc/UniversalSoftware/SingularityContainers/jdkent_fmriprep_AromaFix-2018-02-12-0353a5bbb24d.img \
/mnt/Projects/PACR-AD/Imaging/BIDS /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives \
participant --participant_label SUBJECT \
-w /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/work/fmriprepAromaFix \
--write-graph --mem_mb 35000 --omp-nthreads 10 --nthreads 16 --use-aroma \
--output-space template \
--template MNI152NLin2009cAsym \
--fs-license-file /mnt/UniversalSoftware/freesurfer_license.txt


  • I have the containers saved in a particular directory: /Shared/vosslabhpc/UniversalSoftware/SingularityContainers/
  • I have singularity use an empty directory as its “HOME” directory, so that no dot files can overwrite the environment variables set by the singularity container: -H ${HOME}/singularity_home
  • I know there are other (probably better) ways of submitting subjects to the cluster, but this is working for me right now.
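The sed-based substitution mentioned above can be sketched as follows. The file names are hypothetical, and the one-line stand-in template is only there so the snippet is self-contained; in practice you would use the full SGE script, with the literal word SUBJECT wherever the label belongs:

```shell
# Stand-in one-line template for illustration; in practice this is the
# full SGE job script above, with SUBJECT as the placeholder label.
printf 'participant --participant_label SUBJECT\n' > fmriprep_template.job

# Generate one job script per subject and (on the cluster) submit it.
for sub in sub-01 sub-02; do
    sed "s/SUBJECT/${sub}/g" fmriprep_template.job > "fmriprep_${sub}.job"
    # qsub "fmriprep_${sub}.job"   # uncomment to submit on the cluster
done
```

Each generated file has the placeholder replaced by the subject label, so the qsub line submits one fully specified job per subject.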

Hope this helps!