Reproducible Dynamic Causal Modeling / model-based effective connectivity analysis for fMRI

spm12
nipype
fmri
neuroimaging
docker
#1

Dear all,

I am doing a Master's degree in Cognitive Neuroscience and was thinking of writing my master's thesis on effective connectivity in Theory of Mind. I like the model-based approach of Dynamic Causal Modeling (DCM), and I have been thinking about how to carry out such an analysis in a reproducible manner. I have been reading a lot recently about reproducible fMRI analysis methods, such as using Nipype in a Docker container.

However, I am unsure how I could achieve the same kind of reproducibility when using SPM, compared to using Nipype and Docker, where I can document every step of the analysis in a Jupyter notebook. Nipype only seems to have interfaces for preprocessing and first-level analysis in SPM12, but not for any of the DCM functionality.

I also found a GitHub repository where someone tried to port DCM to MNE-Python, but the last commit was two years ago, so I don't expect further development there.

Do any of you have an idea of how to address this? Any suggestions are welcome!

Thanks a lot!

#2

Ahoi hoi @mschoettner,

great that you want to make your analyses reproducible and are thinking about it at this early stage. Here are my two cents:

I think you have a few options here overall, depending on what exactly you want to do and how your analysis pipeline is set up:

  1. port DCM to Python (that might be rough)

  2. create a new Nipype interface for DCM, following the way the other SPM functions are implemented (for a nice overview of how to create interfaces, check the corresponding part of the Nipype tutorial)

  3. use DCM within a Function node (for examples of how to use Function nodes, check here)

  4. use a MATLAB kernel within a Jupyter notebook
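To make option 3 a bit more concrete, here is a minimal sketch of what the core of such a Function node could do: a plain Python helper builds the MATLAB script that calls SPM's `spm_dcm_estimate`, and that script string would then be handed to Nipype's `MatlabCommand` interface (or wrapped in a Function node) for execution. The paths and the DCM file name below are hypothetical placeholders, not part of any real pipeline.

```python
# Sketch of option 3: wrap a DCM estimation call for use in a Nipype
# Function node. The helper only builds the MATLAB script string here;
# in a real pipeline you would pass it on via
# nipype.interfaces.matlab.MatlabCommand(script=...) inside the node.
# All paths below are hypothetical placeholders.

def make_dcm_script(spm_dir, dcm_file):
    """Return a MATLAB script that estimates a pre-specified DCM."""
    return "\n".join([
        "addpath('{}');".format(spm_dir),                  # make SPM visible to MATLAB
        "spm('defaults', 'fmri');",                        # standard SPM fMRI defaults
        "DCM = spm_dcm_estimate('{}');".format(dcm_file),  # invert the model
    ])

script = make_dcm_script('/opt/spm12', '/data/sub-01/DCM_tom.mat')
print(script)
```

The DCM itself (regions, inputs, connectivity matrices) would still have to be specified beforehand, e.g. in a `DCM.mat` file, since `spm_dcm_estimate` only does the model inversion.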

Since you also mentioned Docker: you should use the standalone version of SPM within the Docker (or rather Singularity) image, as you will want to run DCM analyses on a server system rather than your local machine. Otherwise, reproducibility becomes more difficult, because MATLAB licenses and versions need to be accounted for. For example Docker images that include the standalone version of SPM, have a look here, here or here. Nipype supports this version of SPM without problems (please see here).
To create a Docker image with the standalone version of SPM, just use Neurodocker, for example like this:

  docker run --rm kaczmarj/neurodocker:master generate docker \
  --base neurodebian:stretch-non-free \
  --pkg-manager apt \
  --spm12 version=dev \
  --user=dcm_repro \
  --miniconda \
    conda_install="python=3.6 jupyter jupyterlab jupyter_contrib_nbextensions nbformat nb_conda" \
    pip_install="https://github.com/nipy/nipype/tarball/master" \
    create_env="dcm_repro" \
    activate=true \
  --run-bash 'source activate dcm_repro && jupyter nbextension enable exercise2/main && jupyter nbextension enable spellchecker/main' \
  --user=dcm_repro \
  --run 'mkdir -p ~/.jupyter && echo c.NotebookApp.ip = \"0.0.0.0\" > ~/.jupyter/jupyter_notebook_config.py' \
  --workdir /home/dcm_repro/dcm_nipype \
  --cmd jupyter-notebook | docker build -
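Inside such an image, Nipype then has to be pointed at the compiled (standalone) SPM12 and its MATLAB Compiler Runtime once, before any SPM node runs. Nipype's `SPMCommand.set_mlab_paths(matlab_cmd=..., use_mcr=True)` does exactly this; a minimal sketch, where the two installation paths are assumptions that depend on the image you use:

```python
# Minimal sketch: tell Nipype to use the compiled (standalone) SPM12 via
# the MATLAB Compiler Runtime, so no MATLAB license is needed inside the
# container. The two paths are assumptions and depend on the image.
SPM_SH = '/opt/spm12/run_spm12.sh'  # launcher shipped with standalone SPM12
MCR = '/opt/mcr/v92'                # MATLAB Compiler Runtime directory

matlab_cmd = '{} {} script'.format(SPM_SH, MCR)

# In a pipeline this would be registered once, before any SPM node runs:
# from nipype.interfaces import spm
# spm.SPMCommand.set_mlab_paths(matlab_cmd=matlab_cmd, use_mcr=True)
print(matlab_cmd)
```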

@Guillaume, do you have any other ideas and/or more insights?

HTH, cheers, Peer

Edit: for a nice thread on SPM and Docker, please have a look here.

#3

Hey Peer,

thanks for the quick and thorough reply!

From the options you listed, I will probably opt for 2., as it seems doable without an excessive amount of effort (unlike porting DCM to Python), while at the same time enabling other researchers to make their DCM analyses reproducible as well.

I have one follow-up question, though, regarding your recommendation to use Singularity and cloud computing over running things locally. Is this a matter of computational power, or are there other reasons you would recommend it? I guess these types of analyses can become quite demanding in terms of resources, depending on how complex the model is.

Anyway, thank you again!

Best,
Mikkel

#4

Ahoi hoi @mschoettner,

no biggie.

Sounds great, keep us posted about your endeavours!

I was indeed referring to the high computational demands DCM analyses can have. While it is of course not impossible to run such analyses on a local machine using Docker (or Singularity), it is definitely not the best option. That being said, the admins of whatever server system and/or cloud-computing resource you might have access to most likely won't allow Docker (it requires root privileges), but might be okay with (or already be using) Singularity.
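In that case the Docker image from above does not go to waste: Singularity can build its image directly from a Docker image. A sketch of the typical commands, where the image name is a hypothetical placeholder for wherever you pushed your Neurodocker-generated image:

```shell
# Convert a Docker image (here a hypothetical one on Docker Hub) into a
# Singularity image on the cluster, then run it without root privileges.
singularity build dcm_repro.simg docker://myuser/spm12-dcm
singularity run dcm_repro.simg
```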

HTH, cheers, Peer
