fMRIPrep: how to incorporate pre-run FreeSurfer?

I’m sorry if this has already been asked somewhere.

How/can I run fmriprep without having to rerun freesurfer if I already have surfaces for my participants? I have two use cases for this:

  1. re-running the functional processing with different parameters
  2. working with freesurfer outputs from the HCPPipelines…

It sounds (from some other threads) like this is possible, but we are having trouble figuring out the right command-line arguments from the docs.



When you run fmriprep, you should specify an output directory.

If you place your precomputed FreeSurfer results there, they will automatically be picked up by fmriprep.

In practice: if you intend to point your outputs to /some/output/folder, then store the FreeSurfer directory under that folder (/some/output/folder/freesurfer). The tree should then look like:


Finally, you call fmriprep like fmriprep /path/to/bids_root /some/output/folder --participant_label <subject_label>
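The setup above can be sketched as follows. The paths are placeholders, and a temporary directory stands in for the real output folder so the snippet is self-contained:

```shell
# Hypothetical layout: $OUT stands in for /some/output/folder.
OUT=$(mktemp -d)

# Put the precomputed FreeSurfer subject under $OUT/freesurfer
# before invoking fmriprep.
mkdir -p "$OUT/freesurfer/sub-01/surf"

# fmriprep would then be called as (not run here):
#   fmriprep /path/to/bids_root "$OUT" --participant_label sub-01

ls "$OUT/freesurfer"
```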



But it looks like it still reruns FreeSurfer even with the results under the /some/output/folder/freesurfer folder.
How can we avoid rerunning FreeSurfer?

Can you share your full command, as well as the output that indicates re-running FreeSurfer?

I just wanted to follow up here. I copied my pre-run freesurfer output into the directory fmriprep would write it into, and ran the following command:

docker run -ti --rm \
     -v /Users/ilkay.isik/localApps/freesurfer/license.txt:/opt/freesurfer/license.txt:ro \
     -v /Users/ilkay.isik/project_folder_temp/fc_content/MRI_data/BIDS/Nifti:/data:ro \
     -v /Users/ilkay.isik/project_folder_temp/fc_content/MRI_data/BIDS/Nifti:/out \
     poldracklab/fmriprep:latest \
     /data /out/out \
     participant \
     --participant-label sub-01 \
     --ignore slicetiming \
     --output-space T1w template fsnative fsaverage \
     --use-aroma \
     --write-graph \
     --work-dir /out/out/temp_files/sub-01

then this happens:

RuntimeError: Command:
recon-all -autorecon-hemi rh -noparcstats -nocortparc2 -noparcstats2 -nocortparc3 -noparcstats3 -nopctsurfcon -nohyporelabel -noaparc2aseg -noapas2aseg -nosegstats -nowmparc -nobalabels -openmp 4 -subjid sub-01 -sd /out/out/freesurfer -notessellate -nosmooth1 -noinflate1 -noqsphere -nofix -nosmooth2 -noinflate2 -nocurvstats -nosphere -nosurfreg -nojacobian_white -noavgcurv -nopial
Standard output:
INFO: FreeSurfer build stamps do not match
Subject Stamp: freesurfer-Darwin-lion-stable-pub-v5.3.0
Current Stamp: freesurfer-Linux-centos6_x86_64-stable-pub-v6.0.1-f53a55a
INFO: SUBJECTS_DIR is /out/out/freesurfer
Actual FREESURFER_HOME /opt/freesurfer
Linux 907aa3ab8041 4.9.93-linuxkit-aufs #1 SMP Wed Jun 6 16:55:56 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
INFO: current FREESURFER_HOME does not match that of previous processing.
    Current: /opt/freesurfer
    Previous: /Users/ilkay.isik/localApps/freesurfer
'/opt/freesurfer/bin/recon-all' -> '/out/out/freesurfer/sub-01/scripts/recon-all.local-copy'
#@# Make White Surf rh Thu Oct 11 19:02:49 UTC 2018

mris_make_surfaces -aseg ../mri/aseg.presurf -white white.preaparc -noaparc -mgz -T1 brain.finalsurfs sub-01 rh 

using white.preaparc as white matter name...
using aseg volume ../mri/aseg.presurf to prevent surfaces crossing the midline
not using aparc to prevent surfaces crossing the midline
INFO: assuming MGZ format for volumes.
using brain.finalsurfs as T1 volume...
$Id: mris_make_surfaces.c,v 2016/12/13 22:26:32 zkaufman Exp $
$Id: mrisurf.c,v 1.781.2.6 2016/12/27 16:47:14 zkaufman Exp $
reading volume /out/out/freesurfer/sub-01/mri/filled.mgz...
reading volume /out/out/freesurfer/sub-01/mri/brain.finalsurfs.mgz...
reading volume /out/out/freesurfer/sub-01/mri/../mri/aseg.presurf.mgz...
mghRead(/out/out/freesurfer/sub-01/mri/../mri/aseg.presurf.mgz, -1): could not open file
mris_make_surfaces: could not read segmentation volume /out/out/freesurfer/sub-01/mri/../mri/aseg.presurf.mgz
Linux 907aa3ab8041 4.9.93-linuxkit-aufs #1 SMP Wed Jun 6 16:55:56 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux

recon-all -s sub-01 exited with ERRORS at Thu Oct 11 19:02:50 UTC 2018

For more details, see the log file /out/out/freesurfer/sub-01/scripts/recon-all-rh.log
To report a problem, see

Standard error:

Return code: 1

181011-19:03:07,938 nipype.workflow INFO:
	 [Node] Finished "fmriprep_wf.single_subject_01_wf.anat_preproc_wf.surface_recon_wf.fsnative_2_t1_xfm".
181011-19:03:09,499 nipype.workflow INFO:
	 [Node] Setting-up "fmriprep_wf.single_subject_01_wf.anat_preproc_wf.surface_recon_wf.t1_2_fsnative_xfm" in "/out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/surface_recon_wf/t1_2_fsnative_xfm".
181011-19:03:09,509 nipype.workflow INFO:
	 [Node] Running "t1_2_fsnative_xfm" ("nipype.interfaces.freesurfer.utils.LTAConvert"), a CommandLine Interface with command:
lta_convert --inlta /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/surface_recon_wf/fsnative_2_t1_xfm/T1_robustreg.lta --invert --outlta /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/surface_recon_wf/t1_2_fsnative_xfm/out.lta
181011-19:03:09,719 nipype.workflow INFO:
	 [Node] Finished "fmriprep_wf.single_subject_01_wf.anat_preproc_wf.surface_recon_wf.t1_2_fsnative_xfm".
181011-19:03:11,505 nipype.workflow INFO:
	 [Node] Setting-up "fmriprep_wf.single_subject_01_wf.anat_preproc_wf.anat_derivatives_wf.lta_2_itk" in "/out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/anat_derivatives_wf/lta_2_itk".
181011-19:03:11,518 nipype.workflow INFO:
	 [Node] Running "lta_2_itk" ("nipype.interfaces.freesurfer.utils.LTAConvert"), a CommandLine Interface with command:
lta_convert --inlta /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/surface_recon_wf/t1_2_fsnative_xfm/out.lta --outitk /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/anat_derivatives_wf/lta_2_itk/out.txt
181011-19:03:11,699 nipype.workflow INFO:
	 [Node] Finished "fmriprep_wf.single_subject_01_wf.anat_preproc_wf.anat_derivatives_wf.lta_2_itk".
181011-19:03:15,357 nipype.workflow ERROR:
	 could not run node: fmriprep_wf.single_subject_01_wf.anat_preproc_wf.surface_recon_wf.autorecon_resume_wf.autorecon_surfs
181011-19:03:15,360 nipype.workflow ERROR:
	 could not run node: fmriprep_wf.single_subject_01_wf.anat_preproc_wf.surface_recon_wf.autorecon_resume_wf.autorecon_surfs
Errors occurred while generating reports for participants: 01 (2).

Could you help me understand this error?

Oh, after staring at the error for a while, I guess the issue here is that recon-all was run using an older version of FreeSurfer.


Yes. FreeSurfer changed some of its steps between versions. The Nipype interface we use checks the outputs of each step against the expected outputs of the current version to see if anything is left undone. So because we are checking a 5.3 directory with 6.0, it will see the reconstruction as unfinished.
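One quick way to check which version produced an existing reconstruction is to read the build stamp that recon-all writes into each subject's scripts/ directory. Here a temporary directory simulates a pre-run subject so the example is self-contained; with real data, $SUBJECTS_DIR would be your freesurfer output folder:

```shell
# Simulate a pre-run subject directory (placeholder for real data).
SUBJECTS_DIR=$(mktemp -d)
mkdir -p "$SUBJECTS_DIR/sub-01/scripts"
echo "freesurfer-Darwin-lion-stable-pub-v5.3.0" \
    > "$SUBJECTS_DIR/sub-01/scripts/build-stamp.txt"

# recon-all records its version here; a 5.3 stamp read by a 6.0 binary
# is what triggers the "build stamps do not match" message in the log.
cat "$SUBJECTS_DIR/sub-01/scripts/build-stamp.txt"
```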

While the FreeSurfer group highly recommends using 6.0, if you absolutely need 5.3-reconstructed subjects, the easiest way to do this will be to create a new image with 5.3 in it. Let me know if you want to do that and you want any guidance.


No, I don’t have to stick to 5.3.
Probably the best option for me is to rerun fmriprep with recon-all and upgrade the reconstructions to version 6.
Thanks for the offer though :slight_smile:

Hi again, I need another follow up on that issue…
I need to run fmriprep with pre-run freesurfer for a couple of subjects.

When I ran the following command (with the pre-run FreeSurfer directory placed where fmriprep would write it), it completed almost all of the steps successfully.

docker run -ti --rm \
     -v /Users/ilkay.isik/localApps/freesurfer/license.txt:/opt/freesurfer/license.txt:ro \
     -v /Users/ilkay.isik/project_folder_temp/fc_content/MRI_data/data_BIDS/Nifti:/data:ro \
     -v /Users/ilkay.isik/project_folder_temp/fc_content/MRI_data/data_BIDS/Nifti:/out \
     poldracklab/fmriprep:1.1.8 \
     /data /out/out \
     participant \
     --participant-label sub-$subj \
     --ignore slicetiming \
     --output-space T1w template fsnative fsaverage \
     --use-aroma \
     --write-graph \
     --nthreads 2 --n_cpus 3 --mem_mb 29000 \
     --work-dir /out/out/temp_files/sub-$subj

but in the end gives this error:

could not run node: fmriprep_wf.single_subject_01_wf.anat_preproc_wf.t1_2_mni
Errors occurred while generating reports for participants: 01 (1).

And this is what is written in the crash report:

Node: fmriprep_wf.single_subject_01_wf.anat_preproc_wf.t1_2_mni
Working directory: /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/anat_preproc_wf/t1_2_mni

Node inputs:

compress_report = auto
explicit_masking = True
flavor = precise
float = True
initial_moving_transform = <undefined>
lesion_mask = <undefined>
moving = T1
moving_image = <undefined>
moving_mask = <undefined>
num_threads = 2
orientation = RAS
out_report = report.svg
reference = T1
reference_image = <undefined>
reference_mask = <undefined>
settings = <undefined>
template = mni_icbm152_nlin_asym_09c
template_resolution = 1

Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/plugins/", line 69, in run_node
    result['result'] =
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/engine/", line 408, in run
    cached, updated = self.is_cached()
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/engine/", line 294, in is_cached
    hashed_inputs, hashvalue = self._get_hashval()
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/engine/", line 488, in _get_hashval
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/engine/", line 531, in _get_inputs
    self.set_input(key, deepcopy(output_value))
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/pipeline/engine/", line 276, in set_input
    setattr(self.inputs, parameter, deepcopy(val))
  File "/usr/local/miniconda/lib/python3.6/site-packages/nipype/interfaces/base/", line 103, in validate
    validated_value = super(File, self).validate(object, name, value)
  File "/usr/local/miniconda/lib/python3.6/site-packages/traits/", line 411, in validate
    self.error( object, name, value )
  File "/usr/local/miniconda/lib/python3.6/site-packages/traits/", line 172, in error
    value )
traits.trait_errors.TraitError: The 'lesion_mask' trait of a RobustMNINormalizationInputSpecRPT instance must be an existing file name, but a value of ['/data/out/fmriprep/sub-01/anat/sub-01_T1w_label-aparcaseg_roi.nii.gz', '/data/out/fmriprep/sub-01/anat/sub-01_T1w_label-aseg_roi.nii.gz'] <class 'list'> was specified.
Error setting node input:
Node: t1_2_mni
input: lesion_mask
results_file: /out/out/temp_files/sub-01/fmriprep_wf/single_subject_01_wf/bidssrc/result_bidssrc.pklz
value: ['/data/out/fmriprep/sub-01/anat/sub-01_T1w_label-aparcaseg_roi.nii.gz', '/data/out/fmriprep/sub-01/anat/sub-01_T1w_label-aseg_roi.nii.gz']

Hi @ilkay_isik. The problem is that your derivatives are in a subdirectory of your dataset and are being indexed (we ignore <bids-root>/derivatives, but not all subdirectories). So when we output _roi files (which we will stop doing soon; the naming conventions for those segmentation files will be standardized in the next release), those are picked up in later runs.

There are a few solutions:

  1. You can delete your /data/out/fmriprep directory; any future runs will re-populate it, so there’s no harm.
  2. You can move your derivative into /data/derivatives instead of /data/out.
  3. You can move your derivatives into some other directory that’s not a subdirectory of /data.

Options 2 and 3 are “correct” ways to organize derivatives, according to the release candidate spec, and we will know not to index them.

Option 1 is probably not a great solution, as it’s easy to forget to delete them, and how subdirectories like out/ are treated might vary a lot from app to app.
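Option 2 above can be sketched like this, with a temporary directory standing in for the dataset root (replace it with your real /data path):

```shell
# Stand-in for the BIDS root; substitute your actual dataset path.
BIDS=$(mktemp -d)
mkdir -p "$BIDS/out/fmriprep"      # derivatives in a non-ignored subdirectory
mkdir -p "$BIDS/derivatives"

# Move the derivatives under <bids-root>/derivatives,
# which fmriprep knows not to index.
mv "$BIDS/out/fmriprep" "$BIDS/derivatives/"

ls "$BIDS/derivatives"
```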


Yes that was it. Thank you!

3 posts were split to a new topic: Pre-run, multi-session FreeSurfer and fMRIPrep

Hey there effigies! Sorry to bring back such an old topic, but I have FreeSurfer v5.3 brains I want to use with fMRIPrep v20.0.5 (currently it only handles FreeSurfer v6.0, as you’ve mentioned). I would really like to save processing time by using these pre-run FreeSurfer data. How do I go about “creating a new image with 5.3 in it” to make this possible?

Thanks so much!!

The approach to doing that would be to check out that version of fMRIPrep:

git clone
cd fmriprep
git checkout 20.0.5

Then edit the Dockerfile here:

The new URL should point to the FreeSurfer 5.3.0 tarball instead of the 6.0.1 one. It’s possible that you’ll need to change the --exclude list; I’m not sure of the consequences of excluding something that isn’t there. You can just remove all of the --excludes; the only result will be a larger Docker image.

Once you’ve done that, run docker build, as described in Rebuild Docker Image.

If you do this, you should be very explicit in any publications that you have used a modified version of fMRIPrep 20.0.5 to work with FreeSurfer 5.3.0. It’s possible that the text we generate hard-codes the assumption of FreeSurfer 6.0.1, so you might need to adjust that as well.

If you have access to a cluster with a reasonably low queue time and don’t need to reuse 5.3.0 data to maintain compatibility with results published in another study, I would seriously consider biting the bullet and rerunning. I’ve run subjects on 16 cores with 48GB of RAM in under 12 hours each, including FreeSurfer.