fMRIPrep 21.0.1 error

Hello community.

I’m new to fMRIPrep and using a local installation of version 21.0.1. I’m running fMRIPrep on a single subject’s high-resolution data acquired at 7T. This is my command:

fmriprep /home/khushboo/Documents/7T/bids /home/khushboo/Documents/7T/derivatives/ -w /home/khushboo/Documents/7T/derivatives/fmriprep-temp participant --participant-label pilote3a --fd-spike-threshold 0.5 --dvars-spike-threshold 2.0 --output-spaces T1w --output-spaces T1w --bold2t1w-dof 9 --skull-strip-t1w skip --return-all-components --skip_bids_validation --mem-mb 50000 --omp-nthreads 10 --nthreads 12 --fs-no-reconall --fs-license-file /usr/local/freesurfer/license.txt

And I get the following warning and error:

220218-01:54:52,391 cli WARNING:
Telemetry disabled because sentry_sdk is not installed.
220218-01:54:55,959 nipype.workflow IMPORTANT:
Running fMRIPrep version 21.0.1

     License NOTICE ##################################################
     fMRIPrep 21.0.1
     Copyright 2021 The NiPreps Developers.
     
     This product includes software developed by
     the NiPreps Community (https://nipreps.org/).
     
     Portions of this software were developed at the Department of
     Psychology at Stanford University, Stanford, CA, US.
     
     This software redistributes the versioneer Python package, which is
     Public domain source code.
     
     This software is also distributed as a Docker container image.
     The bootstraping file for the image ("Dockerfile") is licensed
     under the MIT License.
     
     This software may be distributed through an add-on package called
     "Docker Wrapper" that is under the BSD 3-clause License.
     #################################################################

220218-01:54:55,998 nipype.workflow IMPORTANT:
Building fMRIPrep's workflow:
* BIDS dataset path: /home/khushboo/Documents/7T/bids.
* Participant list: ['pilote3a'].
* Run identifier: 20220218-015450_4dd6f6d2-0d5a-4dc5-8f98-e4fb393d9bbd.
* Output spaces: T1w.
* Pre-run FreeSurfer's SUBJECTS_DIR: /home/khushboo/Documents/7T/derivatives/sourcedata/freesurfer.
Process Process-2:
Traceback (most recent call last):
  File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/khushboo/.local/lib/python3.9/site-packages/fmriprep/cli/workflow.py", line 118, in build_workflow
    retval["workflow"] = init_fmriprep_wf()
  File "/home/khushboo/.local/lib/python3.9/site-packages/fmriprep/workflows/base.py", line 85, in init_fmriprep_wf
    single_subject_wf = init_single_subject_wf(subject_id)
  File "/home/khushboo/.local/lib/python3.9/site-packages/fmriprep/workflows/base.py", line 259, in init_single_subject_wf
    anat_preproc_wf = init_anat_preproc_wf(
  File "/home/khushboo/.local/lib/python3.9/site-packages/smriprep/workflows/anatomical.py", line 327, in init_anat_preproc_wf
    ants_ver=ANTsInfo.version() or "(version unknown)",
  File "/usr/lib/python3/dist-packages/nipype/interfaces/base/core.py", line 1152, in version
    klass._version = klass.parse_version(raw_info)
  File "/usr/lib/python3/dist-packages/nipype/interfaces/ants/base.py", line 47, in parse_version
    if "post" in v_string and LooseVersion(v_string) >= LooseVersion(
  File "/usr/lib/python3.9/distutils/version.py", line 70, in __ge__
    c = self._cmp(other)
  File "/usr/lib/python3.9/distutils/version.py", line 341, in _cmp
    if self.version < other.version:
TypeError: '<' not supported between instances of 'str' and 'int'

Can anyone please help me make sense of this error and resolve it?
Thanks,
Khushboo

Hello,

Are you running in Docker, Singularity, or Python? Also, is your dataset BIDS-valid? In addition, you specified --output-spaces twice.

It looks like you may not have ANTs installed, or your version may be too old, so I am guessing you are running in Python. You should either install/update ANTs or use Singularity/Docker (recommended).

Steven


This looks like a recent issue in nipype with not handling dev versions of ANTs correctly. I would suggest using the same version of ANTs as fMRIPrep ships with.
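For context, the TypeError comes from how distutils' LooseVersion tokenizes version strings: numeric components become ints while alphabetic components such as "dev" or "post" stay strings, and Python 3 refuses to compare the two. A minimal sketch of the mechanism (the tokenizer below is my own illustration of the behavior, not nipype's or distutils' actual code):

```python
# Sketch of LooseVersion-style tokenization and why comparing a dev/post
# build string against a plain release raises TypeError on Python 3.
import re

_component_re = re.compile(r"(\d+|[a-z]+|\.)")

def tokenize(v_string):
    """Split e.g. '2.3.dev' into [2, 3, 'dev'], mixing ints and strings."""
    parts = []
    for piece in _component_re.findall(v_string):
        if piece == ".":
            continue
        parts.append(int(piece) if piece.isdigit() else piece)
    return parts

release = tokenize("2.3.4")    # [2, 3, 4]
dev = tokenize("2.3.dev")      # [2, 3, 'dev']

try:
    dev < release              # element-wise compare reaches 'dev' vs 4
except TypeError as exc:
    print("TypeError:", exc)   # same class of failure as in the log above
```

This is why upgrading ANTs to a plain release version (or matching fMRIPrep's pinned version) sidesteps the comparison entirely.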

Hello,
I was running fMRIPrep in Python with an older version of ANTs. Even after updating ANTs, the error above persisted (I did remove the repeated --output-spaces flag). I then ran it in Docker; it starts fine, but it still fails with the following log:

220218-13:53:18,235 nipype.workflow WARNING:
Previous output generated by version 0+unknown found.
220218-13:53:18,287 nipype.workflow IMPORTANT:
Building fMRIPrep's workflow:
* BIDS dataset path: /data.
* Participant list: ['pilote3a'].
* Run identifier: 20220218-135311_5d601af3-be46-4e4d-b1ef-e7ec9362b2c8.
* Output spaces: T1w.
* Pre-run FreeSurfer's SUBJECTS_DIR: /home/khushboo/Documents/7T/derivatives/freesurfer.
220218-13:53:19,840 nipype.workflow INFO:
No single-band-reference found for sub-pilote3a_task-Interception_run-01_bold.nii.gz.
220218-13:53:19,957 nipype.workflow IMPORTANT:
BOLD series will be slice-timing corrected to an offset of 0.507s.
220218-13:53:20,78 nipype.workflow INFO:
No single-band-reference found for sub-pilote3a_task-Interception_run-02_bold.nii.gz.
220218-13:53:20,182 nipype.workflow IMPORTANT:
BOLD series will be slice-timing corrected to an offset of 0.507s.
220218-13:53:22,471 nipype.workflow INFO:
fMRIPrep workflow graph with 529 nodes built successfully.
220218-13:53:30,179 nipype.workflow IMPORTANT:
fMRIPrep started!
220218-13:53:30,590 nipype.workflow WARNING:
Some nodes exceed the total amount of memory available (50.00GB).
220218-13:53:32,382 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.bold_source” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/bold_source”.
220218-13:53:32,385 nipype.workflow INFO:
[Node] Running “bold_source” (“nipype.interfaces.utility.base.Select”)
220218-13:53:32,388 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.func_derivatives_wf.raw_sources” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/func_derivatives_wf/raw_sources”.
220218-13:53:32,390 nipype.workflow INFO:
[Node] Running “raw_sources” (“nipype.interfaces.utility.wrappers.Function”)
220218-13:53:32,417 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.bold_source”.
220218-13:53:32,419 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.func_derivatives_wf.raw_sources”.
220218-13:53:32,430 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.bold_source” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/bold_source”.
220218-13:53:32,432 nipype.workflow INFO:
[Node] Running “bold_source” (“nipype.interfaces.utility.base.Select”)
220218-13:53:32,463 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.bold_source”.
220218-13:53:32,465 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.func_derivatives_wf.raw_sources” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/func_derivatives_wf/raw_sources”.
220218-13:53:32,466 nipype.workflow INFO:
[Node] Running “raw_sources” (“nipype.interfaces.utility.wrappers.Function”)
220218-13:53:32,469 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.func_derivatives_wf.raw_sources”.
220218-13:53:32,691 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.initial_boldref_wf.val_bold” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/initial_boldref_wf/val_bold”.
220218-13:53:32,694 nipype.workflow INFO:
[Node] Setting-up “_val_bold0” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/initial_boldref_wf/val_bold/mapflow/_val_bold0”.
220218-13:53:32,694 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.initial_boldref_wf.get_dummy” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/initial_boldref_wf/get_dummy”.
220218-13:53:32,696 nipype.workflow INFO:
[Node] Running “_val_bold0” (“niworkflows.interfaces.header.ValidateImage”)
220218-13:53:32,697 nipype.workflow INFO:
[Node] Running “get_dummy” (“niworkflows.interfaces.bold.NonsteadyStatesDetector”)
220218-13:53:32,708 nipype.workflow INFO:
[Node] Finished “_val_bold0”.
220218-13:53:32,710 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.initial_boldref_wf.val_bold”.
220218-13:53:32,711 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.initial_boldref_wf.get_dummy” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/initial_boldref_wf/get_dummy”.
220218-13:53:32,713 nipype.workflow INFO:
[Node] Running “get_dummy” (“niworkflows.interfaces.bold.NonsteadyStatesDetector”)
220218-13:53:32,769 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.initial_boldref_wf.val_bold” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/initial_boldref_wf/val_bold”.
220218-13:53:32,772 nipype.workflow INFO:
[Node] Setting-up “_val_bold0” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/initial_boldref_wf/val_bold/mapflow/_val_bold0”.
220218-13:53:32,775 nipype.workflow INFO:
[Node] Running “_val_bold0” (“niworkflows.interfaces.header.ValidateImage”)
220218-13:53:32,792 nipype.workflow INFO:
[Node] Finished “_val_bold0”.
220218-13:53:32,794 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.initial_boldref_wf.val_bold”.
220218-13:53:33,527 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.bidssrc” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/bidssrc”.
220218-13:53:33,529 nipype.workflow INFO:
[Node] Running “bidssrc” (“niworkflows.interfaces.bids.BIDSDataGrabber”)
220218-13:53:33,556 nipype.interface INFO:
No “t2w” images found for sub-pilote3a
220218-13:53:33,556 nipype.interface INFO:
No “flair” images found for sub-pilote3a
220218-13:53:33,556 nipype.interface INFO:
No “sbref” images found for sub-pilote3a
220218-13:53:33,556 nipype.interface INFO:
No “roi” images found for sub-pilote3a
220218-13:53:33,560 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.bidssrc”.
220218-13:53:34,794 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.select_bold” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/select_bold”.
220218-13:53:34,899 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.select_bold” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_02_wf/select_bold”.
220218-13:53:34,946 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_template_wf.t1w_ref_dimensions” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_ref_dimensions”.
220218-13:53:35,98 nipype.workflow INFO:
[Node] Running “select_bold” (“nipype.interfaces.utility.base.Select”)
220218-13:53:35,125 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.select_bold”.
220218-13:53:35,166 nipype.workflow INFO:
[Node] Running “select_bold” (“nipype.interfaces.utility.base.Select”)
220218-13:53:35,193 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_02_wf.select_bold”.
220218-13:53:35,316 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.bids_info” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/bids_info”.
220218-13:53:35,318 nipype.workflow INFO:
[Node] Running “bids_info” (“niworkflows.interfaces.bids.BIDSInfo”)
220218-13:53:35,325 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.bids_info”.
220218-13:53:35,538 nipype.workflow INFO:
[Node] Running “t1w_ref_dimensions” (“niworkflows.interfaces.images.TemplateDimensions”)
220218-13:53:35,587 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_template_wf.t1w_ref_dimensions”.
220218-13:53:36,875 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_derivatives_wf.raw_sources” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_derivatives_wf/raw_sources”.
220218-13:53:37,120 nipype.workflow INFO:
[Node] Running “raw_sources” (“nipype.interfaces.utility.wrappers.Function”)
220218-13:53:37,144 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_derivatives_wf.raw_sources”.
220218-13:53:39,146 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform2” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform2”.
220218-13:53:39,148 nipype.workflow INFO:
[Node] Running “_t1w_conform2” (“niworkflows.interfaces.images.Conform”)
220218-13:53:39,153 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform1” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform1”.
220218-13:53:39,155 nipype.workflow INFO:
[Node] Running “_t1w_conform1” (“niworkflows.interfaces.images.Conform”)
220218-13:53:39,156 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform0” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform0”.
220218-13:53:39,157 nipype.workflow INFO:
[Node] Running “_t1w_conform0” (“niworkflows.interfaces.images.Conform”)
220218-13:53:39,176 nipype.workflow INFO:
[Node] Finished “_t1w_conform2”.
220218-13:53:39,190 nipype.workflow INFO:
[Node] Finished “_t1w_conform1”.
220218-13:53:39,193 nipype.workflow INFO:
[Node] Finished “_t1w_conform0”.
220218-13:53:40,926 nipype.workflow INFO:
[Node] Setting-up “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_template_wf.t1w_conform” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform”.
220218-13:53:40,929 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform0” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform0”.
220218-13:53:40,930 nipype.workflow INFO:
[Node] Cached “_t1w_conform0” - collecting precomputed outputs
220218-13:53:40,930 nipype.workflow INFO:
[Node] “_t1w_conform0” found cached.
220218-13:53:40,931 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform1” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform1”.
220218-13:53:40,932 nipype.workflow INFO:
[Node] Cached “_t1w_conform1” - collecting precomputed outputs
220218-13:53:40,932 nipype.workflow INFO:
[Node] “_t1w_conform1” found cached.
220218-13:53:40,933 nipype.workflow INFO:
[Node] Setting-up “_t1w_conform2” in “/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/anat_preproc_wf/anat_template_wf/t1w_conform/mapflow/_t1w_conform2”.
220218-13:53:40,934 nipype.workflow INFO:
[Node] Cached “_t1w_conform2” - collecting precomputed outputs
220218-13:53:40,934 nipype.workflow INFO:
[Node] “_t1w_conform2” found cached.
220218-13:53:40,935 nipype.workflow INFO:
[Node] Finished “fmriprep_wf.single_subject_pilote3a_wf.anat_preproc_wf.anat_template_wf.t1w_conform”.
exception calling callback for <Future at 0x7fd0b916dee0 state=finished raised BrokenProcessPool>
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
    callback(self)
  File "/opt/conda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 159, in _async_callback
    result = args.result()
  File "/opt/conda/lib/python3.8/concurrent/futures/_base.py", line 437, in result
    return self.__get_result()
  File "/opt/conda/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
[... the same BrokenProcessPool traceback repeats for several more futures ...]
220218-13:53:45,525 nipype.workflow CRITICAL:
fMRIPrep failed: A child process terminated abruptly, the process pool is not usable anymore

How much memory/cpu are you devoting to the job in Docker?

50 GB. This was the command:

sudo docker run -ti --rm \
  -v /home/khushboo/Documents/7T/bids:/data:ro \
  -v /home/khushboo/Documents/7T/derivatives:/out \
  -v /home/khushboo/Documents/7T/fmriprep-temp:/work \
  -v /usr/local/freesurfer/license.txt:/opt/freesurfer/license.txt \
  nipreps/fmriprep:21.0.1 \
  /data /out/fmriprep-21.0.1 \
  participant --participant-label pilote3a \
  -w /home/khushboo/Documents/7T/fmriprep-temp \
  --output-spaces T1w --skull-strip-t1w skip \
  --fd-spike-threshold 0.5 --dvars-spike-threshold 2.0 --bold2t1w-dof 9 \
  --return-all-components --skip_bids_validation \
  --mem-mb 50000 --omp-nthreads 10 --nthreads 12 \
  --fs-no-reconall --fs-subjects-dir /home/khushboo/Documents/7T/derivatives/freesurfer

So the 50 GB specified by --mem-mb is an fMRIPrep flag telling it not to exceed that much memory, but you need to make sure your computing environment actually has 50 GB to give it. What operating system are you using? I am going to guess this is a personal macOS or Linux machine (as opposed to a Linux HPC, which usually does not allow Docker in favor of Singularity). You may need to go into your Docker settings to make sure the 50 GB and 12 CPUs you request are available.

It is a personal Linux laptop with 16 GB RAM and 16 CPUs. I corrected my mistake of allotting more memory than was available on my device: I added the --cpus="12" -m 12g flags to the docker command, and fMRIPrep is running now. But it intermittently generates the following errors while continuing to run.

220218-15:20:36,587 nipype.workflow WARNING:
[Node] Error on "fmriprep_wf.single_subject_pilote3a_wf.func_preproc_task_Interception_run_01_wf.bold_hmc_wf.mcflirt" (/home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/bold_hmc_wf/mcflirt)
220218-15:20:36,833 nipype.workflow ERROR:
Node mcflirt failed to run on host 97481845581f.
220218-15:20:36,840 nipype.workflow ERROR:
Saving crash info to /out/fmriprep-21.0.1/sub-pilote3a/log/20220218-151506_12c2f193-f6f3-4943-8a96-2b75be95334b/crash-20220218-152036-root-mcflirt-b0edd84b-4725-477b-9fcc-77d294f492c3.txt
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 741, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/opt/conda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 428, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 822, in _run_interface
    self.raise_exception(runtime)
  File "/opt/conda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 749, in raise_exception
    raise RuntimeError(
RuntimeError: Command:
mcflirt -in /data/sub-pilote3a/func/sub-pilote3a_task-Interception_run-01_bold.nii.gz -out /home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/bold_hmc_wf/mcflirt/sub-pilote3a_task-Interception_run-01_bold_mcf.nii.gz -reffile /home/khushboo/Documents/7T/fmriprep-temp/fmriprep_wf/single_subject_pilote3a_wf/func_preproc_task_Interception_run_01_wf/initial_boldref_wf/gen_avg/sub-pilote3a_task-Interception_run-01_bold_average.nii.gz -mats -plots -rmsabs -rmsrel
Standard output:

Standard error:
Killed
Return code: 137
Are these also related to memory? I have specified a temp working directory. I’m still working with pilot data; once I finalize the acquisition methods and the optional processing flags, I’ll move the processing to a Singularity container on a Linux HPC in the lab.

Yes, return code 137 usually indicates the process was killed for exceeding available memory. I would say if you have gotten to the point where this is your only error, you should feel good about taking the next step to the HPC. Good luck!
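As a quick way to see where 137 comes from: POSIX shells report a signal-killed child as 128 plus the signal number, and SIGKILL is signal 9, the signal the Linux out-of-memory killer sends. A small sketch (the throwaway shell command stands in for mcflirt):

```python
# Exit status 137 = 128 + 9: the process was SIGKILL'ed, which is what the
# kernel's OOM killer does when a process (here, mcflirt) exhausts memory.
import signal
import subprocess

# Kill a throwaway shell with SIGKILL, the same signal the OOM killer sends.
proc = subprocess.run(["sh", "-c", "kill -9 $$"])

print("raw signal:", -proc.returncode)              # 9 (SIGKILL)
print("shell-style status:", 128 + signal.SIGKILL)  # 137, as in the crash log
```

So any tool in the pipeline that dies with 137 is a memory-ceiling problem, not a bug in that tool.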

I’m getting 2 errors, in slice-timing correction and mcflirt; both have return code 137. I’ll try running the process on the HPC now. Thanks for your help :)

I am getting a similar error and I do not know how to resolve this.

	 Building fMRIPrep's workflow:
           * BIDS dataset path: /Users/loukas/Downloads/fair-bids-main/heudiconv_data/bids.
           * Participant list: ['219'].
           * Run identifier: 20220512-130433_5726c148-34a3-47dd-9633-1ea86c5f7ce7.
           * Output spaces: MNI152NLin2009cAsym:res-native.
           * Pre-run FreeSurfer's SUBJECTS_DIR: /Users/loukas/Downloads/fair-bids-main/heudiconv_data/res/sourcedata/freesurfer.
Process Process-2:
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/opt/anaconda3/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/anaconda3/lib/python3.8/site-packages/fmriprep/cli/workflow.py", line 118, in build_workflow
    retval["workflow"] = init_fmriprep_wf()
  File "/opt/anaconda3/lib/python3.8/site-packages/fmriprep/workflows/base.py", line 85, in init_fmriprep_wf
    single_subject_wf = init_single_subject_wf(subject_id)
  File "/opt/anaconda3/lib/python3.8/site-packages/fmriprep/workflows/base.py", line 146, in init_single_subject_wf
    from smriprep.workflows.anatomical import init_anat_preproc_wf
  File "/opt/anaconda3/lib/python3.8/site-packages/smriprep/workflows/anatomical.py", line 39, in <module>
    from niworkflows.interfaces.freesurfer import (
  File "/opt/anaconda3/lib/python3.8/site-packages/niworkflows/interfaces/freesurfer.py", line 44, in <module>
    from .reportlets.registration import BBRegisterRPT, MRICoregRPT
  File "/opt/anaconda3/lib/python3.8/site-packages/niworkflows/interfaces/reportlets/registration.py", line 242, in <module>
    if LooseVersion("0.0.0") < fs.Info.looseversion() < LooseVersion("6.0.0"):
TypeError: '<' not supported between instances of 'LooseVersion' and 'LooseVersion'

Can you please let us know:

  1. your version of fmriprep
  2. is your dataset BIDS valid?
  3. how you are running fmriprep (Singularity, Docker, Python, etc.)
  4. the command you used to run fmriprep
  5. is your error subject specific?

Best,
Steven

Hi,

Sure. Here is the info.

  1. fMRIPrep v21.0.2
  2. bids-validator returns:
bids-validator@1.6.2

	1: [WARN] Tabular file contains custom columns not described in a data dictionary (code: 82 - CUSTOM_COLUMN_WITHOUT_DESCRIPTION)
		./sub-219/func/sub-219_task-rest_run-01_events.tsv
			Evidence: Columns: TODO -- fill in rows and add more tab-separated columns if desired not defined, please define in: /events.json, /task-rest_events.json,/run-01_events.json,/task-rest_run-01_events.json,/sub-219/sub-219_events.json,/sub-219/sub-219_task-rest_events.json,/sub-219/sub-219_run-01_events.json,/sub-219/sub-219_task-rest_run-01_events.json,/sub-219/func/sub-219_events.json,/sub-219/func/sub-219_task-rest_events.json,/sub-219/func/sub-219_run-01_events.json,/sub-219/func/sub-219_task-rest_run-01_events.json

	Please visit https://neurostars.org/search?q=CUSTOM_COLUMN_WITHOUT_DESCRIPTION for existing conversations about this issue.


        Summary:                 Available Tasks:                     Available Modalities:
        13 Files, 48.53MB        rest                                 T1w
        1 - Subject              TODO: full task name for rest        bold
        1 - Session                                                   events


	If you have any questions, please post on https://neurostars.org/tags/bids.

  3. python 3.8.8
  4. fmriprep bids/ res/ participant -w work/ inside a directory containing the bids subdirectory.
  5. I only tried with one subject

Can you try a containerized (Docker/Singularity) version? The error message seems to indicate that software in your environment is either not installed or installed with an incompatible version. Using a container, you would not need to worry about that.

I tried with Docker but cannot overcome the FreeSurfer license error.

I downloaded the license txt file and placed it in my $FREESURFER_HOME directory, but the docker command still complains, even after a reboot.

    Running fMRIPREP version 20.2.1:
      * BIDS dataset path: /data.
      * Participant list: ['219'].
      * Run identifier: 20220512-135241_583436bd-c889-4456-bf5e-fdf6a27f4d28.
      * Output spaces: MNI152NLin2009cAsym:res-native.
      * Pre-run FreeSurfer's SUBJECTS_DIR: /out/freesurfer.
220512-13:52:49,232 nipype.workflow INFO:
	 No single-band-reference found for sub-219_task-rest_run-01_bold.nii.gz.
220512-13:52:49,820 nipype.workflow IMPORTANT:
	 Slice-timing correction will be included.
220512-13:52:51,408 nipype.workflow CRITICAL:
	 ERROR: a valid license file is required for FreeSurfer to run. fMRIPrep looked for an existing license file at several paths, in this order: 1) command line argument ``--fs-license-file``; 2) ``$FS_LICENSE`` environment variable; and 3) the ``$FREESURFER_HOME/license.txt`` path. Get it (for free) by registering at https://surfer.nmr.mgh.harvard.edu/registration.html
fMRIPrep: Please report errors to https://github.com/nipreps/fmriprep/issues
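The message spells out the lookup order fMRIPrep uses. A rough sketch of that precedence (find_fs_license is a hypothetical helper of mine, not fMRIPrep's actual code) also shows why a license in the host's $FREESURFER_HOME is invisible inside a container unless you mount it:

```python
# Illustrative sketch of the license lookup order the error message describes.
import os

def find_fs_license(cli_arg=None, env=None):
    """Return the first existing license file among: the --fs-license-file
    CLI argument, $FS_LICENSE, then $FREESURFER_HOME/license.txt."""
    env = os.environ if env is None else env
    candidates = [
        cli_arg,                                   # 1) command-line flag
        env.get("FS_LICENSE"),                     # 2) environment variable
        os.path.join(env["FREESURFER_HOME"], "license.txt")
        if env.get("FREESURFER_HOME") else None,   # 3) FreeSurfer home
    ]
    for path in candidates:
        if path and os.path.isfile(path):
            return path
    return None  # fMRIPrep then aborts with the CRITICAL error above
```

Inside a Docker container only the container's filesystem is searched, so either pass --fs-license-file with a path that exists in the container, or bind-mount the file (e.g. -v /usr/local/freesurfer/license.txt:/opt/freesurfer/license.txt, as in the docker command earlier in this thread).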

Solved. It was the file-sharing settings of Docker. Thanks @Steven !
