QSIPrep reconstruction

Hi
I was wondering if you could help me. I was doing a test run with QSIPrep: preprocessing completed successfully, but during the recon phase I get the error below. I'm running it on Ubuntu 20.04 with about 12 GB of free memory. I googled the error and noticed from similar fMRIPrep issues that this may be a memory problem. If so, do you know any way to get around it on my local machine?
Thanks in advance for any help!
Cheers
Meher

[Node] Running "tractography" ("qsiprep.interfaces.mrtrix.TckGen"), a CommandLine Interface with command:
tckgen -act /tmp/work/qsirecon_wf/sub-A00008326_mrtrix_singleshell_ss3t/anat_ingress_wf/create_5tt/sub-A00008326_desc-preproc_T1w_5tt.mif -algorithm iFOD2 -backtrack -crop_at_gmwmi -maxlength 250.000000 -minlength 30.000000 -samples 4 -nthreads 7 -output_seeds out_seeds.nii.gz -power 0.330000 -seed_dynamic /tmp/work/qsirecon_wf/sub-A00008326_mrtrix_singleshell_ss3t/recon_wf/ss3t_csd/dwi_file…qsiprep-output…sub-A00008326…ses-ALGA…dwi…sub-A00008326_ses-ALGA_space-T1w_desc-preproc_dwi.nii.gz/intensity_norm/sub-A00008326_ses-ALGA_space-T1w_desc-preproc_dwi_wm_mtnorm.mif -select 10000000 /tmp/work/qsirecon_wf/sub-A00008326_mrtrix_singleshell_ss3t/recon_wf/ss3t_csd/dwi_file…qsiprep-output…sub-A00008326…ses-ALGA…dwi…sub-A00008326_ses-ALGA_space-T1w_desc-preproc_dwi.nii.gz/intensity_norm/sub-A00008326_ses-ALGA_space-T1w_desc-preproc_dwi_wm_mtnorm.mif tracked.tck
210728-13:38:55,723 nipype.workflow INFO:
b''
210728-13:38:55,723 nipype.workflow INFO:
b''
exception calling callback for <Future at 0x7fdb2e849ef0 state=finished raised BrokenProcessPool>
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 324, in _invoke_callbacks
callback(self)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 159, in _async_callback
result = args.result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 425, in result
return self.__get_result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
exception calling callback for <Future at 0x7fdb2e849828 state=finished raised BrokenProcessPool>
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 324, in _invoke_callbacks
callback(self)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 159, in _async_callback
result = args.result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 425, in result
return self.__get_result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 324, in _invoke_callbacks
callback(self)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 159, in _async_callback
result = args.result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 425, in result
return self.__get_result()
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.

If you are unable to devote more memory to the job, one option is to reduce how many processes QSIPrep runs in parallel (a rough example follows below); the other is to run the MRtrix commands yourself.
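For the first option, the relevant flags cap nipype's parallelism and the memory it assumes is available. A BrokenProcessPool from nipype's MultiProc plugin usually means a worker was OOM-killed, so giving each node more headroom often fixes it. The flag spellings below should match 2021-era QSIPrep releases, but double-check them against qsiprep --help for your version; all paths are placeholders:

qsiprep /path/to/bids /path/to/output participant \
    --recon-only \
    --recon-input /path/to/output/qsiprep \
    --recon-spec mrtrix_singleshell_ss3t \
    --nthreads 2 --omp-nthreads 2 \
    --mem_mb 10000 \
    -w /tmp/work

With fewer concurrent processes and threads, the tckgen node (which your log shows running with -nthreads 7) is much less likely to push the machine past its 12 GB and be killed.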
Assuming you are using a fully preprocessed QSIPrep derivative, the steps for the manual route are roughly (for the MRtrix recon specs):

dwi2response
ss3t_csd_beta1 or dwi2fod (depending on whether your data are single- or multi-shell)
mtnormalise
tckgen
tcksift2
tck2connectome

Since QSIPrep outputs follow codified naming conventions, making a script that does these steps should be pretty feasible; a rough sketch with placeholder filenames is below. Hope this helps.
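To make that concrete, here is a minimal sketch of the single-shell 3-tissue route. The filenames are placeholders (adjust them to your derivative layout), the 5TT image is the one QSIPrep's anat ingress already produced, the parcellation image is assumed to exist, and ss3t_csd_beta1 comes from the MRtrix3Tissue fork rather than stock MRtrix3:

# Convert the preprocessed DWI plus gradients into MRtrix format, and get a mask.
mrconvert sub-XX_space-T1w_desc-preproc_dwi.nii.gz dwi.mif \
    -fslgrad sub-XX_space-T1w_desc-preproc_dwi.bvec sub-XX_space-T1w_desc-preproc_dwi.bval
dwi2mask dwi.mif mask.mif

# Tissue response functions; the dhollander algorithm handles single-shell data.
dwi2response dhollander dwi.mif wm_response.txt gm_response.txt csf_response.txt

# Single-shell 3-tissue CSD (use dwi2fod msmt_csd here instead for multi-shell data).
ss3t_csd_beta1 dwi.mif wm_response.txt wmfod.mif \
    gm_response.txt gm.mif csf_response.txt csf.mif -mask mask.mif

# Joint intensity normalisation of the tissue ODFs.
mtnormalise wmfod.mif wmfod_norm.mif gm.mif gm_norm.mif \
    csf.mif csf_norm.mif -mask mask.mif

# Anatomically constrained probabilistic tractography, mirroring the tckgen
# call from your log (5tt.mif is the *_5tt.mif from the QSIPrep anat ingress).
tckgen wmfod_norm.mif tracks.tck -act 5tt.mif -backtrack -crop_at_gmwmi \
    -seed_dynamic wmfod_norm.mif -select 10000000 -maxlength 250 -minlength 30

# SIFT2 streamline weights, then a weighted connectome from a labelled parcellation.
tcksift2 tracks.tck wmfod_norm.mif sift2_weights.txt -act 5tt.mif
tck2connectome tracks.tck parcellation.mif connectome.csv \
    -tck_weights_in sift2_weights.txt -symmetric -zero_diagonal

Run standalone like this, tckgen still needs a few GB, but you control exactly what else is running at the same time.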

Best,
Steven

Thanks so much! I really appreciate you taking the time to respond!
Regards
Meher