QSIPrep hangs after executing 'merged_qc'

Summary of what happened:

Hi everyone, I am using QSIPrep to process my diffusion MRI data.
However, it always hangs at the `merged_qc` node and does not move forward at all.

Command used (and if a helper script was used, a link to the helper script or the command generated):

Here is the command I was calling:

qsiprep-docker /Users/runjia/Code/data_for_test_qsiprep/THP /Users/runjia/Code/data_for_test_qsiprep/THP_out participant --output-resolution 2 --fs-license-file ~/fsl/license.txt -w /Users/runjia/Code/data_for_test_qsiprep/THP_out/qsiprep/work


Environment (Docker, Singularity, custom installation):

I ran QSIPrep using Docker on macOS.

Data formatted according to a validatable standard? Please provide the output of the validator:

Relevant log outputs (up to 20 lines):

Here is the log:

221231-04:39:43,634 nipype.workflow INFO:
	 [Node] Finished "raw_fib_qc", elapsed time 6.986714s.
221231-04:39:44,558 nipype.workflow INFO:
	 [Node] Setting-up "qsiprep_wf.single_subject_01_wf.dwi_preproc_ses_1_wf.pre_hmc_wf.merge_and_denoise_wf.dwi_qc_wf.merged_qc" in "/scratch/qsiprep_wf/single_subject_01_wf/dwi_preproc_ses_1_wf/pre_hmc_wf/merge_and_denoise_wf/dwi_qc_wf/merged_qc".
221231-04:39:44,614 nipype.workflow INFO:
	 [Node] Executing "merged_qc" <qsiprep.interfaces.dsi_studio.DSIStudioMergeQC>
221231-04:39:44,655 nipype.workflow INFO:
	 [Node] Finished "merged_qc", elapsed time 0.034806s.
221231-04:39:46,708 nipype.workflow INFO:
	 [Node] Setting-up "qsiprep_wf.single_subject_01_wf.dwi_preproc_ses_1_wf.pre_hmc_wf.dwi_qc_wf.merged_qc" in "/scratch/qsiprep_wf/single_subject_01_wf/dwi_preproc_ses_1_wf/pre_hmc_wf/dwi_qc_wf/merged_qc".
221231-04:39:46,770 nipype.workflow INFO:
	 [Node] Executing "merged_qc" <qsiprep.interfaces.dsi_studio.DSIStudioMergeQC>
221231-04:39:46,828 nipype.workflow INFO:
	 [Node] Finished "merged_qc", elapsed time 0.051984s.

There is no error log because it did not crash; it just hangs here.

Screenshots / relevant information:

Could this be a memory issue? The problem persists even after I raised the Docker memory limit to 16GB.

Does anyone have a clue? Thank you in advance!

Hi @Runjia0124,

I have edited your post to reintroduce the topic template for Software Support questions. You can see there are two items your original question did not address: the QSIPrep version and whether your data are BIDS valid.
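If you have not run the validator yet, one way to do it is via the official BIDS validator Docker image (a sketch using the dataset path from your command; adjust the mount point as needed):

```shell
# Run the BIDS validator against the dataset root (mounted read-only).
# The path below is the dataset directory from the original qsiprep-docker call.
docker run -ti --rm \
  -v /Users/runjia/Code/data_for_test_qsiprep/THP:/data:ro \
  bids/validator /data
```

Pasting the validator's output into the topic makes it much easier to rule out dataset-layout problems.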

Some other things that would be good to know:

  • How long did you wait?
  • Is this error subject-specific?
  • How big are your original DWI files? File size, resolution, and number of directions would be good to know.
  • How many CPUs are you devoting to the job?
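On the last two points, it can help to pin resources explicitly rather than relying on Docker defaults. A sketch below reuses your original command with resource flags added; the flag names follow QSIPrep's fMRIPrep-style CLI, so please verify them against `qsiprep --help` for your installed version:

```shell
# Same invocation as before, with explicit CPU/memory limits and extra
# verbosity (-vv) so the point where it stalls shows up in the log.
# Flag spellings (--nprocs, --omp-nthreads, --mem-mb) may differ slightly
# between QSIPrep versions -- check `qsiprep --help`.
qsiprep-docker /Users/runjia/Code/data_for_test_qsiprep/THP \
    /Users/runjia/Code/data_for_test_qsiprep/THP_out participant \
    --output-resolution 2 \
    --fs-license-file ~/fsl/license.txt \
    --nprocs 4 --omp-nthreads 4 --mem-mb 16000 \
    -w /Users/runjia/Code/data_for_test_qsiprep/THP_out/qsiprep/work \
    -vv
```

If the hang persists with limits that are comfortably inside what Docker Desktop allocates, that would argue against a simple out-of-memory stall.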