Synthstrip failed

Summary of what happened:

I am running QSIPrep on a couple of subjects as a test run, but the pipeline consistently fails at the synthstrip step. I initially suspected a version-related issue, so I tested both QSIPrep 1.0.0 and 1.0.1 (latest), but preprocessing fails with the same error in both cases.

Command used (and if a helper script was used, a link to the helper script or the command generated):

Here is my code:

list=$1
# Pull the subject ID for this array task (the Nth line of the list file)
sub=$(awk "NR==${SLURM_ARRAY_TASK_ID}" "${list}")

echo "${sub}"

singularity run --cleanenv \
-B /cbica/comp_space/OASIS/:/work/ \
-B /cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/Data/Init_Downloaded/scans/All/batch_01/:/data/ \
-B /cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/Protocols/batch_01/:/out/ \
/cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/qsiprep-1.0.0.sif \
/data/ /out/ participant \
--ignore fieldmaps t2w flair \
--dwi-only \
--separate-all-dwis \
--participant-label ${sub} \
--anat-modality T1w \
--subject-anatomical-reference sessionwise \
--nthreads 4 \
--work-dir /work/ \
--skip-bids-validation \
--unringing-method mrdegibbs \
--no-b0-harmonization \
--output-resolution 2

Here is my submit script:

# Count the subjects in the list (one per line)
n=$(wc -l ${indir}/Lists/OASIS_Batch01_ | awk '{print $1}')
echo "Count: ${n}"

## submit array job
sbatch --array=1-${n} --propagate=NONE --output ${indir}/logs/%x_%j.log --mem=256G --cpus-per-task=4 --time=24:00:00 ${scripts}/qsiprep_wrapper_new.sh ${indir}/Lists/OASIS_Batch01_

Version:

apptainer build /cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/qsiprep-1.0.0.sif docker://pennlinc/qsiprep:1.0.0

Input data appears valid and follows BIDS structure.
Example input:

srinivad@cubic-login3 /cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024 $ ls Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/
ses-d0757/  ses-d2430/	ses-d3132/  ses-d3746/	ses-d4467/
srinivad@cubic-login3 /cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024 $ tree Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/ses-d0757/
Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/ses-d0757/
├── anat
│   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.json
│   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.nii.gz
│   ├── sub-OAS30001_ses-d0757_run-01_T1w.json
│   ├── sub-OAS30001_ses-d0757_run-01_T1w.nii.gz
│   ├── sub-OAS30001_ses-d0757_run-02_T1w.json
│   ├── sub-OAS30001_ses-d0757_run-02_T1w.nii.gz
│   ├── sub-OAS30001_ses-d0757_T2star.json
│   ├── sub-OAS30001_ses-d0757_T2star.nii.gz
│   ├── sub-OAS30001_ses-d0757_T2w.json
│   └── sub-OAS30001_ses-d0757_T2w.nii.gz
├── dwi
│   ├── sub-OAS30001_ses-d0757_dwi.bval
│   ├── sub-OAS30001_ses-d0757_dwi.bvec
│   ├── sub-OAS30001_ses-d0757_dwi.json
│   └── sub-OAS30001_ses-d0757_dwi.nii.gz
└── func
    ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.json
    ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.nii.gz
    ├── sub-OAS30001_ses-d0757_task-rest_run-02_bold.json
    └── sub-OAS30001_ses-d0757_task-rest_run-02_bold.nii.gz

3 directories, 18 files

Relevant log outputs (up to 20 lines):

250506-12:31:24,758 nipype.workflow INFO:
         [Node] Finished "rigid_acpc_resample_aseg", elapsed time 114.615632s.
250506-12:31:24,790 nipype.workflow INFO:
         [Node] Finished "synthstrip", elapsed time 19.822667s.
250506-12:31:24,790 nipype.workflow WARNING:
         Storing result file without outputs
250506-12:31:24,791 nipype.workflow WARNING:
         [Node] Error on "qsiprep_1_0_wf.sub_OAS30001_ses_d0757_wf.dwi_preproc_ses_d0757_wf.hmc_sdc_wf.pre_eddy_b0_ref_wf.synthstrip_wf.synthstrip" (/work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_wf/dwi_preproc_ses_d0757_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/synthstrip_wf/synthstrip)
250506-12:31:24,797 nipype.workflow ERROR:
         Node synthstrip failed to run on host 2115fmn018.
250506-12:31:24,806 nipype.workflow ERROR:
         Saving crash info to /out/sub-OAS30001/log/20250506-103318_26756fb0-ef66-443a-a5f3-75832e96db92/crash-20250506-123124-oasis-synthstrip-5f41af70-56f9-46e7-8b54-65c2c2571aec.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node synthstrip.

Cmdline:
        mri_synthstrip -i /work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_wf/dwi_preproc_ses_d0757_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/synthstrip_wf/pad_before_synthstrip_wf/resample_skulled_to_reference/topup_imain_avg_trans.nii.gz -o topup_imain_avg_trans_brain.nii.gz -m topup_imain_avg_trans_mask.nii.gz
Stdout:
        Configuring model on the CPU
        Running SynthStrip model version 1
        Input image read from: /work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_wf/dwi_preproc_ses_d0757_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/synthstrip_wf/pad_before_synthstrip_wf/resample_skulled_to_reference/topup_imain_avg_trans.nii.gz
Stderr:
        DeprecationWarning: the `interpolation=` argument to percentile was renamed to `method=`, which has additional options.
        Users of the modes 'nearest', 'lower', 'higher', or 'midpoint' are encouraged to review the method they used. (Deprecated NumPy 1.22)
        RuntimeWarning: invalid value encountered in divide
        Traceback (most recent call last):
          File "/opt/freesurfer/bin/mri_synthstrip", line 232, in <module>
            mask = (components == (np.argmax(bincount) + 1))
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/numpy/core/fromnumeric.py", line 1216, in argmax
            return _wrapfunc(a, 'argmax', axis=axis, out=out, **kwds)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/numpy/core/fromnumeric.py", line 57, in _wrapfunc
            return bound(*args, **kwds)
        ValueError: attempt to get argmax of an empty sequence
Traceback:
        Traceback (most recent call last):
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/freesurfer.py", line 192, in _run_interface
            raise Exception('mri_synthstrip failed!')
        Exception: mri_synthstrip failed!

Any guidance on how to resolve this would be much appreciated.

Hi @Dhivya,

Why are you specifying --anat-modality T1w and --subject-anatomical-reference sessionwise if you are using --dwi-only? Can you access the files listed in the error message in your working directory and see if they look okay (e.g., in an image viewer)?
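
For example, here is a rough sketch of a programmatic check with nibabel/numpy (run it in any Python environment that has them installed; the path is the one from your crash log, and /work is the bind for /cbica/comp_space/OASIS/ outside the container):

# Sketch: check whether the synthstrip input image is empty (all zeros/NaNs).
import nibabel as nib
import numpy as np

img = nib.load(
    "/work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_wf/dwi_preproc_ses_d0757_wf/"
    "hmc_sdc_wf/pre_eddy_b0_ref_wf/synthstrip_wf/pad_before_synthstrip_wf/"
    "resample_skulled_to_reference/topup_imain_avg_trans.nii.gz"
)
data = img.get_fdata()
print("shape:", data.shape)
print("non-zero voxels:", np.count_nonzero(np.nan_to_num(data)))
print("min/max:", np.nanmin(data), np.nanmax(data))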

Best,
Steven

Hi Steven,

Thanks for pointing out the contradictory input arguments. I had run into a different error earlier, which led me to try using --dwi-only as a workaround, but I will remove that argument and re-run.

/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_wf/dwi_preproc_ses_d0757_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/synthstrip_wf/pad_before_synthstrip_wf/resample_skulled_to_reference/topup_imain_avg_trans.nii.gz seems empty.

Could you tell me if the remaining input arguments look appropriate?

Hi @Dhivya,

You should only be using --unringing-method mrdegibbs if your PartialFourier is 1 in the DWI json. If it is less than that (e.g., 0.75) then use rpg instead.
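
If it helps, here is a rough sketch for checking that across all of your DWI sidecars (assuming the batch_01 BIDS root you showed above; PartialFourier may be missing from some JSONs):

# Sketch: print the PartialFourier value from every DWI JSON sidecar.
import json
from pathlib import Path

bids_root = Path(
    "/cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/"
    "Data/Init_Downloaded/scans/All/batch_01"
)
for sidecar in sorted(bids_root.glob("sub-*/ses-*/dwi/*_dwi.json")):
    meta = json.loads(sidecar.read_text())
    print(sidecar.name, "PartialFourier =", meta.get("PartialFourier", "not set"))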

Is there a reason you are using --no-b0-harmonization?

Do both T1w images look okay in your dataset?

Best,
Steven

Thanks for clarifying! PartialFourier is 0.75, so I will switch the unringing method to rpg. I used --no-b0-harmonization because I was trying to avoid any unintended interactions across scans that might have contributed to some earlier errors.

And yes, both T1w images look good.

I am getting another error after removing --dwi-only and setting rpg as the unringing method. I have also re-enabled b0 harmonization (i.e., dropped --no-b0-harmonization). Here is the error:

       [Node] Error on "qsiprep_1_0_wf.sub_OAS30002_ses_d1680_wf.dwi_preproc_ses_d1680_run_01_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis" (/work/qsiprep_1_0_wf/sub_OAS30002_ses_d1680_wf/dwi_preproc_ses_d1680_run_01_wf/pre_hmc_wf/merge_and_denoise_wf/merge_dwis)
250506-15:25:48,302 nipype.workflow ERROR:
         Node merge_dwis failed to run on host 2115fmn024.
250506-15:25:48,316 nipype.workflow ERROR:
         Saving crash info to /out/sub-OAS30002/log/20250506-151439_d90cce61-3d08-406f-a89e-915289067321/crash-20250506-152548-oasis-merge_dwis-4665347c-7703-44b3-b3bb-f5acd92eba72.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
        Traceback (most recent call last):
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
            to_concat, b0_means, corrections = harmonize_b0s(
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
            harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
            result = eval(formula, data_dict)
          File "<string>", line 1, in <module>
        NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")

Did you try a new working directory? Can you try more subjects?

I am trying a different working directory and running 15 subjects now. It seems to be getting stuck at the b0-harmonization step (the error occurs while merging DWIs). Some subjects have a single run and others have more than one. When I checked the b-values, I can see multiple shells.

One run has:
0 50 350 600 900 1150 100 400 700 950 1250 150 450 700 1000 1300 200 500 800 1050 1350 300 550 850 1100 1400

The other run within the same session has:
0 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000

The b-values also differ across subjects and sessions. Sorry, I don't have an acquisition protocol for this study.
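
For reference, this is roughly how I checked the b-values per run (a quick numpy sketch over the batch_01 directory; the rounding to the nearest 50 is just to make the shell structure easier to see):

# Sketch: summarize the unique (rounded) b-values of each DWI run.
import numpy as np
from pathlib import Path

bids_root = Path(
    "/cbica/projects/OASIS/OASIS3/Pipelines/OASIS3_DTI_2024/"
    "Data/Init_Downloaded/scans/All/batch_01"
)
for bval_file in sorted(bids_root.glob("sub-*/ses-*/dwi/*_dwi.bval")):
    bvals = np.loadtxt(bval_file)
    # Round to the nearest 50 so near-identical b-values collapse into one shell.
    shells = np.unique(np.round(bvals / 50) * 50).astype(int)
    print(bval_file.name, "->", shells)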

new_b0_path /work/qsiprep_1_0_wf/sub_OAS30010_ses_d0068_wf/dwi_preproc_ses_d0068_run_01_wf/hmc_sdc_wf/gather_inputs/sub-OAS30010_ses-d0068_run-01_dwi_b0-00.nii.gz
250507-10:53:09,789 nipype.workflow INFO:
         [Node] Finished "gather_inputs", elapsed time 0.638378s.
250507-10:53:10,993 nipype.workflow ERROR:
         Node merge_dwis failed to run on host 2115fmn005.
250507-10:53:11,9 nipype.workflow ERROR:
         Saving crash info to /out/sub-OAS30010/log/20250507-104838_8f2ec833-f04b-4d89-8658-4977d1cd0aa5/crash-20250507-105311-oasis-merge_dwis-0b77fb73-9135-4b66-afb1-23a84e13e248.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
        Traceback (most recent call last):
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
            to_concat, b0_means, corrections = harmonize_b0s(
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
            harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
            result = eval(formula, data_dict)
          File "<string>", line 1, in <module>
        NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")


Hi @Dhivya,

You can try SHORELine for non-shelled data, although it is usually meant for much denser DWI acquisitions, so I don't know how well it will fare for you.

Best,
Steven

Thanks, Steven! I tried SHORELine, and at least it is allowing the pipeline to run without errors for now. I will keep an eye on the results.