QSIPrep error merge_dwis

Summary of what happened:

I am running QSIPrep version 1.0.1 via SLURM on a few studies. The pipeline fails with "Node merge_dwis failed to run on host" for one of the studies. Could you help me identify what might be wrong, or help me set the right parameters?

Command used (and if a helper script was used, a link to the helper script or the command generated):

Here is my code:

list=$1
# Pick the Nth subject label from the list file, where N is the SLURM array index
sub=$(awk "NR==${SLURM_ARRAY_TASK_ID}" "${list}")

singularity run --cleanenv --containall -e \
-B ${WORK_DIR}/:/work/ \
-B ${BIDS_DIR}:/data/ \
-B ${OUT_DIR}/:/out/ \
-B ${PIPELINE_DIR}:/indir/ \
/indir/qsiprep-1.0.1.sif \
/data/ /out/ participant \
--ignore fieldmaps t2w flair phase \
--fs-license-file /indir/license.txt \
--separate-all-dwis \
--participant-label ${sub} \
--distortion-group-merge none \
--anat-modality T1w \
--bids-filter-file /indir/Scripts/bids_filter.json \
--mem-mb 88000 \
--nthreads 4 \
--work-dir /work/ \
--hmc-model 3dSHORE \
--unringing-method rpg \
--output-resolution 2


Version:

qsiprep:1.0.1

Environment (Docker, Singularity / Apptainer, custom installation):

apptainer build ${PWD}/qsiprep-1.0.1.sif docker://pennlinc/qsiprep:1.0.1

Data formatted according to a validatable standard? Please provide the output of the validator:

Input data appears valid and follows BIDS structure.
example input:

tree Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/
Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/
├── ses-d0757
│   ├── anat
│   │   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.json
│   │   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_run-01_T1w.json
│   │   ├── sub-OAS30001_ses-d0757_run-01_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_run-02_T1w.json
│   │   ├── sub-OAS30001_ses-d0757_run-02_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_T2star.json
│   │   ├── sub-OAS30001_ses-d0757_T2star.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_T2w.json
│   │   └── sub-OAS30001_ses-d0757_T2w.nii.gz
│   ├── dwi
│   │   ├── sub-OAS30001_ses-d0757_dwi.bval
│   │   ├── sub-OAS30001_ses-d0757_dwi.bvec
│   │   ├── sub-OAS30001_ses-d0757_dwi.json
│   │   └── sub-OAS30001_ses-d0757_dwi.nii.gz
│   └── func
│       ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.json
│       ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.nii.gz
│       ├── sub-OAS30001_ses-d0757_task-rest_run-02_bold.json
│       └── sub-OAS30001_ses-d0757_task-rest_run-02_bold.nii.gz
├── ses-d2430
│   ├── anat
│   │   ├── sub-OAS30001_ses-d2430_acq-TSE_T2w.json
│   │   ├── sub-OAS30001_ses-d2430_acq-TSE_T2w.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_FLAIR.json
│   │   ├── sub-OAS30001_ses-d2430_FLAIR.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_T1w.json
│   │   ├── sub-OAS30001_ses-d2430_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_T2star.json
│   │   └── sub-OAS30001_ses-d2430_T2star.nii.gz
│   ├── dwi
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.bval
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.bvec
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-02_dwi.bval
│   │   ├── sub-OAS30001_ses-d2q430_run-02_dwi.bvec
│   │   ├── sub-OAS30001_ses-d2430_run-02_dwi.json
│   │   └── sub-OAS30001_ses-d2430_run-02_dwi.nii.gz
│   ├── fmap
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude1.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude1.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude2.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude2.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-01_phasediff.json
│   │   └── sub-OAS30001_ses-d2430_run-01_phasediff.nii.gz
│   └── func


Relevant log outputs (up to 20 lines):

        [Node] Executing "merge_dwis" <qsiprep.interfaces.dwi_merge.MergeDWIs>
250807-10:55:00,39 nipype.workflow INFO:
         [Node] Finished "merge_dwis", elapsed time 0.142085s.
250807-10:55:00,39 nipype.workflow WARNING:
         Storing result file without outputs
250807-10:55:00,41 nipype.workflow WARNING:
         [Node] Error on "qsiprep_1_0_wf.sub_OAS30001_ses_d0757_d2430_d3132_d3746_d4467_wf.dwi_preproc_ses_d3132_run_01_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis" (/work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_d2430_d3132_d3746_d4467_wf/dwi_preproc_ses_d3132_run_01_wf/pre_hmc_wf/merge_and_denoise_wf/merge_dwis)
250807-10:55:01,718 nipype.workflow ERROR:
         Node merge_dwis failed to run on host 2119fmn004.
250807-10:55:01,731 nipype.workflow ERROR:
         Saving crash info to /out/sub-OAS30001/log/20250807-104944_2a0cf0e8-37e4-48f9-ad79-64929fc4907a/crash-20250807-105501-oasis-merge_dwis-ce4feded-e445-4542-991a-f21cccacf02d.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
        Traceback (most recent call last):
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
            to_concat, b0_means, corrections = harmonize_b0s(
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
            harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
            result = eval(formula, data_dict)
          File "<string>", line 1, in <module>
        NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")
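For context, here is a minimal sketch (assumed logic, inferred from the traceback) of how a single NaN in a b=0 volume turns into the `img*nan` formula that nilearn's `math_img` rejects:

```python
import numpy as np

# Sketch of how the crash can arise: harmonize_b0s rescales each series by
# target_mean / series_b0_mean. One NaN voxel in a b=0 volume makes its mean
# NaN, so the formula passed to nilearn's math_img becomes 'img*nan'.
b0 = np.ones((4, 4, 4))
b0[0, 0, 0] = np.nan          # one bad voxel...
b0_mean = b0.mean()           # ...poisons the mean (nan)
correction = 1000.0 / b0_mean
formula = f"img*{correction:.32f}"
print(formula)                # -> img*nan
```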


Screenshots / relevant information:


Hi @Dhivya,

That file appears to have an incorrect session ID, unless that was a typo.

Can you confirm these images have b=0 volumes that look okay? First check the bval/bvec files, then visualize the b=0 volume(s) in a viewer to make sure they look reasonable and do not contain NaNs.
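A minimal sketch of such a NaN check (assumes numpy; in practice the 4D array and b-values would be loaded with `nibabel` and `np.loadtxt` from the files shown in the tree above):

```python
import numpy as np

# Sketch (not QSIPrep code): flag NaNs in the b=0 volumes of a DWI series.
# In practice the array would come from nibabel, e.g.
#   data = nib.load('sub-OAS30001_ses-d2430_run-01_dwi.nii.gz').get_fdata()
#   bvals = np.loadtxt('sub-OAS30001_ses-d2430_run-01_dwi.bval')
rng = np.random.default_rng(0)
data = rng.random((4, 4, 4, 6))           # stand-in for the 4D DWI array
data[0, 0, 0, 0] = np.nan                 # inject one NaN so the check fires
bvals = np.array([0, 1000, 1000, 0, 1000, 1000])

b0s = data[..., bvals == 0]               # select the b=0 volumes
n_nan = int(np.isnan(b0s).sum())
print(f"{n_nan} NaN voxel(s) across {b0s.shape[-1]} b=0 volume(s)")
```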

Best,
Steven

Thanks Steven. Sorry, that was a typo; the session ID is correct. The b=0 volumes look okay to me. Different sessions and runs have different b-values. For example, run-02 for ses-d2430 has b-values of `0 50 350 600 900 1150 100 400 700 950 1250 150 450 700 1000 1300 200 500 800 1050 1350 300 550 850 1100 1400`,

and run-01 for the same session has `0 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000`. Would that be a problem? This is very old data (OASIS).
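The difference between the two schemes is easy to see programmatically (a small sketch using the values quoted above; run-01 abridged to one b=0 followed by b=1000 volumes):

```python
import numpy as np

# Compare the two b-value schemes quoted above (values copied from the
# bval files; run-01 abridged to one b=0 plus 64 b=1000 volumes).
run2 = np.array([0, 50, 350, 600, 900, 1150, 100, 400, 700, 950, 1250,
                 150, 450, 700, 1000, 1300, 200, 500, 800, 1050, 1350,
                 300, 550, 850, 1100, 1400])
run1 = np.concatenate(([0], np.full(64, 1000)))

print("run-01 distinct b-values:", np.unique(run1))        # single-shell
print("run-02 distinct b-values:", len(np.unique(run2)))   # many shells
```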

This error happens when attempting to harmonize the signal values of the b=0 images. There are some strange things in your BIDS naming; can you check that the input is actually BIDS valid?

Here is the BIDS validator output for sub-OAS30001:

bids-validator@1.8.4

        1: [WARN] You should define 'SliceTiming' for this file. If you don't provide this information slice time correction will not be possible. 'Slice Timing' is the time at which each slice was acquired within each volume (frame) of the acquisition. Slice timing is not slice order -- rather, it is a list of times containing the time (in seconds) of each slice acquisition in relation to the beginning of volume acquisition. (code: 13 - SLICE_TIMING_NOT_DEFINED)
                ./sub-OAS30001/ses-d0757/func/sub-OAS30001_ses-d0757_task-rest_run-01_bold.nii.gz
                ./sub-OAS30001/ses-d0757/func/sub-OAS30001_ses-d0757_task-rest_run-02_bold.nii.gz

        Please visit https://neurostars.org/search?q=SLICE_TIMING_NOT_DEFINED for existing conversations about this issue.

        2: [WARN] The recommended file /README is missing. See Section 03 (Modality agnostic files) of the BIDS specification. (code: 101 - README_FILE_MISSING)

        Please visit https://neurostars.org/search?q=README_FILE_MISSING for existing conversations about this issue.

        3: [WARN] The Authors field of dataset_description.json should contain an array of fields - with one author per field. This was triggered because there are no authors, which will make DOI registration from dataset metadata impossible. (code: 113 - NO_AUTHORS)

        Please visit https://neurostars.org/search?q=NO_AUTHORS for existing conversations about this issue.

        Summary:                   Available Tasks:        Available Modalities:
        4607 Files, 22.52GB        rest                    MRI
        94 - Subjects
        219 - Sessions

250807-10:50:38,279 nipype.workflow INFO:
         Running nonlinear normalization to template
250807-10:50:38,305 nipype.workflow INFO:
         Grouping DWI scans
250807-10:50:38,308 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d2430_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d2430/dwi/sub-OAS30001_ses-d2430_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,309 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d3132_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d3132/dwi/sub-OAS30001_ses-d3132_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,311 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d4467_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d4467/dwi/sub-OAS30001_ses-d4467_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,312 nipype.workflow INFO:
         Found 3 groups of DWI series that can be corrected by eddy:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d2430_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d2430/dwi/sub-OAS30001_ses-d2430_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}},
  { 'concatenated_bids_name': 'sub-OAS30001_ses-d3132_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d3132/dwi/sub-OAS30001_ses-d3132_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}},
  { 'concatenated_bids_name': 'sub-OAS30001_ses-d4467_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d4467/dwi/sub-OAS30001_ses-d4467_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j',
    'fieldmap_info': {'suffix': None}}]

Hi @Dhivya,

Those acquisitions are very different from each other, and I wouldn't analyze them the same way. The second one, with all the b=1000 volumes, looks more typical and would work well with standard eddy processing, as opposed to SHORELine.
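Concretely, for the single-shell run the change to the command above might be along these lines (a hedged sketch; the `eddy` value for `--hmc-model` should be verified against the QSIPrep 1.0.1 documentation):

```shell
# Hypothetical adjustment, not confirmed in the thread: use FSL eddy for
# head-motion correction on the conventional single-shell acquisition,
# instead of the SHORELine/3dSHORE model.
--hmc-model eddy \
```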

We are also going to think about how to bypass that, but the error you are getting will be addressed in a software patch.

Best,
Steven

Thanks Steven. Instead of processing all runs, do you think processing just a single run would help?

Hi @Dhivya,

It might help to know more about your analysis goals.

Best,
Steven

NiChart's primary goal is to extract FA and MD values from all studies and build normative models based on these features (for example, age prediction using these features). So the primary focus is specifically on these tensor-derived measures.

@Dhivya, is there any consideration of longitudinal data structure, or is everything treated as cross-sectional?

I was hoping to process all sessions. Could you please clarify what you mean by 'longitudinal data structure' in the context of running QSIPrep?

In your statistical models, how are longitudinal data treated across sessions? Combining these different acquisitions across sessions and treating them the same under a single subject might confound estimates of subject-specific effects.

Right now, we are planning to incorporate cross-sectional data into NiChart, if that answers your question. For fMRI, we processed each session separately.

I suppose if you are treating these data cross-sectionally in your model, you could run the sessions separately.
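One way to run a single session or run with the existing command is through `--bids-filter-file`, which is already being passed. A hypothetical filter restricting processing to one session and run might look like the following (the exact query keys are assumptions and should be checked against the QSIPrep documentation):

```json
{
    "dwi": {
        "session": "d2430",
        "run": "01"
    }
}
```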

Feel free to try our unstable tag for our proposed fix to the NaN issue you experienced.

Thanks, I will try a new version and let you know how it goes.