QSIPrep error merge_dwis

Summary of what happened:

I am running QSIPrep version 1.0.1 using SLURM on a few studies. For one of the studies, the pipeline fails with "Node merge_dwis failed to run on host". Could you help me identify what might be wrong, or how to set the right parameters?

Command used (and if a helper script was used, a link to the helper script or the command generated):

Here is my code:

list=$1
sub=$(awk "NR==${SLURM_ARRAY_TASK_ID}" "${list}")

singularity run --cleanenv --containall \
-B ${WORK_DIR}/:/work/ \
-B ${BIDS_DIR}:/data/ \
-B ${OUT_DIR}/:/out/ \
-B ${PIPELINE_DIR}:/indir/ \
/indir/qsiprep-1.0.1.sif \
/data/ /out/ participant \
--ignore fieldmaps t2w flair phase \
--fs-license-file /indir/license.txt \
--separate-all-dwis \
--participant-label ${sub} \
--distortion-group-merge none \
--anat-modality T1w \
--bids-filter-file /indir/Scripts/bids_filter.json \
--mem-mb 88000 \
--nthreads 4 \
--work-dir /work/ \
--hmc-model 3dSHORE \
--unringing-method rpg \
--output-resolution 2


Version:

qsiprep:1.0.1

Environment (Docker, Singularity / Apptainer, custom installation):

apptainer build ${PWD}/qsiprep-1.0.1.sif docker://pennlinc/qsiprep:1.0.1

Data formatted according to a validatable standard? Please provide the output of the validator:

Input data appears valid and follows BIDS structure.
Example input:

tree Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/
Data/Init_Downloaded/scans/All/batch_01/sub-OAS30001/
├── ses-d0757
│   ├── anat
│   │   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.json
│   │   ├── sub-OAS30001_ses-d0757_acq-TSE_T2w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_run-01_T1w.json
│   │   ├── sub-OAS30001_ses-d0757_run-01_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_run-02_T1w.json
│   │   ├── sub-OAS30001_ses-d0757_run-02_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_T2star.json
│   │   ├── sub-OAS30001_ses-d0757_T2star.nii.gz
│   │   ├── sub-OAS30001_ses-d0757_T2w.json
│   │   └── sub-OAS30001_ses-d0757_T2w.nii.gz
│   ├── dwi
│   │   ├── sub-OAS30001_ses-d0757_dwi.bval
│   │   ├── sub-OAS30001_ses-d0757_dwi.bvec
│   │   ├── sub-OAS30001_ses-d0757_dwi.json
│   │   └── sub-OAS30001_ses-d0757_dwi.nii.gz
│   └── func
│       ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.json
│       ├── sub-OAS30001_ses-d0757_task-rest_run-01_bold.nii.gz
│       ├── sub-OAS30001_ses-d0757_task-rest_run-02_bold.json
│       └── sub-OAS30001_ses-d0757_task-rest_run-02_bold.nii.gz
├── ses-d2430
│   ├── anat
│   │   ├── sub-OAS30001_ses-d2430_acq-TSE_T2w.json
│   │   ├── sub-OAS30001_ses-d2430_acq-TSE_T2w.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_FLAIR.json
│   │   ├── sub-OAS30001_ses-d2430_FLAIR.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_T1w.json
│   │   ├── sub-OAS30001_ses-d2430_T1w.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_T2star.json
│   │   └── sub-OAS30001_ses-d2430_T2star.nii.gz
│   ├── dwi
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.bval
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.bvec
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_dwi.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-02_dwi.bval
│   │   ├── sub-OAS30001_ses-d2q430_run-02_dwi.bvec
│   │   ├── sub-OAS30001_ses-d2430_run-02_dwi.json
│   │   └── sub-OAS30001_ses-d2430_run-02_dwi.nii.gz
│   ├── fmap
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude1.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude1.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude2.json
│   │   ├── sub-OAS30001_ses-d2430_run-01_magnitude2.nii.gz
│   │   ├── sub-OAS30001_ses-d2430_run-01_phasediff.json
│   │   └── sub-OAS30001_ses-d2430_run-01_phasediff.nii.gz
│   └── func


Relevant log outputs (up to 20 lines):

        [Node] Executing "merge_dwis" <qsiprep.interfaces.dwi_merge.MergeDWIs>
250807-10:55:00,39 nipype.workflow INFO:
         [Node] Finished "merge_dwis", elapsed time 0.142085s.
250807-10:55:00,39 nipype.workflow WARNING:
         Storing result file without outputs
250807-10:55:00,41 nipype.workflow WARNING:
         [Node] Error on "qsiprep_1_0_wf.sub_OAS30001_ses_d0757_d2430_d3132_d3746_d4467_wf.dwi_preproc_ses_d3132_run_01_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis" (/work/qsiprep_1_0_wf/sub_OAS30001_ses_d0757_d2430_d3132_d3746_d4467_wf/dwi_preproc_ses_d3132_run_01_wf/pre_hmc_wf/merge_and_denoise_wf/merge_dwis)
250807-10:55:01,718 nipype.workflow ERROR:
         Node merge_dwis failed to run on host 2119fmn004.
250807-10:55:01,731 nipype.workflow ERROR:
         Saving crash info to /out/sub-OAS30001/log/20250807-104944_2a0cf0e8-37e4-48f9-ad79-64929fc4907a/crash-20250807-105501-oasis-merge_dwis-ce4feded-e445-4542-991a-f21cccacf02d.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
        Traceback (most recent call last):
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
            to_concat, b0_means, corrections = harmonize_b0s(
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
            harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
          File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
            result = eval(formula, data_dict)
          File "<string>", line 1, in <module>
        NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")


Screenshots / relevant information:



Hi @Dhivya,

That file appears to have an incorrect session ID, unless that was a typo.

Can you confirm these images have b=0 volumes that look okay? First by checking the bval/bvec files, and then by visualizing the b=0 volume(s) to make sure they do not contain NaNs?
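In case it helps, that first check is easy to script with nibabel/numpy. A minimal sketch, using one of the files from your tree (adjust the path; the threshold of 100 mirrors QSIPrep's default --b0-threshold):

import nibabel as nib
import numpy as np

bval_file = "sub-OAS30001/ses-d2430/dwi/sub-OAS30001_ses-d2430_run-01_dwi.bval"
dwi_file = bval_file.replace(".bval", ".nii.gz")

bvals = np.loadtxt(bval_file)
data = nib.load(dwi_file).get_fdata()

# Volumes treated as b=0 (QSIPrep's default --b0-threshold is 100)
b0_idx = np.flatnonzero(bvals < 100)
print(f"{b0_idx.size} b=0 volume(s) at indices {b0_idx}")

b0_data = data[..., b0_idx]
print("NaNs present:", np.isnan(b0_data).any())
print("mean b=0 intensity:", np.nanmean(b0_data))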

Best,
Steven

Thanks Steven. Sorry, that was a typo; the session ID is correct. The b=0 volumes look okay to me. Different sessions and runs have different b-values. For example, run-02 for ses-d2430 has b-values of '0 50 350 600 900 1150 100 400 700 950 1250 150 450 700 1000 1300 200 500 800 1050 1350 300 550 850 1100 1400' and run-01 for the same session has '0 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000'. Would that be a problem? This is very old data (OASIS).

This error is happening when attempting to harmonize the signal values of the b=0 images. There are some strange things in your BIDS naming; can you check that the input is actually BIDS valid?
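To make the failure mode concrete: the traceback shows harmonize_b0s computing per-series b=0 means and a correction factor for each series, which is then formatted into a formula string for nilearn's math_img. A simplified sketch of how a non-finite factor can arise (this is not QSIPrep's actual code; the numbers are made up):

import numpy as np

def correction_factor(series_b0_mean, reference_b0_mean):
    # numpy division yields inf/nan instead of raising on zero or NaN input
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.float64(reference_b0_mean) / np.float64(series_b0_mean)

# A zero or NaN b=0 mean (e.g. no usable b=0 volumes in one series) turns
# the formula handed to math_img into 'img*inf' or 'img*nan':
for bad_mean in (0.0, np.nan):
    factor = correction_factor(bad_mean, 450.0)
    print(f"img*{factor:.32f}")  # -> img*inf, then img*nan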

Here is the BIDS validator output:

sub-OAS30001
bids-validator@1.8.4

1: [WARN] You should define 'SliceTiming' for this file. If you don't provide this information slice time correction will not be possible. 'Slice Timing' is the time at which each slice was acquired within each volume (frame) of the acquisition. Slice timing is not slice order -- rather, it is a list of times containing the time (in seconds) of each slice acquisition in relation to the beginning of volume acquisition. (code: 13 - SLICE_TIMING_NOT_DEFINED)
        ./sub-OAS30001/ses-d0757/func/sub-OAS30001_ses-d0757_task-rest_run-01_bold.nii.gz
        ./sub-OAS30001/ses-d0757/func/sub-OAS30001_ses-d0757_task-rest_run-02_bold.nii.gz

Please visit https://neurostars.org/search?q=SLICE_TIMING_NOT_DEFINED for existing conversations about this issue.

2: [WARN] The recommended file /README is missing. See Section 03 (Modality agnostic files) of the BIDS specification. (code: 101 - README_FILE_MISSING)

Please visit https://neurostars.org/search?q=README_FILE_MISSING for existing conversations about this issue.

3: [WARN] The Authors field of dataset_description.json should contain an array of fields - with one author per field. This was triggered because there are no authors, which will make DOI registration from dataset metadata impossible. (code: 113 - NO_AUTHORS)

Please visit https://neurostars.org/search?q=NO_AUTHORS for existing conversations about this issue.

Summary:                   Available Tasks:        Available Modalities:
4607 Files, 22.52GB        rest                    MRI
94 - Subjects
219 - Sessions

250807-10:50:38,279 nipype.workflow INFO:
         Running nonlinear normalization to template
250807-10:50:38,305 nipype.workflow INFO:
         Grouping DWI scans
250807-10:50:38,308 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d2430_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d2430/dwi/sub-OAS30001_ses-d2430_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,309 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d3132_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d3132/dwi/sub-OAS30001_ses-d3132_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,311 nipype.workflow INFO:
         Found 1 groups of DWI series based on their warp spaces:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d4467_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d4467/dwi/sub-OAS30001_ses-d4467_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j',
    'fieldmap_info': {'suffix': None}}]
250807-10:50:38,312 nipype.workflow INFO:
         Found 3 groups of DWI series that can be corrected by eddy:
[ { 'concatenated_bids_name': 'sub-OAS30001_ses-d2430_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d2430/dwi/sub-OAS30001_ses-d2430_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}},
  { 'concatenated_bids_name': 'sub-OAS30001_ses-d3132_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d3132/dwi/sub-OAS30001_ses-d3132_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j-',
    'fieldmap_info': {'suffix': None}},
  { 'concatenated_bids_name': 'sub-OAS30001_ses-d4467_run-01',
    'dwi_series': ['/data/sub-OAS30001/ses-d4467/dwi/sub-OAS30001_ses-d4467_run-01_dwi.nii.gz'],
    'dwi_series_pedir': 'j',
    'fieldmap_info': {'suffix': None}}]

Hi @Dhivya,

Those acquisitions are very different from each other, and I wouldn't analyze them the same way. The second one, with all the b=1000 images, appears more typical and would work well with a standard eddy process, as opposed to SHORELine.

We are also going to think about how to bypass the bug you are getting in a software patch.

Best,
Steven

Thanks Steven. Instead of processing all runs, do you think processing just a single run would be helpful?

Hi @Dhivya,

It might help to know more about your analysis goals.

Best,
Steven

NiChart's primary goal is to extract FA and MD values from all studies and build normative models based on these features (for example, age prediction using these features). So the primary focus is specifically on these tensor-derived measures.

@Dhivya, is there any consideration of longitudinal data structure, or is everything treated as cross-sectional?

I was hoping to process all sessions. Could you please clarify what you mean by 'longitudinal data structure' in the context of running QSIPrep?

In your statistical models, how are longitudinal data treated across sessions? Combining these different acquisitions across sessions and treating them the same under a single subject might confound estimates of subject-specific effects.

Right now, we are planning to incorporate the data cross-sectionally into NiChart, if that answers your question. For fMRI, we treated sessions separately during processing.

I suppose if you are treating these data cross-sectionally in your model, you could run the sessions separately.

Feel free to try our unstable tag for our proposed fix to the NaN issue you experienced.

Thanks, I will try a new version and let you know how it goes.

Got a similar problem here with version 1.0.1.
The dataset is BIDS-compatible, and there are b=0 volumes in each of the files. The data were acquired keeping all parameters the same (including prescan parameters), so there should really be no difference except for the phase-encoding direction (AP and PA). The images do not contain NaNs.
bval for dir-AP is

0 0 0 0 0 0 0 0 0 0 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200

and bval for dir-PA is

2000 0 2000 2000 2000

While this DTI scheme admittedly feels a bit weird, there are definitely b=0 volumes in each file, so I would guess QSIPrep should manage to find them.

I tried the proposed solution, namely using the unstable tag, and got a similar (though different) error:

Traceback:
	Traceback (most recent call last):
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
	    runtime = self._run_interface(runtime)
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
	    to_concat, b0_means, corrections = harmonize_b0s(
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 717, in harmonize_b0s
	    harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
	    result = eval(formula, data_dict)
	  File "<string>", line 1, in <module>
	NameError: ("Input formula couldn't be processed, you provided 'img*inf',", "name 'inf' is not defined"). Did you mean: 'int'?

It feels like the inf here should maybe be replaced by np.inf?

Any guess?
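One note on the np.inf idea: nilearn's math_img does expose numpy as np inside the formula, so 'img*np.inf' would evaluate, but it would only produce an all-inf image; the bare 'inf'/'nan' tokens that the f-string produces are undefined names in the eval namespace, hence the NameError. A small demonstration on a synthetic image (not a fix, since the real problem is the non-finite correction factor upstream):

import nibabel as nib
import numpy as np
from nilearn.image import math_img

img = nib.Nifti1Image(np.ones((2, 2, 2), dtype=np.float32), np.eye(4))

print(math_img("img*2.0", img=img).get_fdata().max())  # fine: 2.0

try:
    math_img("img*inf", img=img)  # what the pipeline builds
except Exception as e:
    print(e)  # name 'inf' is not defined

out = math_img("img*np.inf", img=img)  # evaluates, but every voxel is inf
print(np.isinf(out.get_fdata()).all())  # True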

Summary of what happened:

I am running QSIPrep on several studies. For one of my studies, a DTI dataset with 25 directions, the pipeline fails with the error: "Node merge_dwis failed to run on host." Could you help me identify what might be causing this issue, or advise on the correct parameters to use? Thank you for your assistance.

I would also like to mention that I have another similar study with a DTI dataset using 27 directions (a single run), and QSIPrep processes that dataset without any issues. The problem only occurs with the study that has a single DWI run and 25 directions.

b-values:
0 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000 1000
b-vectors:

0 -1 -0.849 0.108 -0.884 0.003 0.868 -0.799 0.162 -0.866 0.212 -0.068 -0.55 0.435 0.599 0.525 0.6 -0.653 -0.207 0.413 0.436 -0.462 0.503 -0.824 -0.297 0.04
0 0 0.528 0.565 -0.345 -0.736 -0.238 0.37 0.987 -0.129 -0.936 -0.892 -0.544 -0.422 0.78 0.03 -0.688 -0.06 -0.076 -0.699 0.822 0.874 0.488 -0.53 0.349 0.318
0 0 0 0.818 -0.315 0.677 0.436 0.475 0 0.483 0.281 -0.446 -0.634 0.795 0.182 -0.851 0.409 -0.755 -0.975 -0.584 -0.366 0.148 -0.713 0.202 0.889 -0.947
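A quick sanity check of this gradient table (filenames hypothetical) can rule out a missing or mis-thresholded b=0 volume, since the merge step computes a per-series b=0 mean per the tracebacks above; the threshold of 10 matches the --b0-threshold in the command below:

import numpy as np

bvals = np.loadtxt("sub-XX_acq-dti25_dwi.bval")  # hypothetical filename
bvecs = np.loadtxt("sub-XX_acq-dti25_dwi.bvec")  # shape (3, N)

print("volumes:", bvals.size, "| b=0 count:", int(np.sum(bvals < 10)))

# Diffusion-weighted directions should be approximately unit length;
# b=0 entries are allowed to be zero vectors.
norms = np.linalg.norm(bvecs, axis=0)
dw = bvals >= 10
print("non-unit DW vectors:", int(np.sum(np.abs(norms[dw] - 1) > 0.01)))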

Command used (and if a helper script was used, a link to the helper script or the command generated):

# run inside a loop that sets $subj for each participant
(
    docker run --rm -m 50GB \
        -v "$INPUT_DIR:/input" \
        -v "$WORK_DIR:/work" \
        -v "$OUTPUT_DIR:/output" \
        pennlinc/qsiprep:latest \
        /input /output participant \
        --participant-label "$subj" \
        --output-resolution 2 \
        --b0-threshold 10 \
        --hmc-model eddy \
        --skip-bids-validation \
        --anatomical-template MNI152NLin2009cAsym \
    > "$LOG_DIR/${subj}.log" 2>&1
) && echo "[$current/$total_subjects] Subject $subj processed successfully - $(date)" || echo "[$current/$total_subjects] Subject $subj processing failed - $(date)"
done

echo "All subjects processed: $(date)"

Version:

QSIPrep v1.0.1

Environment (Docker, Singularity / Apptainer, custom installation):

Docker

Data formatted according to a validatable standard? Please provide the output of the validator:

(Validator output was attached as a screenshot.)

Relevant log outputs (up to 20 lines):

**First Error Part:**
	 [Node] Finished "denoising_confounds", elapsed time 0.000268s.
251109-08:50:41,649 nipype.workflow INFO:
	 [Node] Setting-up "qsiprep_1_0_wf.sub_KW73020170617_wf.dwi_preproc_acq_dti25_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis" in "/tmp/work/qsiprep_1_0_wf/sub_KW73020170617_wf/dwi_preproc_acq_dti25_wf/pre_hmc_wf/merge_and_denoise_wf/merge_dwis".
251109-08:50:41,656 nipype.workflow INFO:
	 [Node] Executing "merge_dwis" <qsiprep.interfaces.dwi_merge.MergeDWIs>
251109-08:50:42,924 nipype.workflow INFO:
	 [Node] Finished "merge_dwis", elapsed time 1.2482630000000001s.
251109-08:50:42,924 nipype.workflow WARNING:
	 Storing result file without outputs
251109-08:50:42,925 nipype.workflow WARNING:
	 [Node] Error on "qsiprep_1_0_wf.sub_KW73020170617_wf.dwi_preproc_acq_dti25_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis" (/tmp/work/qsiprep_1_0_wf/sub_KW73020170617_wf/dwi_preproc_acq_dti25_wf/pre_hmc_wf/merge_and_denoise_wf/merge_dwis)
251109-08:50:44,873 nipype.workflow ERROR:
	 Node merge_dwis failed to run on host 315d6148e1fc.
251109-08:50:44,887 nipype.workflow ERROR:
	 Saving crash info to /output/sub-KW73020170617/log/20251109-084902_5f03d30c-0c8d-41af-b15c-42fa78c2fc87/crash-20251109-085044-root-merge_dwis-99b43d5b-8880-4858-ae2b-abb370b392e6.txt
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
	Traceback (most recent call last):
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
	    runtime = self._run_interface(runtime)
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
	    to_concat, b0_means, corrections = harmonize_b0s(
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
	    harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
	    result = eval(formula, data_dict)
	  File "<string>", line 1, in <module>
	NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")

**Second Error Part:**
251109-09:39:20,282 nipype.workflow INFO:
	 [Node] Finished "anat_nlin_normalization", elapsed time 2528.112167s.
251109-09:39:22,934 nipype.workflow ERROR:
	 could not run node: qsiprep_1_0_wf.sub_KW73020170617_wf.dwi_preproc_acq_dti25_wf.pre_hmc_wf.merge_and_denoise_wf.merge_dwis
251109-09:39:23,41 nipype.workflow CRITICAL:
	 QSIPrep failed: Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node merge_dwis.

Traceback:
	Traceback (most recent call last):
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
	    runtime = self._run_interface(runtime)
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 76, in _run_interface
	    to_concat, b0_means, corrections = harmonize_b0s(
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/interfaces/dwi_merge.py", line 716, in harmonize_b0s
	    harmonized_niis.append(math_img(f'img*{correction:.32f}', img=nii_img))
	  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nilearn/image/image.py", line 1058, in math_img
	    result = eval(formula, data_dict)
	  File "<string>", line 1, in <module>
	NameError: ("Input formula couldn't be processed, you provided 'img*nan',", "name 'nan' is not defined")

Screenshots / relevant information:


@HanyuShao @arovai, @mattcieslak has seen this issue arise when a reverse phase-encoded run that should more properly be organized in the fmap folder is instead placed in the dwi folder. Instead of organizing your data like this:

sub-<label>/
    ses-<label>/
        dwi/
            sub-<label>_ses-<label>_dir-AP_dwi.nii.gz
            sub-<label>_ses-<label>_dir-AP_dwi.bval
            sub-<label>_ses-<label>_dir-AP_dwi.bvec
            sub-<label>_ses-<label>_dir-AP_dwi.json
            sub-<label>_ses-<label>_dir-PA_dwi.nii.gz
            sub-<label>_ses-<label>_dir-PA_dwi.bval
            sub-<label>_ses-<label>_dir-PA_dwi.bvec
            sub-<label>_ses-<label>_dir-PA_dwi.json

Please try organizing it like this:

sub-<label>/
    ses-<label>/
        dwi/
            sub-<label>_ses-<label>_dir-AP_dwi.nii.gz
            sub-<label>_ses-<label>_dir-AP_dwi.bval
            sub-<label>_ses-<label>_dir-AP_dwi.bvec
            sub-<label>_ses-<label>_dir-AP_dwi.json
        fmap/
            sub-<label>_ses-<label>_dir-PA_epi.nii.gz
            sub-<label>_ses-<label>_dir-PA_epi.bval
            sub-<label>_ses-<label>_dir-PA_epi.bvec
            sub-<label>_ses-<label>_dir-PA_epi.json  # Add IntendedFor field here
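If it helps, a minimal Python sketch for filling in the IntendedFor field after moving the file (subject/session labels are placeholders; per BIDS, IntendedFor paths are given relative to the subject directory):

import json
from pathlib import Path

epi_json = Path("sub-01/ses-01/fmap/sub-01_ses-01_dir-PA_epi.json")
meta = json.loads(epi_json.read_text())

# Point the fieldmap at the DWI series it is meant to correct
meta["IntendedFor"] = ["ses-01/dwi/sub-01_ses-01_dir-AP_dwi.nii.gz"]

epi_json.write_text(json.dumps(meta, indent=2))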

Thank you for your response and suggestions.

In my case, I can confirm that this issue is not caused by data organization or formatting problems.

To clarify, for the DTI dataset with 25 directions, there is no image available for distortion correction: there is only a single DWI run, and no reverse phase-encoded images or fieldmaps are present. What puzzles me is that I have another similar dataset with 27 directions (also a single DWI run, without additional correction images), and QSIPrep processes that dataset without any issues. The problem only occurs with the 25-direction dataset.

Given this, I would like to understand why QSIPrep is unable to process the 25-direction dataset while the 27-direction dataset works fine under seemingly identical conditions. Could you please advise on what checks I should perform, or on what might be causing this discrepancy? Any guidance on troubleshooting steps or parameter adjustments would be greatly appreciated.
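For example, a side-by-side check of the failing and working series, since the traceback points at a non-finite b=0 mean, would look something like this (paths are placeholders):

import nibabel as nib
import numpy as np

for path in ["sub-XX_acq-dti25_dwi.nii.gz",   # fails
             "sub-XX_acq-dti27_dwi.nii.gz"]:  # works
    bvals = np.loadtxt(path.replace(".nii.gz", ".bval"))
    data = nib.load(path).get_fdata()
    b0 = data[..., np.flatnonzero(bvals < 10)]
    print(path, "| NaNs:", np.isnan(b0).any(), "| b=0 mean:", np.nanmean(b0))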

Thank you very much for your assistance.