Module merge_dwis has no output called out_dwi_phase

Summary of what happened:

I am getting an error about a missing out_dwi_phase output, but I do not have phase images for my DWI data. I tried telling qsiprep to ignore phase (via --ignore phase), but the error persists. I also get an error about a tmp directory not being empty. I am new to both DWI image processing and qsiprep, so my ability to troubleshoot this is limited.
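
For reference, a quick check along these lines (pattern based on the BIDS part-phase entity; the session path is taken from the log below) finds no phase images in the dwi folder, consistent with how the data were acquired:

# Hypothetical check: list any phase DWI images for this session
find /mnt/share/Bipolar_MDD/Bipolar_MDD_Data/sub-23522001/ses-20230127s3T/dwi -name '*part-phase*'
# returns nothing; only the magnitude acq-Multi series is present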

My data does have three BIDS issues, which I am now fixing: (1) an error in our ASL file naming, (2) MRS acquisitions, which BIDS does not yet cover, and (3) a missing README file. However, I don't think any of these should cause this error. Help would be appreciated.
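
Until the naming is fixed, I believe a .bidsignore file at the dataset root along these lines would hide the ASL and MRS files flagged by the validator below (the patterns are my guess at what covers them):

# .bidsignore (dataset root) -- skip files not yet named to spec
sub-*/ses-*/asl/
sub-*/ses-*/mrs/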

Command used (and if a helper script was used, a link to the helper script or the command generated):

Bash script:

#!/bin/bash

# Run qsiprep with patch2self denoising, eddy, and SyN-based susceptibility distortion correction

qsiprep \
/mnt/share/Bipolar_MDD/Bipolar_MDD_Data /mnt/share/Bipolar_MDD/derivatives/qsiprep/00_preprocessed_dwi participant \
--ignore phase \
--skip_bids_validation \
--separate_all_dwis \
--participant-label 23522001 \
--anat-modality T1w \
--output-resolution 1.7 \
--denoise-method patch2self \
--unringing-method mrdegibbs \
--b1-biascorrect-stage final \
--denoise-after-combining \
--distortion-group-merge average \
--anatomical-template MNI152NLin2009cAsym \
--b0-to-t1w-transform Affine \
--intramodal-template-transform Affine \
--fs-license-file /mnt/share/Programs/FreeSurfer7.2/freesurfer/license.txt \
--b0-motion-corr-to iterative \
--hmc-transform Affine \
--hmc_model eddy \
--use-syn-sdc \
--verbose \
--work_dir /mnt/share/Bipolar_MDD/derivatives/qsiprep/temp 

HPC job script:

#!/bin/bash
#$ -pe smp 16
#$ -q PINC,UI
#$ -m bea
#$ -M gail-harmata@uiowa.edu
#$ -j y
#$ -o /Shared/MRRCdata/Bipolar_MDD/scripts/qsiprep//00_preprocessing/out
singularity exec -c -W /nfsscratch/Users/gharmata/ -B /Shared/MRRCdata:/mnt/share /Shared/MRRCdata/Programs/qsiprep/qsiprep-v0.21.4.sif /mnt/share/Bipolar_MDD/scripts/qsiprep//00_preprocessing/job_scripts/sub-23522001_ses-20230127s3T_0.sh

Version:

qsiprep v0.21.5.dev0+g36b93fe.d20240504

Environment (Docker, Singularity / Apptainer, custom installation):

I am using qsiprep in an Apptainer/Singularity container, which I built by pulling from docker://pennbbl/qsiprep:0.21.4.
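
If it helps, the image was built roughly like this (exact build options may have differed):

# Build the .sif image from the official Docker image
apptainer build qsiprep-v0.21.4.sif docker://pennbbl/qsiprep:0.21.4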

Data formatted according to a validatable standard? Please provide the output of the validator:

bids-validator@1.8.4

	1: [ERR] Files with such naming scheme are not part of BIDS specification. This error is most commonly caused by typos in file names that make them not BIDS compatible. Please consult the specification and make sure your files are named correctly. If this is not a file naming issue (for example when including files not yet covered by the BIDS specification) you should include a ".bidsignore" file in your dataset (see https://github.com/bids-standard/bids-validator#bidsignore for details). Please note that derived (processed) data should be placed in /derivatives folder and source data (such as DICOMS or behavioural logs in proprietary formats) should be placed in the /sourcedata folder. (code: 1 - NOT_INCLUDED)
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_CBF.json
			Evidence: sub-23522001_ses-20230127s3T_CBF.json
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_CBF.nii.gz
			Evidence: sub-23522001_ses-20230127s3T_CBF.nii.gz
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_asl.json
			Evidence: sub-23522001_ses-20230127s3T_asl.json
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_asl.nii.gz
			Evidence: sub-23522001_ses-20230127s3T_asl.nii.gz
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_asla.json
			Evidence: sub-23522001_ses-20230127s3T_asla.json
		./sub-23522001/ses-20230127s3T/asl/sub-23522001_ses-20230127s3T_asla.nii.gz
			Evidence: sub-23522001_ses-20230127s3T_asla.nii.gz
		./sub-23522001/ses-20230127s7T/mrs/sub-23522001_ses-20230127s7T_acq-sLASERSV30Vermis_H.json
			Evidence: sub-23522001_ses-20230127s7T_acq-sLASERSV30Vermis_H.json
		./sub-23522001/ses-20230127s7T/mrs/sub-23522001_ses-20230127s7T_acq-sLASERSV30Vermis_H.nii.gz
			Evidence: sub-23522001_ses-20230127s7T_acq-sLASERSV30Vermis_H.nii.gz

	Please visit https://neurostars.org/search?q=NOT_INCLUDED for existing conversations about this issue.

	1: [WARN] The recommended file /README is missing. See Section 03 (Modality agnostic files) of the BIDS specification. (code: 101 - README_FILE_MISSING)

	Please visit https://neurostars.org/search?q=README_FILE_MISSING for existing conversations about this issue.

        Summary:                    Available Tasks:        Available Modalities:
        5717 Files, 332.42GB                                MRI                   
        159 - Subjects                                                            
        307 - Sessions                                                            


	If you have any questions, please post on https://neurostars.org/tags/bids.
Making sure the input data is BIDS compliant (warnings can be ignored in most cases).

Relevant log outputs:

240703-16:06:45,407 nipype.workflow INFO:
	 Running with omp_nthreads=8, nthreads=56
240703-16:06:45,447 nipype.workflow IMPORTANT:
	 
    Running qsiprep version 0.21.5.dev0+g36b93fe.d20240504:
      * BIDS dataset path: /mnt/share/Bipolar_MDD/Bipolar_MDD_Data.
      * Participant list: ['23522001'].
      * Run identifier: 20240703-155819_1a1f0185-d6ec-4956-bc3a-1c7e2f804d08.
    
240703-16:08:55,333 nipype.workflow INFO:
	 Running nonlinear normalization to template
240703-16:09:09,952 nipype.workflow INFO:
	 [{'dwi_series': ['/mnt/share/Bipolar_MDD/Bipolar_MDD_Data/sub-23522001/ses-20230127s3T/dwi/sub-23522001_ses-20230127s3T_acq-Multi_dwi.nii.gz'], 'fieldmap_info': {'suffix': None}, 'dwi_series_pedir': 'j-', 'concatenated_bids_name': 'sub-23522001_ses-20230127s3T_acq-Multi'}]
240703-16:09:10,452 nipype.workflow IMPORTANT:
	 Creating dwi processing workflow "dwi_preproc_ses_20230127s3T_acq_Multi_wf" to produce output sub-23522001_ses-20230127s3T_acq-Multi (1.58 GB / 110 DWIs). Memory resampled/largemem=3.33/3.91 GB.
Process Process-2:
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/cli/run.py", line 1091, in build_qsiprep_workflow
    retval["workflow"] = init_qsiprep_wf(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/workflows/base.py", line 237, in init_qsiprep_wf
    single_subject_wf = init_single_subject_wf(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/workflows/base.py", line 748, in init_single_subject_wf
    dwi_preproc_wf = init_dwi_preproc_wf(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/workflows/dwi/base.py", line 363, in init_dwi_preproc_wf
    pre_hmc_wf = init_dwi_pre_hmc_wf(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/workflows/dwi/pre_hmc.py", line 320, in init_dwi_pre_hmc_wf
    merge_dwis = init_merge_and_denoise_wf(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsiprep/workflows/dwi/merge.py", line 345, in init_merge_and_denoise_wf
    workflow.connect([
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/nipype/pipeline/engine/workflows.py", line 239, in connect
    raise Exception("\n".join(["Some connections were not found"] + infostr))
Exception: Some connections were not found
Module merge_dwis has no output called out_dwi_phase

Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/managers.py", line 599, in _run_server
    server.serve_forever()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/managers.py", line 184, in serve_forever
    sys.exit(0)
SystemExit: 0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 300, in _run_finalizers
    finalizer()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 224, in __call__
    res = self._callback(*self._args, **self._kwargs)
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 133, in _remove_temp_dir
    rmtree(tempdir)
  File "/opt/conda/envs/qsiprep/lib/python3.10/shutil.py", line 725, in rmtree
    _rmtree_safe_fd(fd, path, onerror)
  File "/opt/conda/envs/qsiprep/lib/python3.10/shutil.py", line 681, in _rmtree_safe_fd
    onerror(os.unlink, fullname, sys.exc_info())
  File "/opt/conda/envs/qsiprep/lib/python3.10/shutil.py", line 679, in _rmtree_safe_fd
    os.unlink(entry.name, dir_fd=topfd)
OSError: [Errno 16] Device or resource busy: '.nfs00000000051647cb0000042c'
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 300, in _run_finalizers
    finalizer()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 224, in __call__
    res = self._callback(*self._args, **self._kwargs)
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/util.py", line 133, in _remove_temp_dir
    rmtree(tempdir)
  File "/opt/conda/envs/qsiprep/lib/python3.10/shutil.py", line 731, in rmtree
    onerror(os.rmdir, path, sys.exc_info())
  File "/opt/conda/envs/qsiprep/lib/python3.10/shutil.py", line 729, in rmtree
    os.rmdir(path)
OSError: [Errno 39] Directory not empty: '/tmp/pymp-w6bcbgw0'

Screenshots / relevant information:

I am using our institution's HPC, which runs SGE (specifically the Son of Grid Engine variant). I don't have root access to this Linux-based setup, which is why I went with an Apptainer/Singularity container.
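
For completeness, the HPC job script above is submitted with a plain qsub call, roughly as follows (the script path is a placeholder):

# Submit the SGE job script shown above
qsub <path_to_hpc_job_script>.sh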