No MNI to T1w transform - QSIRecon

Hi,

I am encountering the attached error from QSIRecon. It occurs for some subjects, while for others reconstruction runs successfully. I compared the anat directories output by QSIPrep (the input to recon) for one subject that succeeded and one that failed, and I cannot discern any differences in the files within them.

Can anyone advise on what is happening?

Thanks in advance.

Hi,

What was the command you used to run QSIPrep? And for QSIRecon? Did you run the anatomical workflow during preprocessing (that is, no --dwi-only flag)? What version of QSIPrep?

Best,
Steven

Hi,

I am using version 0.16.1 of QSIPrep.

The qsiprep command I used to run preprocessing is as follows:

singularity run --cleanenv -B /mnt/cina/WMH:/WMH ${qsiprep_img} ${BIDS_dir} ${output_dir} participant -w ${work_dir} --participant-label ${subj} --output-resolution 1.0 --fs_license_file ${FS_LICENSE} --skip-bids-validation --nthreads 16 --mem-mb 16000

The qsirecon command I used for reconstruction is as follows:

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-036S6179 --output-resolution 1.0 --recon-only --recon-spec amico_noddi --recon-spec dsi_studio_gqi --recon-input /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep --fs_license_file /WMH/lib/fs_license.txt --skip-bids-validation

Thanks in advance.

For your QSIRecon command, I would set the output dir and --recon-input to /WMH/data/BIDS/ADNI/derivatives/, unless the QSIPrep outputs are in a folder called derivatives/qsiprep/qsiprep.

Also, let's try having only one recon spec. You can combine the amico_noddi and dsi_studio_gqi workflows into a single recon spec by merging their JSONs, like so:

{ "description": "Runs the AMICO implementation of NODDI",
  "space": "T1w",
  "name": "amico_noddi",
  "atlases": ["schaefer100", "schaefer200", "schaefer400", "brainnetome246", "aicha384", "gordon333", "aal116"],
  "nodes": [
    {
      "name": "fit_noddi",
      "action": "fit_noddi",
      "software": "AMICO",
      "input": "qsiprep",
      "output_suffix": "NODDI",
      "parameters": {
        "isExvivo": false,
        "dPar": 1.7E-3,
        "dIso": 3.0E-3
      }
    {
      "name": "dsistudio_gqi",
      "software": "DSI Studio",
      "action": "reconstruction",
      "input": "qsiprep",
      "output_suffix": "gqi",
      "parameters": {"method": "gqi"}
    },
    {
      "name": "scalar_export",
      "software": "DSI Studio",
      "action": "export",
      "input": "dsistudio_gqi",
      "output_suffix": "gqiscalar"
    },
    {
      "name": "tractography",
      "software": "DSI Studio",
      "action": "tractography",
      "input": "dsistudio_gqi",
      "parameters": {
        "turning_angle": 35,
        "method": 0,
        "smoothing": 0.0,
        "step_size": 1.0,
        "min_length": 30,
        "max_length": 250,
        "seed_plan": 0,
        "interpolation": 0,
        "initial_dir": 2,
        "fiber_count": 5000000
      }
    },
    {
      "name": "streamline_connectivity",
      "software": "DSI Studio",
      "action": "connectivity",
      "input": "tractography",
      "output_suffix": "gqinetwork",
      "parameters": {
        "connectivity_value": "count,ncount,mean_length,gfa",
        "connectivity_type": "pass,end"
      }
    }
  ]
}

You can save that as a .json file, pass its path as your --recon-spec, and make sure it sits somewhere that is bound into the container (i.e., under /mnt/cina/WMH/).

So, after checking that QSIPrep ran successfully for these subjects (e.g., by inspecting the HTML reports), I would run this command for QSIRecon, using a fresh work directory:

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/ participant --participant_label sub-036S6179 --recon-only --recon-spec /PATH/TO/YOUR/RECON/SPEC.json --recon-input /WMH/data/BIDS/ADNI/derivatives/qsiprep/ --fs_license_file /WMH/lib/fs_license.txt --skip-bids-validation -w SPECIFY/WORK/DIR

Thank you for your reply.

I ran the above command after pointing --recon-spec to the .json file above. I am getting the error attached to this post.

My QSIPrep outputs are in /derivatives/qsiprep/qsiprep.

Hi,

You bound /mnt/cina/WMH to /WMH in the container, so QSIPrep will not find a file under /mnt/cina/WMH: inside the container, that directory appears as just /WMH. If you remove the /mnt/cina prefix from the recon-spec path, it should be found.
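
To make the mapping concrete (the spec filename here is just an example):

# host path:      /mnt/cina/WMH/lib/my_spec.json
# container path: /WMH/lib/my_spec.json
singularity run --cleanenv -B /mnt/cina/WMH:/WMH ${qsiprep_img} ... --recon-spec /WMH/lib/my_spec.json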

Additionally, it would help if you could post the full outputs for one subject, for example by running the tree terminal command on that subject's output folder.

Best,
Steven

Hi Steven,

Thank you for your help thus far.

I am still getting an error, likely due to the .json file. I copied your previous message into a .json file, passed that through the command, and removed the /mnt/cina prefix from the recon-spec path.

Ah, I think there’s a missing comma between the fit_noddi and dsistudio_gqi nodes.
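
That stretch of the spec needs to close both the parameters object and the fit_noddi node before the next node opens, i.e. something like:

      "parameters": {
        "isExvivo": false,
        "dPar": 1.7E-3,
        "dIso": 3.0E-3
      }
    },
    {
      "name": "dsistudio_gqi",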

Is this the edit you mean?

{ “description”: “Runs the AMICO implementation of NODDI”,
“space”: “T1w”,
“name”: “amico_noddi”,
“atlases”: [“schaefer100”, “schaefer200”, “schaefer400”, “brainnetome246”, “aicha384”, “gordon333”, “aal116”],
“nodes”: [
{
“name”: “fit_noddi”,
“action”: “fit_noddi”,
“software”: “AMICO”,
“input”: “qsiprep”,
“output_suffix”: “NODDI”,
“parameters”: {
“isExvivo”: false,
“dPar”: 1.7E-3,
“dIso”: 3.0E-3
},
{
“name”: “dsistudio_gqi”,
“software”: “DSI Studio”,
“action”: “reconstruction”,
“input”: “qsiprep”,
“output_suffix”: “gqi”,
“parameters”: {“method”: “gqi”}
},
{
“name”: “scalar_export”,
“software”: “DSI Studio”,
“action”: “export”,
“input”: “dsistudio_gqi”,
“output_suffix”: “gqiscalar”
},
{
“name”: “tractography”,
“software”: “DSI Studio”,
“action”: “tractography”,
“input”: “dsistudio_gqi”,
“parameters”: {
“turning_angle”: 35,

Yes, that was the missing comma.

Still getting the same error. It’s unable to read the .json.

Can you confirm the right kind of quotes are in your JSON? There may be unicode errors from copying and pasting. E.g., straight vs curly quotes.
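
A quick way to check is to run the file through a JSON parser, which will report the line and column of the first offending character (the path here is a placeholder for wherever you saved the spec):

python -m json.tool /WMH/lib/my_spec.json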

Hi Steven,

I was able to run the reconstruction without pointing QSIRecon to the .json. I think specifying the working directory is what resolved the issue.

Thanks again for your help with this.

I'm actually getting the exact same error if I try to use any recon-spec JSON that is not one of those shipped with QSIPrep.

I just copied mrtrix_singleshell_ss3t_noACT and changed only the tractography type and the atlas setup. It fails every time with the same error.

         [Node] Error on "qsirecon_wf.sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream.sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.get_atlases" (/mnt/munin/Song/Lab/Chris/LIFU/work2/qsirecon_wf/sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream/sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/get_atlases)
230130-15:56:20,538 nipype.workflow ERROR:
         Node get_atlases failed to run on host blade14.dhe.duke.edu.
230130-15:56:20,611 nipype.workflow ERROR:
         Saving crash info to /mnt/munin/Song/Lab/Chris/LIFU/derivatives/qsirecon/sub-0010/log/20230130-155448_d36c0411-e5cb-488f-be71-b4b83656734a/crash-20230130-155620-cmp12-get_atlases-65790b95-fb59-4bd8-9790-a63bec76e814.txt
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 344, in _send_procs_to_workers
    self.procs[jobid].run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node get_atlases.

Traceback:
        Traceback (most recent call last):
          File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 398, in run
            runtime = self._run_interface(runtime)
          File "/usr/local/miniconda/lib/python3.8/site-packages/qsiprep/interfaces/utils.py", line 51, in _run_interface
            raise Exception("No MNI to T1w transform found in anatomical directory")
        Exception: No MNI to T1w transform found in anatomical directory


230130-15:56:22,399 nipype.workflow INFO:
         [Node] Executing "resample_mask" <nipype.interfaces.afni.utils.Resample>
230130-15:56:22,507 nipype.workflow WARNING:
         [Node] Error on "qsirecon_wf.sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream.sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.resample_mask" (/mnt/munin/Song/Lab/Chris/LIFU/work2/qsirecon_wf/sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream/sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/resample_mask)
230130-15:56:22,568 nipype.workflow INFO:
         [Node] Executing "odf_rois" <nipype.interfaces.ants.resampling.ApplyTransforms>
exception calling callback for <Future at 0x2b03b7b68520 state=finished raised FileNotFoundError>
concurrent.futures.process._RemoteTraceback: 

Any idea how I can get a custom JSON to run?

[cmp12@blade09 LIFU]$ diff  --suppress-common-lines -y -W 200  mrtrix_singleshell_ss3t_noACT_sd_stream.json mrtrix_singleshell_ss3t_noACT.json 
  "name": "mrtrix_singleshell_ss3t_noACT_sd_stream",						   |	  "name": "mrtrix_singleshell_ss3t_noACT",
  "atlases": ["aal116"],									   |	  "atlases": ["schaefer100", "schaefer200", "schaefer400", "brainnetome246", "aicha384", "gordon
      "name": "track_sd_stream",								   |	      "name": "track_ifod2",
      "output_suffix": "sd_stream",								   |	      "output_suffix": "ifod2",
          "algorithm": "SD_Stream",								   |	          "algorithm": "iFOD2",
          "min_length": 30									   |	          "min_length": 30,
												   >	          "power":0.33,
												   >	          "quiet": true
      "input": "track_sd_stream",								   |	      "input": "track_ifod2",

I have all of our NFS shares bound into all Singularity runs by default, but here's my call.

${PACKDIR}/singularity/bin/singularity run --cleanenv -B $PACKDIR:/mnt/packages -B ${SINGULARITYENV_TEMPLATEFLOW_HOME}:/opt/templateflow \
 ${PACKDIR}/singularity/images/qsiprep_v0.16.1.sif \
 /mnt/munin/Song/Lab/Chris/LIFU/BIDS \
 /mnt/munin/Song/Lab/Chris/LIFU/derivatives \
 participant --participant-label ${SUBJ} \
 --nthreads 12 --omp-nthreads 12 --skip_bids_validation \
 --work-dir /mnt/munin/Song/Lab/Chris/LIFU/work2 \
 --output-resolution 2 \
 --fs-license-file /mnt/packages/freesurfer_v7.1.1/license.txt \
 --recon_spec /mnt/munin/Song/Lab/Chris/LIFU/mrtrix_singleshell_ss3t_noACT_sd_stream.json \
 --output-space T1w

I've tried every variation of --recon_spec I could think of.

Hi @cmpetty, and welcome to NeuroStars!

Could you provide the output of tree on one of your failing subjects' QSIPrep outputs?
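
In the meantime, you could also check directly whether the MNI-to-T1w transform QSIRecon looks for is present in the anat folder of your QSIPrep outputs (the filename pattern follows QSIPrep's standard naming; adjust the derivatives path to yours):

ls derivatives/qsiprep/sub-0010/anat/*from-MNI152NLin2009cAsym_to-T1w*_xfm.h5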

Best,
Steven

[cmp12@blade09 LIFU]$ tree derivatives/qsiprep/sub-0010
derivatives/qsiprep/sub-0010
├── anat
│   ├── sub-0010_desc-brain_mask.nii.gz
│   ├── sub-0010_desc-preproc_T1w.nii.gz
│   ├── sub-0010_dseg.nii.gz
│   ├── sub-0010_from-MNI152NLin2009cAsym_to-T1w_mode-image_xfm.h5
│   ├── sub-0010_from-T1w_to-MNI152NLin2009cAsym_mode-image_xfm.h5
│   ├── sub-0010_label-CSF_probseg.nii.gz
│   ├── sub-0010_label-GM_probseg.nii.gz
│   ├── sub-0010_label-WM_probseg.nii.gz
│   ├── sub-0010_space-MNI152NLin2009cAsym_desc-brain_mask.nii.gz
│   ├── sub-0010_space-MNI152NLin2009cAsym_desc-preproc_T1w.nii.gz
│   ├── sub-0010_space-MNI152NLin2009cAsym_dseg.nii.gz
│   ├── sub-0010_space-MNI152NLin2009cAsym_label-CSF_probseg.nii.gz
│   ├── sub-0010_space-MNI152NLin2009cAsym_label-GM_probseg.nii.gz
│   └── sub-0010_space-MNI152NLin2009cAsym_label-WM_probseg.nii.gz
├── figures
│   ├── sub-0010_seg_brainmask.svg
│   ├── sub-0010_ses-01_dir-AP_carpetplot.svg
│   ├── sub-0010_ses-01_dir-AP_coreg.svg
│   ├── sub-0010_ses-01_dir-AP_desc-resampled_b0ref.svg
│   ├── sub-0010_ses-01_dir-AP_dwi_denoise_ses_01_dir_AP_dwi_wf_biascorr.svg
│   ├── sub-0010_ses-01_dir-AP_dwi_denoise_ses_01_dir_AP_dwi_wf_denoising.svg
│   ├── sub-0010_ses-01_dir-AP_sampling_scheme.gif
│   ├── sub-0010_ses-02_dir-AP_carpetplot.svg
│   ├── sub-0010_ses-02_dir-AP_coreg.svg
│   ├── sub-0010_ses-02_dir-AP_desc-resampled_b0ref.svg
│   ├── sub-0010_ses-02_dir-AP_dwi_denoise_ses_02_dir_AP_dwi_wf_biascorr.svg
│   ├── sub-0010_ses-02_dir-AP_dwi_denoise_ses_02_dir_AP_dwi_wf_denoising.svg
│   ├── sub-0010_ses-02_dir-AP_sampling_scheme.gif
│   └── sub-0010_t1_2_mni.svg
├── ses-01
│   ├── anat
│   │   └── sub-0010_ses-01_from-orig_to-T1w_mode-image_xfm.txt
│   └── dwi
│       ├── adc.nii.gz
│       ├── fa.nii.gz
│       ├── stem2.nii.gz
│       ├── sub-0010_ses-01_dir-AP_confounds.tsv
│       ├── sub-0010_ses-01_dir-AP_desc-ImageQC_dwi.csv
│       ├── sub-0010_ses-01_dir-AP_desc-SliceQC_dwi.json
│       ├── sub-0010_ses-01_dir-AP_dwiqc.json
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-brain_mask.nii.gz
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-eddy_cnr.nii.gz
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-preproc_dwi.b
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-preproc_dwi.bval
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-preproc_dwi.bvec
│       ├── sub-0010_ses-01_dir-AP_space-T1w_desc-preproc_dwi.nii.gz
│       ├── sub-0010_ses-01_dir-AP_space-T1w_dwiref.nii.gz
│       ├── tensors.mif.gz
│       ├── vecs.mif
│       └── ventricles.nii.gz
└── ses-02
    ├── anat
    │   └── sub-0010_ses-02_from-orig_to-T1w_mode-image_xfm.txt
    └── dwi
        ├── sub-0010_ses-02_dir-AP_confounds.tsv
        ├── sub-0010_ses-02_dir-AP_desc-ImageQC_dwi.csv
        ├── sub-0010_ses-02_dir-AP_desc-SliceQC_dwi.json
        ├── sub-0010_ses-02_dir-AP_dwiqc.json
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-brain_mask.nii.gz
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-eddy_cnr.nii.gz
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-preproc_dwi.b
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-preproc_dwi.bval
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-preproc_dwi.bvec
        ├── sub-0010_ses-02_dir-AP_space-T1w_desc-preproc_dwi.nii.gz
        └── sub-0010_ses-02_dir-AP_space-T1w_dwiref.nii.gz

8 directories, 58 files

It looks to me like there are some issues with binding data directories. One easy way to get this running is to install the qsiprep-container package and use qsiprep-singularity, which will handle all the binding for you.
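
As a rough sketch (the wrapper is on PyPI; check qsiprep-singularity --help for the exact flags, including how to point it at a local .sif image):

pip install qsiprep-container
qsiprep-singularity /mnt/munin/Song/Lab/Chris/LIFU/BIDS /mnt/munin/Song/Lab/Chris/LIFU/derivatives participant ...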

OK, I installed the Python helper wrapper to see what it would give me. It said it would run the following:

#RUNNING: singularity run --cleanenv -B /usr/local/packages/freesurfer_v7.1.1:/mnt -B /mnt/munin/Song/Lab/Chris/LIFU/BIDS:/sngl/data -B :/sngl/spec -B /mnt/munin/Song/Lab/Chris/LIFU/derivatives:/sngl/out -B /mnt/munin/Song/Lab/Chris/LIFU/work2:/sngl/scratch /usr/local/packages/singularity/images/qsiprep_v0.16.1.sif /sngl/data /sngl/out participant --fs-license-file /mnt/license.txt --recon-spec /sngl/spec/mrtrix_singleshell_ss3t_noACT_sd_stream.json 0010 --output-resolution 2 --output-space T1w -w /sngl/scratch

So I updated my submission to the following:

${PACKDIR}/singularity/bin/singularity run --cleanenv -B $PACKDIR:/mnt/packages -B ${SINGULARITYENV_TEMPLATEFLOW_HOME}:/opt/templateflow \
 -B /mnt/munin/Song/Lab/Chris/LIFU/BIDS:/sngl/data \
 -B /mnt/munin/Song/Lab/Chris/LIFU:/sngl/spec \
 -B /mnt/munin/Song/Lab/Chris/LIFU/work2:/sngl/scratch \
 -B /mnt/munin/Song/Lab/Chris/LIFU/derivatives2:/sngl/out \
 ${PACKDIR}/singularity/images/qsiprep_v0.16.1.sif \
 /sngl/data \
 /sngl/out \
 participant --participant-label ${SUBJ} \
 --nthreads 12 --omp-nthreads 12 --skip_bids_validation \
 --work-dir /sngl/scratch \
 --output-resolution 2 \
 --fs-license-file /mnt/packages/freesurfer_v7.1.1/license.txt \
 --recon_spec mrtrix_singleshell_ss3t_noACT_sd_stream.json \
 --output-space T1w

I am still getting the exact same issue with transforms:

	 [Node] Error on "qsirecon_wf.sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream.sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.get_atlases" (/sngl/scratch/qsirecon_wf/sub-0010_mrtrix_singleshell_ss3t_noACT_sd_stream/sub_0010_ses_02_dir_AP_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/get_atlases)
230131-12:08:57,205 nipype.workflow ERROR:
	 Node get_atlases failed to run on host blade10.dhe.duke.edu.
230131-12:08:57,217 nipype.workflow ERROR:
	 Saving crash info to /sngl/out/qsirecon/sub-0010/log/20230131-120834_ea5dacef-5096-43b0-a2de-01704ab9c0dc/crash-20230131-120857-cmp12-get_atlases-62d4110b-ff9f-4d91-90c7-ea447a27d119.txt
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 344, in _send_procs_to_workers
    self.procs[jobid].run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node get_atlases.

Traceback:
	Traceback (most recent call last):
	  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 398, in run
	    runtime = self._run_interface(runtime)
	  File "/usr/local/miniconda/lib/python3.8/site-packages/qsiprep/interfaces/utils.py", line 51, in _run_interface
	    raise Exception("No MNI to T1w transform found in anatomical directory")
	Exception: No MNI to T1w transform found in anatomical directory

Is 0.16.1 now making some assumptions about Singularity and everything being bound under /sngl? In my Singularity environment I auto-bind all my NFS mounts so that they are at the same path inside the container as outside.

I am getting the same error with 0.16.1.
I am using -B $custom_atlases:/atlas/qsirecon_atlases to mount a directory that contains my custom atlas alongside the standard atlases, and -B $recon_spec_dir together with --recon_spec $recon_spec_dir/dsi_studio_gqi_with_GlasserTianS1.json to pass a .json file. That file is just a copy of the dsi_studio_gqi workflow with my atlas added to the atlases field.
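
Concretely, the relevant pieces of my call look like this (the image name, ${BIDS_dir}, ${output_dir}, and the elided arguments are placeholders):

singularity run --cleanenv \
 -B $custom_atlases:/atlas/qsirecon_atlases \
 -B $recon_spec_dir \
 qsiprep-0.16.1.simg ${BIDS_dir} ${output_dir} participant ... \
 --recon_spec $recon_spec_dir/dsi_studio_gqi_with_GlasserTianS1.json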

Did anything end up working out here?