QSIprep Reconstruction - want to make sure output files are correctly reconstructed

Summary of what happened:

Outputs from QSIprep reconstruction were derived, but there is no .html file output, and I want to make sure that the reconstructed files are fully processed. When viewed with fsleyes, the outputs look correct. However, there are no .html outputs, and there are crashes in calc_connectivity, create_src, get_atlases, plot_peaks, and tracking. I have posted the crash at plot_connectivity. Secondly, there is a consistent error that the preproc file is not found (which I have confirmed exists in the qsiprep preprocessing output).

Command used (and if a helper script was used, a link to the helper script or the command generated):

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-007S1222 --output-resolution 1.0 --recon-only --recon-spec amico_noddi --recon-only --recon-spec dsi_studio_gqi --recon-input /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep --fs_license_file /WMH/lib/fs_license.txt --skip-bids-validation -w /WMH/data/intermediates_test/qsiprep

Version:

qsiprep-0.16.1.simg

Environment (Docker, Singularity, custom installation):

Singularity

Data formatted according to a validatable standard? Please provide the output of the validator:

Relevant log outputs (up to 20 lines):

Screenshots / relevant information:



Hi,

A few things:

  1. In the future, could you please copy and paste the terminal error outputs into your post? Screenshots are harder to read.

  2. You specify two recon-specs, which I am not sure is allowed. If you want to run both of these workflows, you can either run two separate QSIRecon commands (see the sketch after this list) or combine the two pipeline JSONs into one file and pass the resulting JSON to the --recon-spec argument. (You can find the pipeline JSONs here: qsiprep/qsiprep/data/pipelines at master · PennLINC/qsiprep · GitHub)
  3. Is this error subject-specific or does it happen for everyone? I see some subjects have .html reports and others do not.
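
For reference, a rough sketch of what the two separate QSIRecon calls could look like, based on the command you posted (only the --recon-spec changes between them; double-check the paths against your setup):

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-007S1222 --output-resolution 1.0 --recon-only --recon-spec amico_noddi --recon-input /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep --fs_license_file /WMH/lib/fs_license.txt --skip-bids-validation -w /WMH/data/intermediates_test/qsiprep

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-007S1222 --output-resolution 1.0 --recon-only --recon-spec dsi_studio_gqi --recon-input /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep --fs_license_file /WMH/lib/fs_license.txt --skip-bids-validation -w /WMH/data/intermediates_test/qsiprep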

Yes, I will post the terminal error outputs going forward.

For the reconstruction - I was able to specify two recon specs and it did not crash for 61 subjects. It did crash for the others.

I will try running one recon-spec at a time. However, why would it work for some subjects and not for others?

Thanks in advance.

What does the error that relates to the not finding preprocessed files look like?

It is highlighted in white - sorry, I don't currently have the terminal output available to paste from.

That is not indicating that the preprocessed files were not found; it is something in the work directory not being found. Does this error persist with a fresh working directory, and are there errors that precede it?
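
One quick way to look for earlier failures is to list any nipype crash files, for example (paths taken from your command; I see both /WMH/data/intermediates_test/qsiprep and /WMH/data/intermediates/qsiprep in your post, so check whichever working directory you actually used):

find /WMH/data/BIDS/ADNI/derivatives/qsiprep -name "crash-*" | sort
find /WMH/data/intermediates_test/qsiprep -name "crash-*" | sort

The earliest crash file is usually the most informative one.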

Yes, below is the error copied from the terminal.

	 [Node] Error on "qsirecon_wf.sub-007S1222_amico_noddi.sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.resample_mask" (/WMH/data/intermediates/qsiprep/qsirecon_wf/sub-007S1222_amico_noddi/sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/resample_mask)
exception calling callback for <Future at 0x7f759c9eeb20 state=finished raised FileNotFoundError>
concurrent.futures.process._RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 722, in _run_command
    result = self._interface.run(cwd=outdir, ignore_exception=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 388, in run
    self._check_mandatory_inputs()
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 275, in _check_mandatory_inputs
    raise ValueError(msg)
ValueError: Resample requires a value for input 'in_file'. For a list of required inputs, see Resample.help()

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/concurrent/futures/process.py", line 239, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 70, in run_node
    result["result"] = node.result
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 223, in result
    return _load_resultfile(
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/utils.py", line 291, in load_resultfile
    raise FileNotFoundError(results_file)
FileNotFoundError: /WMH/data/intermediates/qsiprep/qsirecon_wf/sub-007S1222_amico_noddi/sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/resample_mask/result_resample_mask.pklz
"""
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
    callback(self)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 159, in _async_callback
    result = args.result()
  File "/usr/local/miniconda/lib/python3.8/concurrent/futures/_base.py", line 437, in result
    return self.__get_result()
  File "/usr/local/miniconda/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
FileNotFoundError: /WMH/data/intermediates/qsiprep/qsirecon_wf/sub-007S1222_amico_noddi/sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/resample_mask/result_resample_mask.pklz
221209-13:10:54,578 nipype.workflow INFO:
	 [Node] Executing "resample_mask" <nipype.interfaces.afni.utils.Resample>
221209-13:10:54,737 nipype.workflow INFO:
	 [Node] Executing "odf_rois" <nipype.interfaces.ants.resampling.ApplyTransforms>
221209-13:10:54,738 nipype.workflow WARNING:
	 [Node] Error on "qsirecon_wf.sub-007S1222_amico_noddi.sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.odf_rois" (/WMH/data/intermediates/qsiprep/qsirecon_wf/sub-007S1222_amico_noddi/sub_007S1222_ses_004_run_01_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/odf_rois)
exception calling callback for <Future at 0x7f759c9a99a0 state=finished raised FileNotFoundError>
concurrent.futures.process._RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 722, in _run_command
    result = self._interface.run(cwd=outdir, ignore_exception=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 388, in run
    self._check_mandatory_inputs()
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 275, in _check_mandatory_inputs
    raise ValueError(msg)
ValueError: ApplyTransforms requires a value for input 'transforms'. For a list of required inputs, see ApplyTransforms.help()

How did you run QSIPrep?

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-002S4213 --output-resolution 1.0 --fs_license_file /WMH/lib/fs_license.txt --freesurfer_input /WMH/data/BIDS/ADNI/derivatives/sourcedata/freesurfer --skip-bids-validation

Have you tried running preprocessing and reconstruction in the same step? I will also note that 1.0 mm is a pretty fine resolution, which makes your files larger and requires more computational power to process. What is the original resolution of your image? Does it warrant that level of resampling? At some point you get diminishing returns from resampling.
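
If you want to check the native voxel size quickly, something like fslinfo (or mrinfo from MRtrix) on the raw dwi will tell you. This is just a sketch assuming the usual BIDS layout; the session and run entities are placeholders, so adjust them to your actual filename:

fslinfo /WMH/data/BIDS/ADNI/sub-002S4213/ses-<session>/dwi/sub-002S4213_ses-<session>_run-01_dwi.nii.gz

The pixdim1-3 fields report the voxel size in mm.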

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-002S4213 --stop-on-first-crash --notrack --output-resolution 1.2 --fs_license_file /WMH/lib/fs_license.txt --freesurfer_input /WMH/data/BIDS/ADNI/derivatives/sourcedata/freesurfer --recon-spec dsi_studio_gqi --skip-bids-validation

Would I need to run them simultaneously? I have all subjects preprocessed using qsiprep’s preprocessing pipeline without error. So I don’t want to run it again.

Is there a drawback to not getting .html outputs? Will the subjects that are processed without an .html output have compromised outputs in your experience?

As long as you use the same working directory, most things should be skipped. Running these in the same command will also avoid any potential errors from QSIRecon not finding the QSIPrep outputs.
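
As a rough sketch, that would just be the command I posted above with -w added, pointing at whatever working directory you used for your preprocessing run (the -w value below is a placeholder):

singularity run --cleanenv -B /mnt/cina/WMH:/WMH /mnt/cina/WMH/lib/qsiprep-0.16.1.simg /WMH/data/BIDS/ADNI/ /WMH/data/BIDS/ADNI/derivatives/qsiprep participant --participant_label sub-002S4213 --stop-on-first-crash --notrack --output-resolution 1.2 --fs_license_file /WMH/lib/fs_license.txt --freesurfer_input /WMH/data/BIDS/ADNI/derivatives/sourcedata/freesurfer --recon-spec dsi_studio_gqi --skip-bids-validation -w <the working directory you used for preprocessing>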

Not getting the .html outputs indicates that there were errors, and looking at the HTML reports is important for quality control.

Hi Steven,

I am finding that qsirecon errors out with multi-session subjects.

This is the error message I am getting (as above):

Node: qsirecon_wf.sub-007S4272_dsistudio_pipeline.sub_007S4272_ses_006_run_01_space_T1w_desc_preproc_recon_wf.qsirecon_anat_wf.get_atlases
Working directory: /WMH/data/intermediates/qsiprep/qsirecon_wf/sub-007S4272_dsistudio_pipeline/sub_007S4272_ses_006_run_01_space_T1w_desc_preproc_recon_wf/qsirecon_anat_wf/get_atlases

Node inputs:

atlas_names = ['schaefer100', 'schaefer200', 'schaefer400', 'brainnetome246', 'aicha384', 'gordon333', 'aal116']
forward_transform = <undefined>
reference_image = /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep/sub-007S4272/ses-006/dwi/sub-007S4272_ses-006_run-01_space-T1w_desc-preproc_dwi.nii.gz
space = T1w

Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 344, in _send_procs_to_workers
    self.procs[jobid].run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node get_atlases.

Traceback:
      Traceback (most recent call last):
        File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/interfaces/base/core.py", line 398, in run
          runtime = self._run_interface(runtime)
        File "/usr/local/miniconda/lib/python3.8/site-packages/qsiprep/interfaces/utils.py", line 51, in _run_interface
          raise Exception("No MNI to T1w transform found in anatomical directory")
      Exception: No MNI to T1w transform found in anatomical directory

I have attached two screenshots: 1) showing the successfully preprocessed multi-session subject (8 sessions), and 2) showing that qsirecon successfully processed session 2 (which has dwi) but crashes on sessions 4, 6, and 8 (all of which also have dwi).


Can qsirecon take subjects with multiple dwi sessions?

The presence of an HTML report doesn't necessarily mean there were no errors. Are the contents of the QSIPrep and/or QSIRecon anat folder different between the sessions that crash and the ones that don't? And are you using the combined command I mentioned in my last answer, with a fresh working directory?
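
For example, you could list the QSIPrep anat folder and check specifically for the MNI-to-T1w transform that get_atlases is looking for. This is just a sketch using the subject from your traceback, assuming the anat outputs sit at the subject level; the exact transform filename varies, but it should contain "from-MNI" and "to-T1w":

ls /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep/sub-007S4272/anat/
ls /WMH/data/BIDS/ADNI/derivatives/qsiprep/qsiprep/sub-007S4272/anat/*from-MNI*to-T1w* 2>/dev/null

If that second listing comes back empty for the crashing sessions but not the working ones, that would explain the get_atlases error.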

Best,
Steven