Error in xcp_d: error in load_atlases_wf_copy_atlas

Summary of what happened:

Hello,

I am running an xcp_d post-processing pipeline on resting-state scans that were preprocessed with fmriprep by a colleague last year.

About half of the participants run with no errors, but the other half keep failing with the same error (see below).

Perhaps this is a leftover error from the fmriprep preprocessing, but then surely it would affect all participants?

Any support would be really helpful :slight_smile:

Command used (and if a helper script was used, a link to the helper script or the command generated):

docker run --rm -it \
-v /mnt/mpathymri/NAC_RS/fmriprep/prep_derivatives/fmriprep:/fmriprep:ro \
-v /mnt/mpathymri/mpathy_mri_docker/xcptmp/nac_neuro:/work:rw \
-v /mnt/mpathymri/mpathy_mri_docker/xcp_d/derivatives:/out:rw \
-v /mnt/mpathymri/mpathy_mri_docker/xcp_d/prep_derivates/freesurfer:/freesurfer:ro \
-v /mnt/mpathymri/mpathy_mri_docker/ref_files/freesurfer.txt:/license.txt:ro \
pennlinc/xcp_d:latest /fmriprep /out participant \
--mode none \
--task-id restingstate \
--participant_label 027 \
--despike \
--head_radius 50 \
-w /work \
--smoothing 6 \
-f 0.5 \
--fs-license-file /license.txt \
--abcc-qc n \
--combine-runs n \
--input-type fmriprep \
--file-format nifti \
--linc-qc n \
--motion-filter-type none \
--output-type censored \
--nuisance-regressors 36P \
--warp-surfaces-native2std n \
--min-coverage 0.5 \
--create-matrices all

Version:

0.10.6

Environment (Docker, Singularity / Apptainer, custom installation):

Docker

Data formatted according to a validatable standard? Please provide the output of the validator:

PASTE VALIDATOR OUTPUT HERE

Relevant log outputs (up to 20 lines):

250305-02:39:01,4 nipype.workflow ERROR:
         Node _copy_atlas13 failed to run on host e2c638333ee6.
250305-02:39:01,15 nipype.workflow ERROR:
         Saving crash info to /out/sub-026/log/20250305-023034_61711f6d-7917-4fac-8d90-725d85fd867c/crash-20250305-023901-root-_copy_atlas13-67a34162-2c6d-4921-8cf3-d92abd9bad25.txt
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 66, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 525, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 769, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _copy_atlas13.

Traceback:
        Traceback (most recent call last):
          File "/usr/local/miniconda/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 401, in run
            runtime = self._run_interface(runtime)
          File "/usr/local/miniconda/lib/python3.10/site-packages/xcp_d/interfaces/bids.py", line 285, in _run_interface
            raise ValueError(
        ValueError: Existing '4S956Parcels' atlas affine (/out/atlases/atlas-4S956Parcels/atlas-4S956Parcels_space-MNI152NLin2009cAsym_dseg.nii.gz) is different from the input file affine (/work/xcp_d_0_10_wf/sub_026_wf/load_atlases_wf/warp_atlases_to_bold_space/mapflow/_warp_atlases_to_bold_space13/atlas-4S956Parcels_space-MNI152NLin6Asym_res-01_dseg_trans.nii.gz).


250305-02:39:02,938 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,947 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,952 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,956 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,960 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,964 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,968 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,970 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,973 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,976 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,980 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,983 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,986 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:02,989 nipype.workflow ERROR:
         could not run node: xcp_d_0_10_wf.sub_026_wf.load_atlases_wf.copy_atlas
250305-02:39:03,303 nipype.workflow CRITICAL:
         XCP-D failed: 14 raised. Re-raising first.

Screenshots / relevant information:


Hi @etow6030 and welcome to neurostars!

In the future, please do not delete the software support post category template. You can see I edited it back into your post this time.

Additional information that would help us debug would be:

  1. Your fmriprep invocation details (version, command, whether all subjects ran successfully)
  2. Confirmation that the fmriprep command was the same for failing and passing subjects
  3. tree outputs showing the fmriprep outputs for a failing and a passing subject
  4. Whether the error persists when using a fresh working directory

Best,
Steven

This error typically comes up when people run XCP-D on native-resolution volumetric BOLD data. Basically, XCP-D tries to warp the atlases to the same space and resolution as the BOLD data, but if the BOLD scans have different resolutions (typically because there are small variations in the acquisition parameters across the dataset and the user ran fMRIPrep without specifying a resolution in their output spaces), then XCP-D will raise an error. You can see a bit of a summary of why XCP-D raises an error instead of doing something more elegant in Processing native-resolution runs with different voxel sizes will fail · Issue #1069 · PennLINC/xcp_d · GitHub.
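The failing check is essentially an affine comparison: each atlas warped onto a subject's BOLD grid must match the copy already saved under /out/atlases. A minimal sketch of that comparison, using hypothetical affines (in practice you would load each subject's *_desc-preproc_bold.nii.gz with nibabel and compare img.affine across subjects to spot the odd ones out):

```python
import numpy as np

def affines_match(aff_a, aff_b, atol=1e-4):
    """True if two 4x4 NIfTI affines describe the same voxel grid."""
    return bool(np.allclose(aff_a, aff_b, atol=atol))

# Hypothetical grids: one run at 2.0 mm isotropic, another at 2.2 mm.
run_a = np.diag([2.0, 2.0, 2.0, 1.0])
run_b = np.diag([2.2, 2.2, 2.2, 1.0])

print(affines_match(run_a, run_a))  # True  -> atlas copy succeeds
print(affines_match(run_a, run_b))  # False -> the ValueError from copy_atlas
```

Subjects whose BOLD affine differs from the first subject processed are the ones that will fail with this error.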

Hi Steven,

Sorry to be a luddite, but I cannot figure out how to edit the original post.

Data formatted according to a validatable standard? Please provide the output of the validator:

BIDS

[etow6030@login2 sub-022]$ tree
.
├── anat
│   ├── sub-022_T1w.json
│   └── sub-022_T1w.nii.gz
├── fmap
│   ├── sub-022_dir-AP_epi.json
│   ├── sub-022_dir-AP_epi.nii.gz
│   ├── sub-022_dir-PA_epi.json
│   └── sub-022_dir-PA_epi.nii.gz
└── func
    ├── sub-022_task-restingstate_bold.json
    └── sub-022_task-restingstate_bold.nii.gz

fmriprep invocation details (version, command, whether all subjects were successful)

  • fMRIPrep version: 20.2.7

Command


# Working code: extract the subject number to use as the participant label
for pp in /scratch/AlcNeuro/MPATHY_MRI/fmriprep/bids/sub-???; do
  number="${pp#*-}"
  echo "$number"
  jobname=$(qsub -N "$number" fmriprep_singularity_MPATHY_RS_ET_V0.pbs)
  jobid=$(echo "$jobname" | cut -d "." -f1)
done

#Set up directories
root_path=/scratch/AlcNeuro/MPATHY_MRI/fmriprep/
cd $root_path
data_path=${root_path}bids
output_path=${root_path}prep_derivatives
ref_path=${root_path}Ref_Files/
#mriqc_path=${ref_path}ro_mriqc.simg
fmriprep_img=${ref_path}fmriprep-20.2.7.simg
freesurfer_file=${ref_path}freesurfer.txt
freesurfer_output=${output_path}/freesurfer

echo "Data input ${data_path}"
echo "Data output ${output_path}"

echo "singularity run --cleanenv -B ${root_path} $fmriprep_img ${data_path} ${output_path} participant --participant_label $PBS_JOBNAME --fs-license-file ${freesurfer_file} \
--bold2t1w-dof 6 --force-bbr --dummy-scans 10  --skull-strip-t1w auto --n-cpus 8 --cifti-output" 

singularity run --cleanenv -B ${root_path} $fmriprep_img ${data_path} ${output_path} participant --participant_label $PBS_JOBNAME --fs-license-file ${freesurfer_file} \
--bold2t1w-dof 6 --force-bbr --dummy-scans 10 --skull-strip-t1w auto --n-cpus 8 --cifti-output
 # --fs-subjects-dir ${freesurfer_output}

All subjects were successful in fmriprep; there were no errors logged in the output .html files.

I have also tried re-running fmriprep, and the problem persists.

Provide tree outputs showing fmriprep outputs for a failing and passing subject


[etow6030@login2 sub-022]$ tree
.
├── anat
│   ├── sub-022_desc-aparcaseg_dseg.nii.gz
│   ├── sub-022_desc-aseg_dseg.nii.gz
│   ├── sub-022_desc-brain_mask.json
│   ├── sub-022_desc-brain_mask.nii.gz
│   ├── sub-022_desc-preproc_T1w.json
│   ├── sub-022_desc-preproc_T1w.nii.gz
│   ├── sub-022_dseg.nii.gz
│   ├── sub-022_from-fsnative_to-T1w_mode-image_xfm.txt
│   ├── sub-022_from-MNI152NLin2009cAsym_to-T1w_mode-image_xfm.h5
│   ├── sub-022_from-MNI152NLin6Asym_to-T1w_mode-image_xfm.h5
│   ├── sub-022_from-T1w_to-fsnative_mode-image_xfm.txt
│   ├── sub-022_from-T1w_to-MNI152NLin2009cAsym_mode-image_xfm.h5
│   ├── sub-022_from-T1w_to-MNI152NLin6Asym_mode-image_xfm.h5
│   ├── sub-022_hemi-L_inflated.surf.gii
│   ├── sub-022_hemi-L_midthickness.surf.gii
│   ├── sub-022_hemi-L_pial.surf.gii
│   ├── sub-022_hemi-L_smoothwm.surf.gii
│   ├── sub-022_hemi-R_inflated.surf.gii
│   ├── sub-022_hemi-R_midthickness.surf.gii
│   ├── sub-022_hemi-R_pial.surf.gii
│   ├── sub-022_hemi-R_smoothwm.surf.gii
│   ├── sub-022_label-CSF_probseg.nii.gz
│   ├── sub-022_label-GM_probseg.nii.gz
│   ├── sub-022_label-WM_probseg.nii.gz
│   ├── sub-022_space-MNI152NLin2009cAsym_desc-brain_mask.json
│   ├── sub-022_space-MNI152NLin2009cAsym_desc-brain_mask.nii.gz
│   ├── sub-022_space-MNI152NLin2009cAsym_desc-preproc_T1w.json
│   ├── sub-022_space-MNI152NLin2009cAsym_desc-preproc_T1w.nii.gz
│   ├── sub-022_space-MNI152NLin2009cAsym_dseg.nii.gz
│   ├── sub-022_space-MNI152NLin2009cAsym_label-CSF_probseg.nii.gz
│   ├── sub-022_space-MNI152NLin2009cAsym_label-GM_probseg.nii.gz
│   └── sub-022_space-MNI152NLin2009cAsym_label-WM_probseg.nii.gz
├── figures
│   ├── sub-022_desc-about_T1w.html
│   ├── sub-022_desc-conform_T1w.html
│   ├── sub-022_desc-reconall_T1w.svg
│   ├── sub-022_desc-summary_T1w.html
│   ├── sub-022_dseg.svg
│   ├── sub-022_space-MNI152NLin2009cAsym_T1w.svg
│   ├── sub-022_space-MNI152NLin6Asym_T1w.svg
│   ├── sub-022_task-restingstate_desc-bbregister_bold.svg
│   ├── sub-022_task-restingstate_desc-carpetplot_bold.svg
│   ├── sub-022_task-restingstate_desc-compcorvar_bold.svg
│   ├── sub-022_task-restingstate_desc-confoundcorr_bold.svg
│   ├── sub-022_task-restingstate_desc-rois_bold.svg
│   ├── sub-022_task-restingstate_desc-sdc_bold.svg
│   ├── sub-022_task-restingstate_desc-summary_bold.html
│   └── sub-022_task-restingstate_desc-validation_bold.html
├── func
│   ├── sub-022_task-restingstate_desc-confounds_timeseries.json
│   ├── sub-022_task-restingstate_desc-confounds_timeseries.tsv
│   ├── sub-022_task-restingstate_from-scanner_to-T1w_mode-image_xfm.txt
│   ├── sub-022_task-restingstate_from-T1w_to-scanner_mode-image_xfm.txt
│   ├── sub-022_task-restingstate_space-fsLR_den-91k_bold.dtseries.json
│   ├── sub-022_task-restingstate_space-fsLR_den-91k_bold.dtseries.nii
│   ├── sub-022_task-restingstate_space-fsLR_den-91k_bold.json
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_boldref.nii.gz
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-aparcaseg_dseg.nii.gz
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-aseg_dseg.nii.gz
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-brain_mask.json
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-brain_mask.nii.gz
│   ├── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-preproc_bold.json
│   └── sub-022_task-restingstate_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz
└── log
    └── 20250311-155254_5bfa14f0-0514-4bd9-902d-740957b5e731
        └── fmriprep.toml

Does the error persist when using a fresh working directory?

Sorry to sound like a luddite again, but by a fresh working directory do you mean changing

-v /mnt/mpathymri/mpathy_mri_docker/xcptmp/mpathy_v2:/work:rw \

to a new directory? (I tried that and ran into the same issue.)

Thanks!

Hi @etow6030,

I would follow @tsalo’s advice and try to specify a non-native resolution. I also recommend updating your fMRIPrep version.

Finally, please format your code output as code with the </> button in the text editor. You can see I edited your last post for readability this time.

Thanks,
Steven

Hi Tsalo,

Okay great - so would your recommendation be to follow #1076?

Thanks :slight_smile:

Hi @etow6030,

Those features aren’t implemented yet. You would reprocess with fmriprep.

Best,
Steven

Hi Steven,

Sorry, I am new to this. So I need to reprocess with copy_atlas for fmriprep? As suggested in #1075?

Thanks :slight_smile:

Hi @etow6030,

copy_atlas is an internal function, not a callable command-line argument. You would specify a different output space with fmriprep, perhaps adding a resolution modifier to hard-code an isotropic resolution. See the fMRIPrep documentation: Defining standard and nonstandard spaces where data will be resampled.
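For reference, the resolution modifier is appended to the template name in fMRIPrep's --output-spaces value, e.g. --output-spaces MNI152NLin2009cAsym:res-2 to resample the BOLD data onto that template's 2 mm grid for every subject. A minimal sketch of how such a spec string breaks down (the real parsing lives in niworkflows; this is purely illustrative):

```python
# An --output-spaces entry has the form "template[:key-value[:...]]".
# parse_space_spec is a hypothetical helper that splits the template
# name from its modifiers, such as the "res" (resolution) entity.
def parse_space_spec(spec):
    template, *modifiers = spec.split(":")
    mods = {}
    for modifier in modifiers:
        key, _, value = modifier.partition("-")
        mods[key] = value
    return template, mods

template, mods = parse_space_spec("MNI152NLin2009cAsym:res-2")
print(template)  # MNI152NLin2009cAsym
print(mods)      # {'res': '2'}
```

Pinning the resolution this way guarantees that every subject's BOLD series, and therefore every warped atlas, shares one affine, which avoids the copy_atlas mismatch.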

Best,
Steven