Question about an xcp_d error related to its atlas use

Hi xcp_d experts,

Summary of what happened:

I recently started using xcp_d to analyze resting-state functional MRI data, but I ran into the following error. Everything up to the point of the error ran fine locally on my Mac Pro, so I suspect the atlas information could not be loaded properly. I also noticed a difference between the recent code changes described in Replace AtlasPack 0.0.5 atlases with AtlasPack 0.1.0 atlases by tsalo · Pull Request #991 · PennLINC/xcp_d · GitHub and the current code of my local xcp_d installation in the Python site-packages directory. (I actually tried replacing the files on my machine (atlas.py, doc.py, connectivity.py, etc.) with the versions from that pull request, but that produced other errors and did not work, perhaps because I am not very familiar with modifying Python packages.)

I am wondering whether the error was caused by that difference, although my guess may well be off target. Any advice on the issue described above would be greatly appreciated. I look forward to hearing from you.

Command used (and if a helper script was used, a link to the helper script or the command generated):

xcp_d $Home/output_sub-001_re3 $Home/output_sub-001_xcp3 participant --despike -w $Home/work_sub-001_xcp --smoothing 8 --clean-workdir --input_type fmriprep --exact-time 216

Version:

xcp_d v0.6.1.dev17+g9fbc46c

Environment (Docker, Singularity, custom installation):

python 3.10.13
fMRIPrep v23.1.4

Data formatted according to a validatable standard? Please provide the output of the validator:

I am not sure what to provide for this question, but I used the unmodified output folder of fMRIPrep (v23.1.4) as the input to xcp_d.

Relevant log outputs (up to 20 lines):

(The following error messages were repeated for _atlas_file_grabber0 through _atlas_file_grabber9.)

231225-04:47:02,211 nipype.workflow WARNING:
	 [Node] Error on "_atlas_file_grabber1" ($Home/work_sub-001_xcp/xcpd_wf/single_subject_001_wf/load_atlases_wf/atlas_file_grabber/mapflow/_atlas_file_grabber1)
231225-04:47:02,212 nipype.workflow ERROR:
	 Saving crash info to $Home/output_sub-001_xcp3/xcp_d/sub-001/log/crash-20231225-044702-t_yatomi-_atlas_file_grabber0-a39ce0f0-ae09-474d-89bf-f258e3d9f49b.txt
Traceback (most recent call last):
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _atlas_file_grabber0.

Traceback:
	Traceback (most recent call last):
	  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 397, in run
	    runtime = self._run_interface(runtime)
	  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/interfaces/utility/wrappers.py", line 142, in _run_interface
	    out = function_handle(**args)
	  File "<string>", line 51, in get_atlas_nifti
	FileNotFoundError: File(s) DNE:
		/AtlasPack/tpl-MNI152NLin6Asym_atlas-4S1056Parcels_res-01_dseg.nii.gz
		/AtlasPack/atlas-4S1056Parcels_dseg.tsv
		/AtlasPack/tpl-MNI152NLin6Asym_atlas-4S1056Parcels_dseg.json

Screenshots / relevant information:

(The nine crash log files, from the one for _atlas_file_grabber0 through the one for _atlas_file_grabber9, all contained nearly identical output, shown below.
After running the above command, the denoised rsfMRI, ALFF, ReHo, and other outputs were present, but my xcp_d output folder contained no atlas-based correlation matrix files or related outputs, which I expected to appear if the code ran properly.)

Node: _atlas_file_grabber0
Working directory: $Home/work_sub-001_xcp/xcpd_wf/single_subject_001_wf/load_atlases_wf/atlas_file_grabber/mapflow/_atlas_file_grabber0

Node inputs:

atlas_name = 4S1056Parcels
function_str = def get_atlas_nifti(atlas_name):
    """Select atlas by name from xcp_d/data using pkgrf.

    All atlases are in MNI space.

    NOTE: This is a Node function.

    Parameters
    ----------
    atlas_name : {"4S156Parcels", "4S256Parcels", "4S356Parcels", "4S456Parcels", \
                  "4S556Parcels", "4S656Parcels", "4S756Parcels", "4S856Parcels", \
                  "4S956Parcels", "4S1056Parcels", "Glasser", "Gordon", \
                  "Tian", "HCP"}
        The name of the NIFTI atlas to fetch.

    Returns
    -------
    atlas_file : :obj:`str`
        Path to the atlas file.
    atlas_labels_file : :obj:`str`
        Path to the atlas labels file.
    atlas_metadata_file : :obj:`str`
        Path to the atlas metadata file.
    """
    from os.path import isfile, join

    from pkg_resources import resource_filename as pkgrf

    if "4S" in atlas_name or atlas_name in ("Glasser", "Gordon"):
        # 1 mm3 atlases
        atlas_fname = f"tpl-MNI152NLin6Asym_atlas-{atlas_name}_res-01_dseg.nii.gz"
        tsv_fname = f"atlas-{atlas_name}_dseg.tsv"
    else:
        # 2 mm3 atlases
        atlas_fname = f"tpl-MNI152NLin6Asym_atlas-{atlas_name}_res-02_dseg.nii.gz"
        tsv_fname = f"atlas-{atlas_name}_dseg.tsv"

    if "4S" in atlas_name:
        atlas_file = join("/AtlasPack", atlas_fname)
        atlas_labels_file = join("/AtlasPack", tsv_fname)
        atlas_metadata_file = f"/AtlasPack/tpl-MNI152NLin6Asym_atlas-{atlas_name}_dseg.json"
    else:
        atlas_file = pkgrf("xcp_d", f"data/atlases/{atlas_fname}")
        atlas_labels_file = pkgrf("xcp_d", f"data/atlases/{tsv_fname}")
        atlas_metadata_file = pkgrf(
            "xcp_d",
            f"data/atlases/tpl-MNI152NLin6Asym_atlas-{atlas_name}_dseg.json",
        )

    if not (isfile(atlas_file) and isfile(atlas_labels_file) and isfile(atlas_metadata_file)):
        raise FileNotFoundError(
            f"File(s) DNE:\n\t{atlas_file}\n\t{atlas_labels_file}\n\t{atlas_metadata_file}"
        )

    return atlas_file, atlas_labels_file, atlas_metadata_file


Traceback (most recent call last):
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _atlas_file_grabber0.

Traceback:
	Traceback (most recent call last):
	  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/interfaces/base/core.py", line 397, in run
	    runtime = self._run_interface(runtime)
	  File "$Home/.pyenv/versions/3.10.13/lib/python3.10/site-packages/nipype/interfaces/utility/wrappers.py", line 142, in _run_interface
	    out = function_handle(**args)
	  File "<string>", line 51, in get_atlas_nifti
	FileNotFoundError: File(s) DNE:
		/AtlasPack/tpl-MNI152NLin6Asym_atlas-4S1056Parcels_res-01_dseg.nii.gz
		/AtlasPack/atlas-4S1056Parcels_dseg.tsv
		/AtlasPack/tpl-MNI152NLin6Asym_atlas-4S1056Parcels_dseg.json 

Sincerely,
Taisuke

Hi @Taisuke and welcome to neurostars!

Thanks for the report. In general, it is recommended to run BIDS Apps such as XCP-D in containers (Docker/Singularity). That way you can be sure that all of the correct dependencies are installed. Based on your report, it looks like you are running it from a Python installation (also called a “bare metal” installation). I’d recommend trying Docker. Read more here: Installation — xcp_d 0.6.1.dev17+g9fbc46c documentation
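For concreteness, here is a rough sketch of what the same run might look like with Docker. The image tag and the container mount points below are assumptions on my part, so adapt them to the release you want and to your own paths; the xcp_d arguments are simply copied from your command above.

# Rough sketch: bind-mount the fMRIPrep derivatives, output, and work directories into the container.
docker run --rm -it \
    -v $HOME/output_sub-001_re3:/fmriprep:ro \
    -v $HOME/output_sub-001_xcp3:/out \
    -v $HOME/work_sub-001_xcp:/work \
    pennlinc/xcp_d:0.6.1 \
    /fmriprep /out participant \
    --despike -w /work --smoothing 8 --clean-workdir \
    --input_type fmriprep --exact-time 216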

Best,
Steven

Hi Steven,

Thank you very much for your helpful advice, and xcp_d ran successfully with Docker! May I ask one more question? I would like to use the volumetric 4S atlases from GitHub - PennLINC/AtlasPack: Combined cortical/subcortical atlases for xcp-d and qsiprep, in order to compare the rsfMRI data with volumetric data from another modality. However, I could not obtain the raw atlases from the scripts below, even after installing the requirements. (In the AtlasPack folder cloned locally on my PC, I could only see a few atlases, and those only as aliases; with bash 02_supplement_schaefers.sh, the only message was "run(error): $Home/AtlasPack (dataset) [Input did not match existing file: subcortical_merged/tpl-MNI152NLin6Asym_atlas-SubcorticalMerged_res-01_dseg.nii.gz]" after running the script below.) Could you please advise me on how to resolve this situation, or suggest another way to obtain these atlases? If I should ask this on GitHub instead, please let me know. Thank you in advance.

bash 01_combine_subcortical.sh

Instead, the following error messages appeared.

----- stderr -----

prepare_atlas.sh: line 27: 64558 Segmentation fault: 11 antsApplyTransforms -d 3 -i atl-MDTB10_space-MNI_dseg.nii -o tpl-MNI152NLin6Asym_atlas-MDTB10_res-01_dseg.nii.gz -r ${TEMPLATEFLOW_HOME}/tpl-MNI152NLin6Asym/tpl-MNI152NLin6Asym_res-01_desc-brain_mask.nii.gz --interpolation GenericLabel -v 1

------------------

---------------------------------------------------------------------------

CalledProcessError                        Traceback (most recent call last)
Cell In[1], line 1
----> 1 get_ipython().run_cell_magic('bash', '', '\nset -eux\n\nfor atlasname in CIT168 cerebellum Schaefer thalamus hippocampus_and_amygdala\ndo\n cd ${atlasname}\n bash -ex prepare_atlas.sh\n cd ..\ndone\n')

File ~/opt/anaconda3/lib/python3.9/site-packages/IPython/core/interactiveshell.py:2493, in InteractiveShell.run_cell_magic(self, magic_name, line, cell)
   2491 with self.builtin_trap:
   2492     args = (magic_arg_s, cell)
-> 2493     result = fn(*args, **kwargs)
   2495 # The code below prevents the output from being displayed
   2496 # when using magics with decorator @output_can_be_silenced
   2497 # when the last Python token in the expression is a ';'.
   2498 if getattr(fn, magic.MAGIC_OUTPUT_CAN_BE_SILENCED, False):

File ~/opt/anaconda3/lib/python3.9/site-packages/IPython/core/magics/script.py:154, in ScriptMagics._make_script_magic.<locals>.named_script_magic(line, cell)
    152 else:
    153     line = script
--> 154 return self.shebang(line, cell)

File ~/opt/anaconda3/lib/python3.9/site-packages/IPython/core/magics/script.py:314, in ScriptMagics.shebang(self, line, cell)
    309 if args.raise_error and p.returncode != 0:
    310     # If we get here and p.returncode is still None, we must have
    311     # killed it but not yet seen its return code. We don't wait for it,
    312     # in case it's stuck in uninterruptible sleep. -9 = SIGKILL
    313     rc = p.returncode or -9
--> 314 raise CalledProcessError(rc, cell)

CalledProcessError: Command 'b'\nset -eux\n\nfor atlasname in CIT168 cerebellum Schaefer thalamus hippocampus_and_amygdala\ndo\n cd ${atlasname}\n bash -ex prepare_atlas.sh\n cd ..\ndone\n'' returned non-zero exit status 139.

Warm regards,
Taisuke

Hi @Taisuke,

If you are just looking to download the atlases, I believe you can use DataLad. You can clone the AtlasPack repo with DataLad, cd into the cloned repository, and then use datalad get PATH/TO/FOLDER to pull the desired files from the cloud.
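For example, something along these lines (a rough sketch; the specific file names are just the ones from your crash log above, so substitute whichever atlases you actually need):

# Clone the AtlasPack dataset and fetch the annexed atlas files with DataLad.
datalad clone https://github.com/PennLINC/AtlasPack.git
cd AtlasPack
# Fetch specific files, or run "datalad get ." to pull everything.
datalad get tpl-MNI152NLin6Asym_atlas-4S1056Parcels_res-01_dseg.nii.gz atlas-4S1056Parcels_dseg.tsv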

Best,
Steven


Hi Steven,

Thank you so much for the wonderful information; your method enabled me to get the atlases from AtlasPack. I hope you have a great new year.

Kind regards,
Taisuke


Thanks, happy to help! Hope you have a happy new year too!
