Summary of what happened:
Hi all,
I am using fmriprep, Tedana, and xcp_d to preprocess multi-echo resting state BOLD data.
For this particular dataset, xcp_d is removing 204 of 886 volumes from the resting-state data. We use Tedana to generate a custom confound file that xcp_d regresses out, and then run FSL FEAT on the xcp_d output. 204 rejected volumes seems like too many: our other datasets only lose 10-20 volumes, not 200.
Is this more likely an issue with Tedana or with xcp_d? I am not familiar with how xcp_d decides which volumes to remove, and I want to make sure the data are sound before running the functional connectivity analysis.
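From reading around, I believe xcp_d censors volumes whose framewise displacement exceeds --fd-thresh (we do not set it, so the default would apply), but please correct me if that is wrong. If it helps, this is the kind of quick check I can run against the fMRIPrep confounds TSV to count high-FD volumes; the confounds filename and the 0.3 mm threshold are assumptions on my part, not something confirmed against the xcp_d source:
#count_high_fd.sh -> rough sanity check only, not part of the pipeline
confounds=/path/to/fmriprep/output/sub-01/func/sub-01_task-rest_desc-confounds_timeseries.tsv
fd_thresh=0.3  # assumed censoring threshold in mm
awk -F'\t' -v thr="$fd_thresh" '
NR == 1 { for (i = 1; i <= NF; i++) if ($i == "framewise_displacement") col = i; next }
$col != "n/a" && $col + 0 > thr { n++ }
END { print n, "volumes exceed FD", thr }
' "$confounds"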
Command used (and if a helper script was used, a link to the helper script or the command generated):
These are the Tedana and xcp_d calls. xcp_d runs through a Singularity container; we adjust parameters in a wrapper script and submit the jobs to a cluster.
#adjusted_func_denoise_meica.sh ->
tedana -d "$Subdir"/func/rest/"$DataDirs"/Rest_E*_acpc.nii.gz \
-e $(cat "$Subdir"/func/rest/"$DataDirs"/TE.txt) \
--out-dir "$Subdir"/func/rest/"$DataDirs"/Tedana/ \
--tedpca "$MEPCA" \
--fittype curvefit \
--mask "$Subdir"/func/rest/"$DataDirs"/brain_mask.nii.gz \
--maxit "$MaxIterations" \
--maxrestart "$MaxRestarts" \
--seed 42 \
--verbose
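For context on the custom confounds mentioned above: they are built from this tedana output. The snippet below is a simplified stand-in for our actual helper script, just to show one way the rejected-component time courses could be pulled out of the ICA mixing matrix as a confounds TSV; the desc-tedana_metrics.tsv / desc-ICA_mixing.tsv filenames and the column names are assumptions about the default tedana naming.
#build_rejected_confounds.sh -> simplified illustration, not the exact helper script
ted_dir="$Subdir"/func/rest/"$DataDirs"/Tedana
#component names classified as "rejected" in the metrics table (filename assumed)
rejected=$(awk -F'\t' '
NR == 1 { for (i = 1; i <= NF; i++) { if ($i == "Component") c = i; if ($i == "classification") k = i }; next }
$k == "rejected" { print $c }
' "$ted_dir"/desc-tedana_metrics.tsv | tr '\n' ' ')
#keep only those columns of the ICA mixing matrix -> one regressor per rejected component
awk -F'\t' -v OFS='\t' -v rej="$rejected" '
BEGIN { n = split(rej, want, " ") }
NR == 1 { for (i = 1; i <= NF; i++) for (j = 1; j <= n; j++) if ($i == want[j]) keep[++m] = i }
{
  line = ""
  for (j = 1; j <= m; j++) line = (j == 1 ? $(keep[j]) : line OFS $(keep[j]))
  print line
}
' "$ted_dir"/desc-ICA_mixing.tsv > "$ted_dir"/rejected_confounds.tsv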
#xcp_d singularity call
input_dir=/path/to/fmriprep/output
output_dir=/path/to/xcp_d/output
singularity run --cleanenv \
--no-home \
-B ${jobTmpDir}:/tmp \
-B "/appl/freesurfer-7.1.1:/freesurfer" \
-B "/path/to/scripts/Liston-Laboratory-MultiEchofMRI-Pipeline-master/MultiEchofMRI-Pipeline:/bidsfilt" \
-B "/path/toTemplate/templateflow:/templateflow" \
-B ${input_dir}:/data/input \
-B "/path/to/wd/working_dir:/data/working_dir" \
-B "/path/to/confound/ica_mixing_files:/customconfounds" \
-B ${output_dir}:/data/output \
/appl/containers/xcp_d-0.6.3.simg \
/data/input /data/output participant --participant_label $1 \
--work_dir /data/working_dir \
--bids-filter-file /bidsfilt/bids_filter_file.json \
--fs-license-file /freesurfer/license.txt \
--nthreads $2 \
--omp_nthreads $3 \
-vvv \
--input-type fmriprep \
--smoothing 6 \
-c /customconfounds \
-p custom \
--despike
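One thing I can try to check from this output is the temporal mask / outliers file that xcp_d writes next to the denoised BOLD; the number of flagged rows there should account for the dropped volumes. A rough sketch (the filename pattern is a guess on my part, adjust as needed):
#count flagged volumes in the xcp_d temporal mask
outliers=/path/to/xcp_d/output/sub-01/func/sub-01_task-rest_outliers.tsv
awk 'NR > 1 { flagged += $1 } END { print flagged, "volumes flagged for censoring" }' "$outliers"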
Version:
Tedana - 24.0.1
xcp_d - 0.6.3
Environment (Docker, Singularity / Apptainer, custom installation):
Tedana - conda environment in bash terminal
xcp_d - singularity container
Screenshots / relevant information:
As mentioned above, all input files start with 886 volumes. For this dataset, the output had 682 volumes. For our other three datasets, the xcp_d outputs had 867, 865, and 876 volumes.
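If it is useful, those counts can be reproduced by running fslnvols on the preprocessed input and on the denoised xcp_d output, e.g. (the filenames below are placeholders rather than our exact names):
#placeholders for the actual file names
fslnvols /path/to/fmriprep/output/sub-01/func/sub-01_task-rest_desc-preproc_bold.nii.gz
fslnvols /path/to/xcp_d/output/sub-01/func/sub-01_task-rest_desc-denoised_bold.nii.gz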
The final output we are after is the cope1.nii.gz from the FSL FEAT stats directory, using an sgACC seed for seed-based functional connectivity.
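For context on that step, the seed time series for FEAT is extracted from the denoised data roughly along these lines (mask and file names are placeholders, not our exact paths):
#placeholders; the sgACC mask is in the same space as the denoised BOLD
fslmeants -i /path/to/xcp_d/output/sub-01/func/sub-01_task-rest_desc-denoised_bold.nii.gz \
-m /path/to/masks/sgACC_seed.nii.gz \
-o sgACC_timeseries.txt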
If any other outputs, logs, etc would be helpful in figuring this out, please let me know. Any help is greatly appreciated.
Thanks!