Best practices or tools for objectively comparing different nuisance regression strategies (e.g., QC-FC metrics) after XCP-D processing?

Summary of what happened:

I am currently using XCP-D to post-process my fMRI data (preprocessed with fMRIPrep). I am interested in testing different nuisance regression strategies (e.g., motion-filter-type notch vs. LP) to see which one works best for my specific datasets (one multiband with TR < 1 s, one singleband with TR = 2 s).

I have successfully run XCP-D with different parameters, but I am looking for a standardized or “objective” way to compare the denoising efficacy across these runs. Specifically, I am looking for methods to calculate metrics like QC-FC correlations or distance-dependent motion effects.

Alternatively, is there any other way to compare the different regression strategies using the QC report generated by XCP-D?
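For context, a minimal sketch of what these two metrics usually involve, assuming you already have a vectorized connectivity matrix per subject and a mean framewise displacement (FD) value per subject (all data below is synthetic stand-in data, not XCP-D output; variable names are my own, not from any package):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_subjects, n_rois = 30, 10
n_edges = n_rois * (n_rois - 1) // 2
iu = np.triu_indices(n_rois, k=1)

# Stand-ins for real inputs:
#   conn[s, e] - upper-triangle connectivity value for subject s, edge e
#   mean_fd[s] - mean framewise displacement for subject s
conn = rng.normal(size=(n_subjects, n_edges))
mean_fd = rng.uniform(0.05, 0.5, size=n_subjects)

# QC-FC: per-edge correlation between motion and connectivity across subjects.
qcfc = np.empty(n_edges)
pvals = np.empty(n_edges)
for e in range(n_edges):
    qcfc[e], pvals[e] = stats.pearsonr(mean_fd, conn[:, e])

# Summary numbers often compared across denoising strategies:
median_abs_qcfc = np.median(np.abs(qcfc))
prop_sig = np.mean(pvals < 0.05)  # fraction of motion-related edges (uncorrected)

# Distance-dependent motion effect: correlate QC-FC with inter-ROI distance.
centroids = rng.uniform(-70, 70, size=(n_rois, 3))  # stand-in ROI coords (mm)
dist = np.linalg.norm(centroids[iu[0]] - centroids[iu[1]], axis=1)
dist_dep, _ = stats.spearmanr(dist, qcfc)

print(f"median |QC-FC|: {median_abs_qcfc:.3f}")
print(f"edges with p < .05: {prop_sig:.1%}")
print(f"distance dependence (Spearman rho): {dist_dep:.3f}")
```

Lower median |QC-FC|, fewer significant edges, and a distance-dependence correlation closer to zero are generally taken as signs of better denoising; running this over the outputs of each XCP-D configuration would give you the comparison you describe.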

Best

Command used (and if a helper script was used, a link to the helper script or the command generated):

    --motion-filter-type notch \
    --band-stop-min 12 \
    --band-stop-max 18 \

    --motion-filter-type lp \
    --band-stop-min 6 \

Version:

XCP-D 0.14.1

Environment (Docker, Singularity / Apptainer, custom installation):

Apptainer


I’m not aware of any established tools, but wonkyconn appears to be under active development and seems like it might do what you want. I’m not sure if the wonkyconn developers are planning to support XCP-D outputs, but there are probably functions in the package that would be appropriate.

@HaoTing_Wang I believe you're one of the devs; does that sound right?

Thanks for the mention Taylor!

We are aiming to support BIDS-connectome formatted outputs, so there are no direct plans to support XCP-D.

Documentation, etc., still needs a lot of work in wonkyconn, but the project is installable as a standard Python library, and here's the relevant function for QC-FC: wonkyconn/wonkyconn/features/quality_control_connectivity.py at main · HALFpipe/wonkyconn · GitHub


Thanks for the information you provided; that's helpful.