I want to confirm: does xcp_d standardize the connectivity matrices and resting-state derivatives with a Fisher r-to-z transformation? If I need to perform group analyses, I should standardize the output functional connectivity and derivatives first, right?
xcp_d outputs the Pearson correlation coefficients. You’ll need to do the r-to-z transform yourself (e.g., with numpy.arctanh). I agree that that’s probably a good first step before any group-level analyses.
Sorry, I’m just getting into Python. Could you please show me how to use numpy.arctanh to perform the r-to-z transform on the connectivity matrices and on derivatives like ReHo/ALFF? Or could I directly use nilearn to perform group analyses on these xcp_d outputs? I assume an r-to-z transformation is applied at the beginning of a group analysis (when fitting the GLM models) in nilearn, but I’m not sure about this.
Assuming you’re not using the --cifti flag, you’ll have correlation matrices in TSV format. You could do something like the following:
import numpy as np

correlations_file = "/path/to/file.tsv"

# The current files don't have a header row or defined index,
# so the only thing in the file should be the data.
correlations_arr = np.loadtxt(correlations_file, delimiter="\t")

# Fisher r-to-z transform: arctanh(r) = 0.5 * ln((1 + r) / (1 - r))
z_arr = np.arctanh(correlations_arr)
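One caveat: the diagonal of a correlation matrix is all 1s, and arctanh maps r = 1 to inf. If you plan to save the transformed matrix for later group analysis, you may want to zero out the diagonal first. A minimal sketch (the output path is just an example):

np.fill_diagonal(z_arr, 0)  # the diagonal (r = 1) becomes inf after arctanh
np.savetxt("/path/to/file_z.tsv", z_arr, delimiter="\t")  # example output filename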
I don’t think you need to apply any transforms to the ReHo or ALFF values, but I could be wrong.
If you’re working with the correlation matrices, it’s probably easier to go directly to a pure stats tool like statsmodels for your analysis. Nilearn is great for creating the correlation matrices from NIfTI files or for directly analyzing NIfTI files, but once you have TSVs, regular statistical analysis packages (whether in Python or R) are your best bet.
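For instance, here is a minimal sketch of an edgewise two-group comparison with statsmodels, assuming you’ve already saved z-transformed matrices for each subject (the file paths and group split are hypothetical, and you’d still want to correct the p-values for multiple comparisons across edges):

import numpy as np
import statsmodels.api as sm

# Hypothetical file lists of z-transformed correlation matrices, one per subject.
group_a_files = ["/path/to/sub-01_z.tsv", "/path/to/sub-02_z.tsv"]
group_b_files = ["/path/to/sub-03_z.tsv", "/path/to/sub-04_z.tsv"]

def load_edges(path):
    # Load a square z-matrix and keep only the unique edges
    # (upper triangle, excluding the diagonal).
    mat = np.loadtxt(path, delimiter="\t")
    return mat[np.triu_indices_from(mat, k=1)]

edges_a = np.array([load_edges(f) for f in group_a_files])
edges_b = np.array([load_edges(f) for f in group_b_files])

# Edgewise GLM: regress each edge's z-value on a group indicator.
y = np.vstack([edges_a, edges_b])
group = np.concatenate([np.zeros(len(group_a_files)), np.ones(len(group_b_files))])
X = sm.add_constant(group)

# One OLS fit per edge; pvalues[1] is the group effect.
group_pvals = np.array([sm.OLS(y[:, i], X).fit().pvalues[1] for i in range(y.shape[1])])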
Thank you so much! I understand now.
I also want to work with the CIFTI outputs at the surface level. Could you please give me some suggestions?
Thank you again for your help.