Combining runs in nilearn resting state analysis

I am working on a resting state pipeline that does preprocessing in fMRIPrep and then uses nipype/nilearn for smoothing and confound regression. After regression, I plan to compute correlation matrices for some ROIs, as well as create seed-to-voxel maps for group analysis.

Each subject has two or three runs. I have been able to do confound regression using nilearn's clean_img function. However, now I am unsure how best to combine my runs in order to end up with a single correlation matrix (or a single set of seed-to-voxel maps). Do I run confound regression, concatenate my runs into a single time series, and then compute correlations from that? Or should I compute correlation matrices/correlation maps for each run and then average the resulting values?

Both options are possible, but I would advise computing correlation matrices first, then averaging them. That way, you can at least check visually whether they are consistent enough (detect outlier sessions, etc.).
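A minimal numpy sketch of the matrices-first approach (the run shapes and ROI count here are made up for illustration; in a real pipeline you would use the ROI time series extracted after clean_img, and nilearn's ConnectivityMeasure from nilearn.connectome computes the same per-run matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: two runs with different lengths, each of shape
# (n_timepoints, n_rois) -- as you would get from a nilearn masker.
runs = [rng.standard_normal((120, 4)), rng.standard_normal((150, 4))]

# One correlation matrix per run (ROIs are columns, so rowvar=False).
per_run = [np.corrcoef(ts, rowvar=False) for ts in runs]

# Inspect per_run here to spot outlier sessions, then average.
mean_corr = np.mean(per_run, axis=0)
```

Keeping the per-run matrices around is the point of this ordering: you can plot each one before committing to the average.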
HTH,
Bertrand

So for ROI-to-ROI correlations, compute them for each run and take the mean; for seed-to-voxel, make a correlation map for each run and then compute a mean image? Should I transform to Fisher's z scores first? I don't know if it matters, but I know seemingly benign choices can sometimes have unintended consequences in neuroimaging.

Also, should I weight by number of time points, so that if one run has more clean data (after scrubbing high-motion volumes), it is weighted more heavily in the average?

Yes, this is what I would advise.
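The weighting scheme described above can be sketched with np.average (the per-run values and time point counts below are hypothetical placeholders, not real data):

```python
import numpy as np

# Hypothetical Fisher-z values for one ROI pair, one value per run,
# and the number of retained (post-scrubbing) time points per run.
z_per_run = np.array([0.48, 0.69, 0.58])
n_timepoints = np.array([110, 150, 90])

# Weight each run's value by its number of clean time points.
z_weighted = np.average(z_per_run, weights=n_timepoints)
```

The same call works element-wise on stacked correlation matrices or maps via the axis argument.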

Yes, I feel it’s better to transform-then-average, since the transformation makes statistical behavior more standard, so that operations like averaging are better posed.
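The transform-then-average step is just arctanh/tanh in numpy (the r values below are made up; note that for a full correlation matrix you would apply this to off-diagonal values only, since arctanh(1) is infinite):

```python
import numpy as np

# Hypothetical per-run correlation values for one ROI pair (or voxel).
r_per_run = np.array([0.45, 0.60, 0.52])

# Fisher z-transform, average in z-space, then back-transform if a
# correlation-scale value is needed for reporting.
z = np.arctanh(r_per_run)
z_mean = z.mean()
r_back = np.tanh(z_mean)
```

For group analysis you would typically keep the averaged z values rather than back-transforming, since z is the better-behaved scale for statistics.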

No experience with that. This sounds OK. My expectation is that these decisions have limited impact on the outcome (if they do, it means that the results should be regarded with caution).

Sounds good. I'll be throwing out any runs with too many TRs missing anyway, so the weighting is probably overkill. Thank you!