Nilearn has an option to standardize both the signal and the confounds (e.g. with signal.clean). Is this generally recommended? What is the rationale for standardizing vs. not standardizing both the signal and the confounds, and could it affect my results in a negative way?
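For concreteness, this is roughly what I mean (just a toy sketch with made-up arrays; the parameter names are what I see in the nilearn docs, so they may differ slightly across versions):

```python
import numpy as np
from nilearn import signal

# toy data standing in for parcel time series and fMRIPrep-style confounds:
# 200 time points, 10 regions, 3 confound regressors
rng = np.random.default_rng(0)
bold = rng.normal(loc=1000, scale=20, size=(200, 10))
confounds = rng.normal(size=(200, 3))

# z-score both the signal and the confounds
cleaned_both = signal.clean(
    bold,
    confounds=confounds,
    detrend=True,
    standardize="zscore",        # standardize the signal
    standardize_confounds=True,  # standardize the confounds
)

# keep the signal in its original units, but still standardize the confounds
cleaned_signal_raw = signal.clean(
    bold,
    confounds=confounds,
    detrend=True,
    standardize=False,
    standardize_confounds=True,
)
```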
This is a question I had (have) as well. I think it depends on the input data (signal and confounds) and whether it is already standardized.
If your data is fMRIPrep output, I believe it is already standardized, so no further standardization is needed. See also: Fmriprep pipeline and output and Denoising strategy with fMRIPrep confounds. This, however, concerns standardizing the signal; I am not sure about standardizing the confounds.
Standardizing the confounds is a safe thing to do.
Standardizing the signal is OK, unless you want to express it as percent change of the baseline.
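A quick toy illustration of that point (only a sketch; I'm assuming the "zscore" / "psc" values of the standardize argument here):

```python
import numpy as np
from nilearn import signal

rng = np.random.default_rng(42)
# one toy voxel time series with a baseline around 1000 a.u.
bold = 1000 + rng.normal(scale=10, size=(200, 1))

# z-scoring discards the baseline: mean ~0, std ~1, so "percent of
# baseline" can no longer be computed from the cleaned signal
zscored = signal.clean(bold, standardize="zscore", detrend=False)
print(zscored.mean(), zscored.std())

# "psc" expresses the signal directly as percent change of its mean,
# which keeps the baseline interpretation
psc = signal.clean(bold, standardize="psc", detrend=False)
print(psc.min(), psc.max())  # small % fluctuations around 0
```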
HTH,
Bertrand
Thank you @lvbrussel and @bthirion.
Would it also be ok not to standardize the signal but standardize the confounds?
What effect, if any, does standardization have in nuisance variable regression? Does it only help with interpretation, or is there a statistical reason for doing it?
AFAICT this is OK.
Bertrand
To me this mostly avoids weird behavior if extreme values are present in these regressors. You always want your design matrix to be well-behaved.
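Here is a toy NumPy sketch of what I mean (made-up numbers): rescaling the confound columns does not change what gets regressed out, but it keeps the design matrix well-conditioned.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 200

# toy confounds on wildly different scales, e.g. motion in mm vs. a
# global signal in raw scanner units
motion = rng.normal(scale=0.1, size=n_scans)
global_sig = 1000 + rng.normal(scale=50.0, size=n_scans)
y = rng.normal(size=n_scans)  # toy voxel time series

intercept = np.ones(n_scans)
X_raw = np.column_stack([intercept, motion, global_sig])

# z-score the confound columns (leave the intercept alone)
def zscore(v):
    return (v - v.mean()) / v.std()

X_std = np.column_stack([intercept, zscore(motion), zscore(global_sig)])

def residualize(y, X):
    """Remove the column space of X from y via least squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# same residuals: centering/scaling the columns does not change the
# subspace that is projected out (the intercept absorbs the means) ...
print(np.allclose(residualize(y, X_raw), residualize(y, X_std)))

# ... but the standardized design matrix is far better conditioned,
# which is what avoids numerical trouble with extreme values
print(np.linalg.cond(X_raw), np.linalg.cond(X_std))
```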
Hi, I was searching for similar questions and your answer seemed very relevant. Can you maybe explain a bit more why standardizing the signal is not recommended if I want to calculate percent change?
I'm having a weird problem where the BOLD percent change result is completely different when using standardization in clean_img vs. not, and that's really confusing. Huge thanks!