I’m working on my first searchlight analysis of data from an fMRI project. Since it’s a collaboration, we use the RSA toolbox from MRC CBSU in MATLAB to set up the analysis on a computing cluster. The experiment had 4 runs (612 scans each) per participant and 36 participants in total. All data were prepared for the searchlight analysis according to standard procedures, and the 4 runs were concatenated before beta extraction. However, when I run the searchlight analysis, memory usage seems to peak above the 512 GB (!) limit of our cluster after about 21 participants.
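For context, here is a rough back-of-envelope sketch (in Python) of how memory could scale if a toolbox holds every participant’s whole-brain searchlight RDM map in RAM at the same time — which would explain the roughly linear growth until the job dies partway through the group. The searchlight-center and condition counts below are made-up placeholders, not our actual numbers:

```python
# Hypothetical memory estimate for searchlight RSA, assuming all
# participants' RDM maps are kept in memory simultaneously.

def rdm_map_bytes(n_centers, n_conditions, bytes_per_value=8):
    """Bytes to store one participant's searchlight RDM map:
    one upper-triangle dissimilarity vector per searchlight center,
    stored as double-precision (8-byte) values."""
    n_pairs = n_conditions * (n_conditions - 1) // 2  # upper triangle of the RDM
    return n_centers * n_pairs * bytes_per_value

n_centers = 200_000   # placeholder: gray-matter searchlight centers
n_conditions = 100    # placeholder: conditions after concatenating runs

per_subject_gib = rdm_map_bytes(n_centers, n_conditions) / 2**30

for n_subjects in (1, 21, 36):
    print(f"{n_subjects:2d} subject(s): {n_subjects * per_subject_gib:,.1f} GiB")
```

Even with modest placeholder numbers this grows linearly with participants, so intermediate copies or a larger condition set could plausibly push a whole-group job past a cluster limit.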
Does anyone have experience with such problems? Is it normal for a searchlight analysis to have such gigantic memory requirements? What can be done?
Any advice will be much appreciated!
PS. Do you have any suggestions/opinions on alternative MVPA/RSA toolboxes (MATLAB, Python, FSL… anything else, for that matter) that will work with SPM data?