I need to concatenate 45 EPIs (each of shape (73, 92, 70, 738)) in order to estimate a GLM for a within-subjects design (multiple sessions). I've been using nilearn's concat_imgs() and I've been running into memory errors.
Has anyone encountered this problem? If so, how did you solve it? (Working on a cluster computing solution, but memory resources are scarce because everyone’s on the cluster!)
Obviously, you're creating a huge image that is too large for your hardware: 45 sessions of 73 × 92 × 70 × 738 voxels is on the order of 62 GB at float32.
My advice is simple: don't do it.
IIUC, there is actually no need to do it: GLMs can be fit per session, and the resulting contrast estimates combined afterwards with a fixed-effects model.
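(In nilearn this combination step is what `compute_fixed_effects` in `nilearn.glm.contrasts` does. As a rough illustration of the idea, here is a minimal numpy sketch of a precision-weighted fixed-effects combination, assuming you already have per-session effect and variance maps; the function name and array layout are my own, not nilearn's.)

```python
import numpy as np

def fixed_effects(effects, variances):
    """Precision-weighted fixed-effects combination.

    effects, variances: arrays of shape (n_sessions, n_voxels),
    one per-session contrast estimate and its variance per voxel.
    """
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    precision = 1.0 / variances                      # weight sessions by 1/variance
    fx_variance = 1.0 / precision.sum(axis=0)        # combined variance per voxel
    fx_effect = fx_variance * (effects * precision).sum(axis=0)
    return fx_effect, fx_variance
```

With equal variances this reduces to the plain across-session mean, with variance divided by the number of sessions.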
If for some reason you really want a single big file with all the data, you could write it incrementally into a huge HDF5 file, but this is not going to make your life easier.
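(The key point, if you do go that route, is to write one session at a time rather than concatenating in memory. Here is a sketch using a disk-backed `np.memmap` with tiny stand-in dimensions so it runs quickly; the same pattern applies to an HDF5 dataset via h5py. The path and shapes are placeholders, not anything from the original question's data.)

```python
import os
import tempfile
import numpy as np

# Tiny stand-ins for the question's 45 sessions of (73, 92, 70, 738).
n_sessions, vol_shape, n_vols = 3, (4, 5, 6), 7
path = os.path.join(tempfile.mkdtemp(), "all_sessions.dat")

# Disk-backed array: the full concatenation never lives in RAM.
big = np.memmap(path, dtype="float32", mode="w+",
                shape=(n_sessions,) + vol_shape + (n_vols,))
for i in range(n_sessions):
    # In practice this would be one session's EPI loaded from disk.
    session = np.random.rand(*vol_shape, n_vols).astype("float32")
    big[i] = session          # written straight through to disk
big.flush()
```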
That’s what I ended up planning to do. Thanks, Bertrand!