Hello,
I’m using nilearn to concatenate 40 large, high-resolution functional images (3.8 GB each, 152 GB total). I’m running into a memory issue and I’m wondering if there is any way around it. Even though I have ~400 GB of memory, my program runs out of space after about 12 sessions.
from nilearn import image

sessions = 40
for i in range(1, sessions + 1):
    # File names are zero-padded: betas_session01.nii.gz ... betas_session40.nii.gz
    next_data = image.load_img(f"betas_session{i:02d}.nii.gz")
    if i == 1:
        all_data = next_data
    else:
        all_data = image.concat_imgs([all_data, next_data], dtype="float16")
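If it helps illustrate what I mean, here is a toy sketch of the allocation pattern I suspect is the problem, using small numpy arrays as stand-ins for the session images (the shapes and chunk count are placeholders, not my real data):

```python
import numpy as np

# Toy stand-ins for the 40 per-session images (hypothetical shapes).
chunks = [np.ones((10, 10), dtype="float16") for _ in range(40)]

# Incremental pattern, like my loop: every iteration copies all of the
# data accumulated so far into a fresh, ever-larger array.
acc = chunks[0]
for c in chunks[1:]:
    acc = np.concatenate([acc, c])

# Single-call pattern: one allocation of the final size, no repeated copies.
once = np.concatenate(chunks)

assert np.array_equal(acc, once)  # same result, very different peak memory
```

If concat_imgs behaves like np.concatenate here, then passing the full list of files in a single call might let nilearn allocate the result once instead of re-copying the accumulated sessions on every iteration.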
Is there a more memory-efficient way to perform this task? I apologize if my question is not specific enough.
Thanks,
Tom