I have a gray matter mask that I am trying to resample from 2mm voxels to 4mm voxels. In the past, I have used MATLAB's imresize3 function with nearest-neighbor interpolation to accomplish that. I am now trying to move my workflow into Python + Nilearn, and am using the following code snippet to do the downsampling:
```python
import numpy as np
import nibabel as nib
from nilearn import image

# Build a 4 mm target affine (MNI-style offsets, negative x spacing)
aff = np.identity(4)
aff = np.multiply(aff, 4)
aff[0, 3] = 90
aff[1, 3] = -126
aff[2, 3] = -72
aff[3, 3] = 1
aff[0, 0] = -4

gm_img = nib.load('avg152T1_gray.img')
gm_rs = image.resample_img(gm_img, target_affine=aff, interpolation='nearest')
```
The result from MATLAB has 26776 voxels, while the Nilearn output has 26718. The difference between the two resampling methods is even more pronounced when processing 4D images.
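For reference, this is roughly how I am counting voxels on both sides (the helper and the zero threshold are my own choices, since the gray matter map is probabilistic; a toy array stands in for the real mask here):

```python
import numpy as np

# Hypothetical helper: count "mask voxels" as voxels strictly above a threshold.
# For the real comparison I apply this to the resampled image's data array.
def count_voxels(data, thresh=0.0):
    return int(np.count_nonzero(np.asarray(data) > thresh))

# Toy 3D array standing in for a resampled gray matter mask
toy = np.zeros((4, 4, 4))
toy[:2, :2, :2] = 1.0  # a 2x2x2 block of "gray matter"
print(count_voxels(toy))  # 8
```

The counts quoted above come from applying the same thresholding to the MATLAB and Nilearn outputs.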
What is the difference between how Nilearn handles resampling vs. how MATLAB (and scipy.interpolate) handles it?