Transforming a mask from MNI to native space (Python environment)

Hi,
I intend to use MNI-defined masks in the participant's native (BOLD functional) space. I looked at previous discussions on this subject, and @chrisgorgolewski responded in November 2018: "…All you will need to do is to resample the mask to the same shape and voxel size". I thought of using nilearn's NiftiMasker for this purpose.
Here is the command I plan to use for this purpose (in a Python environment):

from nilearn.image import resample_to_img
resampled_mask_img = resample_to_img(MNImask, meanNativeSpaceImage, interpolation='nearest')

where meanNativeSpaceImage is the mean of the participant's preprocessed native-space EPI images.
Is this the right approach?
Am I missing necessary additional information, such as the transformation matrix for the other direction (native space to MNI)?

Any guidance is appreciated!

Hi @AmYac, and welcome to Neurostars,

This command will only resample the image so that it has the same resolution and grid as the target. It does not apply any warping, which is what would be required to go between MNI and subject space.
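
For example (a minimal sketch with placeholder file names, using nilearn and nibabel), the resampled output matches the EPI grid but is not anatomically aligned:

from nilearn.image import resample_to_img
import nibabel as nib

# placeholder file names
mni_roi = nib.load('roi_MNI.nii.gz')
mean_epi = nib.load('mean_native_epi.nii.gz')

resampled = resample_to_img(mni_roi, mean_epi, interpolation='nearest')
print(resampled.shape == mean_epi.shape)  # True: same grid as the EPI...
# ...but the ROI still sits at its MNI coordinates; no warp has been applied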

That being said, why not just use a brain mask calculated in subject space? I am not sure I see the value in warping the MNI mask.

Best,
Steven

Hi Steven,
Thanks for your response!
I want to do an ROI analysis in the participant's native space, but the ROI mask is defined in MNI space.
Could you suggest a possible script for resampling the ROI mask with the required warping?
Thanks a lot
Best
Amnon

Hi @AmYac,

Ah, got it. It wasn't clear that this was an ROI rather than a whole-brain mask.

Using ANTsPy

from ants import image_read, apply_transforms

# Path to the ROI defined in MNI space
roi_img_path = 'path/to/your/roi_MNI.nii.gz'

# Warp it to subject space (nearestNeighbor interpolation keeps the mask binary)
MNI_to_native_xfm_path = 'path/to/your/xfm.h5'
subject_template_path = 'path/to/subject/t1.nii.gz'  # could also be a BOLD reference or brain mask
ROI_in_native_space = apply_transforms(fixed=image_read(subject_template_path),
                                       moving=image_read(roi_img_path),
                                       transformlist=MNI_to_native_xfm_path,
                                       interpolator='nearestNeighbor')

# Save it out
ROI_in_native_space.to_filename('/path/to/roi_subject.nii.gz')

If you have run fMRIPrep, the MNI-to-subject-space transform can be found in the subject's anatomical (anat) output folder.
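
For example, with default fMRIPrep naming and the MNI152NLin2009cAsym output space (the derivatives path and sub-01 label below are placeholders), you could pick it up like this:

from glob import glob

# default fMRIPrep naming for the MNI -> T1w transform in the subject's anat folder
xfm_matches = glob('derivatives/fmriprep/sub-01/anat/'
                   'sub-01_from-MNI152NLin2009cAsym_to-T1w_mode-image_xfm.h5')
MNI_to_native_xfm_path = xfm_matches[0]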

Otherwise, if you do not have a transform handy, you can calculate the registration with:

xfm = ants.registration(fixed=T1_IMAGE_SUBJECT_SPACE, moving=T1_IMAGE_MNI_SPACE,
                        type_of_transform='SyN', outprefix='/PATH/TO/OUTDIR/')

(you'll have to load the T1 images in subject space and MNI space first)
Then, in apply_transforms from the first code block, you would change the transformlist argument to transformlist=xfm['fwdtransforms'].
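
Putting the two steps together, a sketch might look like this (all paths are placeholders, and /PATH/TO/OUTDIR/ is just a prefix for the registration output files):

import ants

# load the subject-space T1 and an MNI-space template T1 (placeholder paths)
t1_subject = ants.image_read('path/to/subject_T1w.nii.gz')
t1_mni = ants.image_read('path/to/MNI_template_T1w.nii.gz')

# nonlinear (SyN) registration with the subject T1 as the fixed image,
# so the forward transforms map MNI space into subject space
xfm = ants.registration(fixed=t1_subject, moving=t1_mni,
                        type_of_transform='SyN', outprefix='/PATH/TO/OUTDIR/')

# warp the MNI-space ROI into subject space; nearestNeighbor keeps the mask binary
roi_mni = ants.image_read('path/to/your/roi_MNI.nii.gz')
ROI_in_native_space = ants.apply_transforms(fixed=t1_subject, moving=roi_mni,
                                            transformlist=xfm['fwdtransforms'],
                                            interpolator='nearestNeighbor')
ROI_in_native_space.to_filename('/path/to/roi_subject.nii.gz')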

Best,
Steven

Hi Steven,
Again, thanks a lot for your prompt and effective response!
I am using SPM12 for preprocessing. The warping information is stored in the anatomical folder, but it is in a different format (a 5-dimensional NIfTI image and a MATLAB structure file).
I am going to use the ANTsPy approach as suggested, including the registration calculation, and I will report back with the outcome.
Best,
Amnon

Hi @AmYac

You might be able to pass the path to that .mat file as the transform path (transformlist). Worth a shot.

Best,
Steven

Hi Steven,
I tried to use the path to the MATLAB file as the transform path, but it failed:

Description: ITK ERROR: TransformFileReaderTemplate(0x55e30c109b40): Transform IO: MatlabTransformIOTemplate
failed to read file: image001_seg8.mat

I used your alternative suggestion, i.e., computing the registration with ants.registration, and it worked. For the fixed image I tried two options: the subject T1 and the mean native-space EPI image. Both references produced almost the same output mask (a 1 mm difference only along the Z axis).

Thanks a lot for your effective help!!
Amnon
