Transforming a mask from MNI to native space (Python environment)

Hi,
I intend to use MNI-defined masks in the participant’s native (BOLD functional) space. I looked at previous communication on this subject, and @chrisgorgolewski responded in Nov 2018: “All you will need to do is to resample the mask to the same shape and voxel size”. I thought of using NiftiMasker for this purpose.
Please see the following command I thought to apply for this purpose (Python environment):

from nilearn.image import resample_to_img
resampled_mask_img = resample_to_img(MNImask, meanNativeSpaceImage, interpolation='nearest')

where meanNativeSpaceImage is computed from the participant’s preprocessed native-space EPI images.
Is that the right approach?
Am I missing necessary additional information, such as the transformation matrix for the other direction (native space to MNI)?

Any guidance is appreciated!

Hi @AmYac and welcome to neurostars,

This command will only resample the mask so that it is on the same voxel grid and resolution as the target image. It does not apply any warping, which would be required to go between MNI and subject space.
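
As a quick illustration (file names here are hypothetical), nilearn’s resample_to_img only regrids the source onto the target’s voxel grid; no spatial normalization happens:

from nilearn.image import load_img, resample_to_img

# Hypothetical file names, just for illustration
mni_mask = load_img('roi_MNI.nii.gz')
mean_bold = load_img('mean_native_bold.nii.gz')

# Regrid the mask onto the BOLD voxel grid (nearest-neighbour keeps it binary)
resampled = resample_to_img(mni_mask, mean_bold, interpolation='nearest')

# The output now shares the BOLD image's shape and affine...
print(resampled.shape == mean_bold.shape)
# ...but the mask values were never warped into the subject's anatomy.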

That being said, why not just use a brain mask calculated in subject space? I am not sure I see the value in warping the MNI mask.

Best,
Steven

Hi Steven,
Thanks for your response!
I wish to do an ROI analysis in the participant’s space, but the ROI mask is defined in MNI space.
Any guidance on a possible script to perform the resampling of the ROI mask with the required warping?
Thanks a lot
Best
Amnon

Hi @AmYac,

Ah, got it; it wasn’t clear this was an ROI and not a whole-brain mask.

Using ANTsPy

from ants import image_read, apply_transforms

# Path to the ROI defined in MNI space
roi_img_path = 'path/to/your/roi_MNI.nii.gz'

# Warp it to subject space
MNI_to_native_xfm_path = 'path/to/your/xfm.h5'
subject_template_path = 'path/to/subject/t1.nii.gz'  # could also be a BOLD reference or brain mask
ROI_in_native_space = apply_transforms(fixed=image_read(subject_template_path),
                                       moving=image_read(roi_img_path),
                                       transformlist=[MNI_to_native_xfm_path],
                                       interpolator='nearestNeighbor')

# Save it out
ROI_in_native_space.to_filename('/path/to/roi_subject.nii.gz')

If you have run fMRIPrep, the MNI-to-subject-space transform can be found in the anatomical output folder (a file named like sub-XX_from-MNI152NLin2009cAsym_to-T1w_mode-image_xfm.h5).

Otherwise, if you do not have a transform handy, you can calculate a registration with:

import ants

xfm = ants.registration(fixed=T1_IMAGE_SUBJECT_SPACE, moving=T1_IMAGE_MNI_SPACE,
                        type_of_transform='SyN', outprefix='/PATH/TO/OUTDIR/')

(you’ll have to load the T1 images in subject and MNI space, e.g., with image_read)
Then, in apply_transforms in the first code block, you would change the transformlist argument to transformlist=xfm['fwdtransforms'].
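
Putting the two steps together, a minimal sketch could look like this (all paths below are placeholders you would replace with your own files):

import ants

# Placeholder paths - replace with your own data
subject_t1 = ants.image_read('path/to/subject/t1.nii.gz')
mni_t1 = ants.image_read('path/to/mni_template_t1.nii.gz')
roi_mni = ants.image_read('path/to/roi_MNI.nii.gz')

# Nonlinear (SyN) registration of the MNI template to the subject's T1
xfm = ants.registration(fixed=subject_t1, moving=mni_t1,
                        type_of_transform='SyN', outprefix='/PATH/TO/OUTDIR/')

# Apply the forward transforms to bring the ROI into subject space
roi_native = ants.apply_transforms(fixed=subject_t1, moving=roi_mni,
                                   transformlist=xfm['fwdtransforms'],
                                   interpolator='nearestNeighbor')
roi_native.to_filename('/path/to/roi_subject.nii.gz')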

Best,
Steven

Hi Steven,
Again, thanks a lot for your prompt and effective response!
I am using SPM12 for preprocessing, and the warping information is stored in the anatomical folder, but it is in a different format (a 5-dimensional NIfTI deformation field and a MATLAB structure file).
I am going to use the ANTsPy approach as suggested, including the registration calculation, and will update with the outcome.
Best,
Amnon

Hi @AmYac

You might be able to use the path to the .mat file as the registration path. Worth a shot.

Best,
Steven

Hi Steven,
I tried to use the path to the MATLAB file as the registration path, but it failed:

Description: ITK ERROR: TransformFileReaderTemplate(0x55e30c109b40): Transform IO: MatlabTransformIOTemplate
failed to read file: image001_seg8.mat

I used your alternative suggestion, i.e., computing the registration with ants.registration, and it worked. For the fixed image I tried two options - the subject’s T1 and the mean native EPI image. Both references produced almost the same output mask (a 1 mm difference only along the Z axis).

Thanks a lot for your effective help!!
Amnon


Hi Steven, I’m new to neuroimaging and I have a similar issue. The dataset I have (preprocessed using ANTs) is in native space (I checked the affine matrix), and I want to extract ROIs using an atlas (such as AAL), but the atlas is in MNI space. How does this code change for this scenario?

Hi @shahmir_chad, and welcome to neurostars!

In that case you’ll want to use your atlas instead of the ROI image, and use the genericLabel interpolator (which is intended for label images such as atlases).
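
Concretely, a sketch of the apply step for an atlas might look like this (paths are placeholders; the transform is the same MNI-to-native transform discussed above):

import ants

atlas_mni = ants.image_read('path/to/AAL_MNI.nii.gz')      # integer-labelled atlas in MNI space
native_ref = ants.image_read('path/to/subject/t1.nii.gz')  # subject-space reference image

atlas_native = ants.apply_transforms(fixed=native_ref, moving=atlas_mni,
                                     transformlist=['path/to/MNI_to_native_xfm.h5'],
                                     interpolator='genericLabel')
atlas_native.to_filename('path/to/AAL_native.nii.gz')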

Best,
Steven

OK, so when I am loading the atlas, where do I get the file for “MNI_to_native_xfm_path”? Is this a generic file, or do I have to generate it for my dataset? The files I have are for cortical thickness and not for T1w images.

Hi @shahmir_chad

This is the transformation between the MNI T1w template and your native-space T1w. It can be calculated with the ants.registration command if you do not already have one from preprocessing (e.g., fMRIPrep calculates it as part of its pipeline).

Best,
Steven

I tried this using the nearestneighbor interpolator and the output image had a bad resolution. When using genericlabel, it gives the error “interpolator not set”, which apparently is a common error specific to the generic label interpolator. I am not resampling anywhere before using the apply_ants_transform_to_image function. Any idea how to clear this?

I would need to know more about which images you used to calculate and apply the registration.

Did you capitalize the L in genericLabel?

Best,
Steven