Template to native space

I have an anatomical image (in native space) and a template of the hippocampus (not in MNI space). I’d like to bring the template into the native space of each subject in order to conduct subsequent analyses. I’m not sure how to do this, though, because the two images have different voxel sizes and dimensions. I suppose I’ll have to use normalization, but should I first normalize both the anatomical and the template to MNI? How do I proceed from there so that both the anatomical image and the template end up in native space?

So assuming I understand your use case, it should be fairly straightforward: you may want to use FSL’s flirt command; the Flirt GUI (note the capital) can help you with the syntax.

So I am assuming you have the hippocampus in space X as a mask (i.e. all ones and zeros). You’ll need the original brain image that was used to generate the atlas (which it sounds like you have).

So all you should have to do is register the atlas brain (AB) to subject space. You can then “apply” the same transformation to a secondary image (in this case, the hippocampal mask).

One caveat: make sure you use nearest-neighbour interpolation, otherwise you will get very odd behavior where non-integer values appear at the edges of the mask.
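In case it helps, the register-then-apply sequence described above might look roughly like this. I’m sketching the two flirt calls as Python argument lists you could hand to `subprocess.run()` once FSL is on your PATH; every filename here is a placeholder, not something from your data:

```python
# Hedged sketch of the two FLIRT calls; all filenames are hypothetical.
register_cmd = [
    "flirt",
    "-in", "atlas_brain.nii.gz",    # the brain the atlas was drawn on
    "-ref", "subject_T1.nii.gz",    # the subject's native anatomical
    "-omat", "atlas2subj.mat",      # affine written out for reuse
]

apply_cmd = [
    "flirt",
    "-in", "hippocampus_atlas.nii.gz",
    "-ref", "subject_T1.nii.gz",
    "-applyxfm", "-init", "atlas2subj.mat",   # reuse the saved affine
    "-interp", "nearestneighbour",            # keep labels integer-valued
    "-out", "hippocampus_native.nii.gz",
]
print(" ".join(apply_cmd))
```

The `-omat` file written by the first call is exactly what `-init` consumes in the second.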

You can also look into running FNIRT to generate a non-linear registration for your template, but that’s a whole other level of complication.

You should be able to simply repeat/script the process for each subject; and since you are not working in MNI space, there’s probably no need to go through MNI at all.
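Scripting it per subject could be as simple as building one registration command per anatomical image. A minimal sketch (the subject IDs and directory layout below are made up):

```python
import os

# Hypothetical subject IDs and file layout; adapt to your own data.
subjects = ["sub-01", "sub-02"]
cmds = []
for sub in subjects:
    t1 = os.path.join(sub, "anat", "T1w.nii.gz")
    cmds.append([
        "flirt",
        "-in", "atlas_brain.nii.gz",
        "-ref", t1,
        "-omat", os.path.join(sub, "atlas2subj.mat"),
    ])
print(len(cmds))  # one registration command per subject
```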

Thanks @dagutman, and sorry for being unclear. The hippocampus atlas is actually not a binary image but is split into different sub-regions, with an integer corresponding to each sub-region. I only have this atlas, so I’m not sure what you mean by the “original brain image you used to generate the atlas”. Does that mean I cannot use Flirt to bring the atlas into native space?

Also, Flirt applies a linear registration, but since the images are in different spaces and have different dimensions and shapes, shouldn’t I be using non-linear registration (fnirt)? I thought one way would be to convert the atlas to MNI and then apply an inverse transformation to native space; is that not appropriate?
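For reference, the MNI round trip mentioned here would combine flirt with convert_xfm’s -inverse option. A sketch assuming the atlas were already in MNI space (which, in this case, it isn’t, so a transform into MNI would be needed first); all filenames are placeholders:

```python
# Hypothetical via-MNI route: register the subject to MNI, invert the
# affine, apply the inverse to an MNI-space atlas. Filenames are made up.
subj2mni = ["flirt", "-in", "subject_T1.nii.gz",
            "-ref", "MNI152_T1_2mm.nii.gz", "-omat", "subj2mni.mat"]
invert = ["convert_xfm", "-omat", "mni2subj.mat",
          "-inverse", "subj2mni.mat"]
to_native = ["flirt", "-in", "atlas_in_mni.nii.gz",
             "-ref", "subject_T1.nii.gz",
             "-applyxfm", "-init", "mni2subj.mat",
             "-interp", "nearestneighbour",   # preserve integer labels
             "-out", "atlas_in_native.nii.gz"]
print(" ".join(invert))
```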

I was coincidentally doing this just recently, but going in the other direction. Anyhow, I think the same approach applies either way.

For completeness, in case it helps, this is what I did (getting everything into MNI space):

The hippocampus masks I used came from https://datadryad.org/resource/doi:10.5061/dryad.gc72v, the paper being: https://www.nature.com/articles/sdata201559#cite1

Those are already in MNI space.

If the voxel sizes are different, you will first have to resample to get them to the desired size. I used nilearn’s resample_to_img for that: http://nilearn.github.io/modules/generated/nilearn.image.resample_to_img.html

Now, since you’ll have a brain template plus the ROIs in the same space and at the same voxel size, you can:

  1. register the template to your subject (this should give you a transformation file)
  2. apply the transformation from step 1 to the mask, bringing it into your subject’s space

Here’s the script I used for my version, from @mgxd (note I haven’t adjusted it to your case, but I think the general logic holds).

import os
from nilearn.image import resample_to_img
from niworkflows.interfaces.mni import RobustMNINormalization
from nipype.interfaces.ants import ApplyTransforms
import utils

def make_nii_gz(file_dir, write_dir):
    # copy brain.mgz into the write dir, convert it to NIfTI with
    # FreeSurfer's mri_convert, then remove the temporary copy
    file_to_cp = os.path.join(file_dir, "brain.mgz")
    cp_path = os.path.join(write_dir, "brain_copy.mgz")
    print(utils.shell("cp {} {}".format(file_to_cp, cp_path).split()))
    brain_nii = os.path.join(write_dir, "brain.nii.gz")
    cmd = "mri_convert {} {}".format(cp_path, brain_nii)
    print(utils.shell(cmd.split()))
    print(utils.shell("rm {}".format(cp_path).split()))

def transform_data(subject_surf_dir, mni_template, roi_name, write_dir):
    subj = subject_surf_dir.split("/")[-1]
    if not os.path.isdir(subject_surf_dir):
        raise Exception("provide full path to subject dir")
    work_dir = os.path.join(write_dir, subj)
    if not os.path.isdir(work_dir):
        os.makedirs(work_dir)
    subj_dir_mri = os.path.join(subject_surf_dir, "mri")
    brain_mgz = os.path.join(subj_dir_mri, "brain.mgz")
    if not os.path.isfile(brain_mgz):
        raise Exception("subject does not have brain.mgz")
    roi_file_path = os.path.join(subj_dir_mri, roi_name)
    if not os.path.isfile(roi_file_path):
        raise Exception("subject does not have {} roi".format(roi_name))
    make_nii_gz(subj_dir_mri, work_dir)
    # resample the ROI onto the anatomical grid ("nearest" preserves the
    # integer labels) and save it to disk
    roi_resampled = resample_to_img(roi_file_path, os.path.join(work_dir, "brain.nii.gz"),
                                    interpolation="nearest")
    resampled_roi_path = os.path.join(work_dir, roi_name.split(".")[0] + "_resampled.nii.gz")
    roi_resampled.to_filename(resampled_roi_path)
    norm = RobustMNINormalization()
    norm.inputs.moving_image = os.path.join(work_dir, "brain.nii.gz")
    norm.inputs.template = "mni_icbm152_linear"
    norm.inputs.template_resolution = 2
    print("running normalization for:\n{}".format(subject_surf_dir))
    res = norm.run()
    # apply the resulting transform to the ROI; nearest-neighbour keeps
    # the labels integer-valued
    applyt = ApplyTransforms(dimension=3, interpolation="NearestNeighbor")
    applyt.inputs.input_image = resampled_roi_path
    applyt.inputs.reference_image = mni_template
    applyt.inputs.transforms = [res.outputs.composite_transform]
    print("running apply transform for:\n{}".format(subject_surf_dir))
    applyt.run()

Rereading your question, I’m not sure what “native” space means here. What I described above (minus the script, which transforms to MNI space) would put everything in the subject’s space, but I think you should be able to adapt it to put things in your native space.

Oh, and utils.shell is:

import subprocess

def shell(cmd_split):
    # run a command (given as a list of tokens) and return its stdout
    # as a list of lines
    process = subprocess.Popen(cmd_split, stdout=subprocess.PIPE)
    out, err = process.communicate()
    return out.decode("utf-8").split("\n")