Using the Juelich atlas with the MNI152NLin6Asym_res-02 template

Dear community,
I have done all my preprocessing and level-1 analyses using the MNI152NLin6Asym_res-02 template. Then, I created an ROI mask in fsleyes by extracting a brain region from the built-in Juelich atlas.

My problem is that, to my understanding, the Juelich atlas is defined in the symmetric linear MNI152 space (i.e., MNI152_T1_2mm), which is slightly different from the non-linear space I used.

My questions:

  1. Is there a Juelich atlas adapted to the MNI152NLin6Asym_res-02 template?

  2. Alternatively, I could use the applywarp command to transform my mask from one space to the other. How do I get the warp file (i.e., the transformation coordinates) to go from the ROI in standard MNI152_T1_2mm space to an ROI in the MNI152NLin6Asym_res-02 template?
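
For reference, if such a warp file existed, I imagine the call would look roughly like this (just a sketch; the --warp file name is a placeholder for exactly the file I am missing):

    # Sketch only: the --warp file is the missing piece (placeholder name)
    applywarp \
        --in=juelich_roi_MNI152_T1_2mm.nii.gz \
        --ref=tpl-MNI152NLin6Asym_res-02_T1w.nii.gz \
        --warp=MNI152lin_to_MNI152NLin6Asym_warp.nii.gz \
        --interp=nn \
        --out=juelich_roi_MNI152NLin6Asym.nii.gz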

Thank you very much,
Lior


Hi @LiorAbramsom,

Isn’t the Juelich atlas shipped with FSL, which would mean that it is defined in the MNI space used by FSL, i.e. the MNI152NLin6Asym space?

FSL says the following:

The atlas contains 52 grey matter structures and 10 white matter structures. This is an update to the data used in Eickhoff’s Anatomy Toolbox v1.5. The atlas is based on the microscopic and quantitative histological examination of ten human post-mortem brains. The histological volumes of these brains were 3D reconstructed and spatially normalised into the space of the MNI single subject template to create a probabilistic map of each area. For the FSL version of this atlas, these probabilistic maps were then linearly transformed into MNI152 space.

https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/Atlases
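
For what it is worth, the FSL copy of the Juelich atlas ships under $FSLDIR/data/atlases, so you can check its grid against the FSL standard template directly (a quick sketch, assuming a standard FSL installation; exact file names may vary between FSL versions):

    ls $FSLDIR/data/atlases/Juelich/
    fslhd $FSLDIR/data/atlases/Juelich/Juelich-maxprob-thr25-2mm.nii.gz | grep -E 'dim|sform'
    fslhd $FSLDIR/data/standard/MNI152_T1_2mm_brain.nii.gz | grep -E 'dim|sform'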

Thank you @jsein for the response!
To my understanding there are several MNI152 templates: the atlases in FSL are in the linear space (I am not sure whether it is symmetric or not), whereas I used the non-linear space.

Do I understand it correctly?

Thanks again

Hi,

I found this summary about the MNI spaces interesting here.

Yes, FSL is using the MNI152NLin6Asym space, so if you took the Juelich atlas from FSL, I would go with that space.


I have been trying to make the most recent versions of the JulichBrain maps available in AFNI, so I am working with that group at Julich under Katrin Amunts to get that done. While I’m not sure about the FSL version, I believe that the JulichBrain 2.9 and now 3.0, distributed through the eBrains site, are the first versions available in MNI 2009c asymmetric template space. The Julich group also makes versions of the atlas that are compatible with the MNI-N27 brain.

The previous versions that had been distributed with the SPM anatomy toolbox (“cytoarchitectonic Eickhoff-Zilles atlases”) were in various spaces depending on the version: the original MNI-152, the 2006 MNI nonlinear template space, or the N27 brain in MNI space; these were typically shifted away from the standard MNI space along two axes (MNI_ANAT space in AFNI-speak). The differences among the various MNI templates are not obvious, and they are tricky to notice for atlases that have only partial brain coverage.


The atlas is undergoing some changes, but this link provides a kind of alpha/beta release version.

https://afni.nimh.nih.gov/pub/dist/atlases/JulichBrain/


Thank you for looking into that information!

I am sorry to go back to this point, but the problem is that when I load the MNI152NLin6Asym brain file and the Juelich ROI mask in fsleyes, I get the warning “displaying images with different orientations/fields of view!”. In contrast, when I load the MNI152_T1_2mm_brain file together with the ROI, I don’t get this warning. This makes me think that there is still a bit of a difference between the space of the ROI and the asymmetric non-linear template (although they look almost identical).
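
If it helps, this is the kind of header check I can run to see where the grids differ (just a sketch; the file names stand in for my own files, and fsleyes seems to show that warning when the dimensions, voxel sizes, or orientation matrices of the two images do not match):

    # Compare the grids of the template and the ROI mask
    fslinfo MNI152NLin6Asym_brain.nii.gz
    fslinfo juelich_roi_mask.nii.gz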

Am I interpreting this warning correctly?

Thank you

Where did you get the Juelich ROI mask and the MNI152NLin6Asym brain files?

If you want to use the MNI152NLin6Asym space, I would rather use the Juelich atlas files provided by FSL.

Also, the resource pointed out by @dglen is interesting. Perhaps you could use the most recent version of the Juelich atlas, defined in MNI152NLin2009cAsym space, and convert it to MNI152NLin6Asym space through the transformation from TemplateFlow: tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.h5

@jsein - using TemplateFlow sounds like a good idea for this. The newer version of the JulichBrain is generally based on more subjects and has more complete brain coverage than the older versions. It does not include the cerebellum atlases provided by Diedrichsen, so that would require either the older atlas that does include them or a separate download of the SUIT cerebellum atlas.

It looks like there is a mismatch between the atlas mask and the template, so the mask may have come from a different place. Even if it is a legitimate or newer version, you may need to match the orientation and grid by resampling onto the grid of the template or onto the grid of the particular dataset used for the analysis.
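
For example, that grid matching could be done in AFNI along these lines (a rough sketch; file names are placeholders):

    # Resample the mask onto the template grid, NN to preserve mask/label values
    3dresample -master tpl-MNI152NLin6Asym_res-02_T1w.nii.gz \
               -rmode NN \
               -input juelich_roi_mask.nii.gz \
               -prefix juelich_roi_mask_resampled.nii.gz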

Dear @jsein and @dglen,

I would be interested in converting the Juelich atlas in MNI152NLin2009cAsym space (downloaded from here) to the MNI152NLin6Asym space, to match my fMRI data. Can you suggest a way to do this conversion using the tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.h5 transformation file? In which cases can I use tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.mat instead?

Sorry for my naive questions, and thank you for your help.

Valeria

Hi @valeria ,

If I am not mistaken, tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.mat is an affine transform (linear transform with 12 degrees of freedom) while tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.h5 is a “composite” transform, including an affine and a warping field.

I would use the composite transform in this way:

antsApplyTransforms --float --default-value 0  \
 		--input JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_2009C_NONL_ASYM.pmaps.nii.gz -d 3 -e 3 \
 		--interpolation LanczosWindowedSinc \
 		--output JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_NONL6_ASYM.pmaps.nii.gz \
 		--reference-image tpl-MNI152NLin2009cAsym_res-01_T1w.nii.gz \
 		-t tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.h5

I hope that helps.

Hi @jsein,

thanks very much for your help.
I tried to run antsApplyTransforms as you specified, and this is the result for one ROI:

This is instead the same ROI from the original atlas (in MNI152NLin2009cAsym space) on the corresponding template:

I also tried to run antsApplyTransforms changing the --reference-image and --transform arguments as follows:

antsApplyTransforms --float --default-value 0  \
 		--input JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_2009C_NONL_ASYM.pmaps.nii.gz -d 3 -e 3 \
 		--interpolation LanczosWindowedSinc \
 		--output JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_NONL6_ASYM.pmaps.nii.gz \
 		--reference-image tpl-MNI152NLin6Asym_res-01_T1w.nii.gz \
        -t tpl-MNI152NLin6Asym_from-MNI152NLin2009cAsym_mode-image_xfm.h5

and got similar results as before.
To me it seems that the conversion is not correct, but I am not sure whether some additional steps are needed after the conversion.

Thanks again!

Valeria

Hi @valeria ,

Thank you for showing this. I tried it myself, and it turns out that my answer was indeed wrong, sorry about that!
The command I wrote was not handling 4D images such as the Julich atlas image (at least with my ANTs version), so I had to split the 4D image into 3D images to be able to see the output.
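
In case it is useful, the splitting and merging themselves can be done along these lines (a rough sketch around the same files as above; the interpolation choice is a separate question, see below):

    # Split the 4D probability maps into one 3D volume per area
    fslsplit JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_2009C_NONL_ASYM.pmaps.nii.gz vol_ -t

    # Warp each 3D volume separately, then merge the results back into a 4D file
    for v in vol_*.nii.gz; do
        antsApplyTransforms -d 3 --float --default-value 0 \
            --input "$v" \
            --interpolation Linear \
            --reference-image tpl-MNI152NLin6Asym_res-01_T1w.nii.gz \
            --output "warped_${v}" \
            -t tpl-MNI152NLin6Asym_from-MNI152NLin2009cAsym_mode-image_xfm.h5
    done
    fslmerge -t JULICH_pmaps_warped_4D.nii.gz warped_vol_*.nii.gz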

As for the aspect of the warped atlas, this is due to the interpolation method chosen (LanczosWindowedSinc), which may not be the best for this kind of image.

Besides, it looks like the warping field coming from the tpl-MNI152NLin6Asym_from-MNI152NLin2009cAsym_mode-image_xfm.h5 composite transform does not work well. I tried to verify it by directly warping the MNI152NLin2009cAsym template to the MNI152NLin6Asym template, and the result is not what I expected: instead of shrinking a bit (the MNI152NLin2009cAsym template is slightly bigger than the MNI152NLin6Asym template), the warped MNI152NLin2009cAsym template was even bigger than its original size!

Finally, I was not able to use the tpl-MNI152NLin6Asym_from-MNI152NLin2009cAsym_mode-image_xfm.mat file.

There are still many things to understand on this topic!

Let’s continue testing and learning about this and hopefully we will find a solution soon!

Dear @jsein,

Thank you so much for taking the time to test this. The information you provided has been extremely helpful.

In my case, by using the -d 3 and -e 3 options, antsApplyTransforms was able to handle the 4D atlas image (it applied the transform to one 3D volume at a time, progressing along the 4th dimension). However, the important thing to know is that tpl-MNI152NLin6Asym_from-MNI152NLin2009cAsym_mode-image_xfm.h5 is not working properly.

I’ll try to perform other conversions using the atlas in a different space, and I’ll test different interpolation methods. If anything interesting comes up, I will post an update.
Hopefully, by trial and error, we will eventually find a solution.

Once again, thank you for your help!

Valeria

On the other hand, I tried to warp the MNI152NLin6Asym template into the MNI152NLin2009cAsym template using the tpl-MNI152NLin2009cAsym_from-MNI152NLin6Asym_mode-image_xfm.h5 composite transform, and this one works as expected!

One could think of inverting that transform to fit your needs, but from what I read on other forums, it is not easily feasible.

While I don’t know how to invert the transformation with TemplateFlow or with ANTs, it is simple to do the equivalent with AFNI.

First, with the NLin6Asym template from the TemplateFlow.org site, one can compute the affine and nonlinear transformation to align it with the 2009c template. Here the NLin6 template is simply treated as any other anatomical dataset for alignment:

  @SSwarper -base ~/abin/MNI152_2009_template_SSW.nii.gz \
  -input tpl-MNI152NLin6Asym_res-01_T1w.nii.gz \
  -odir sswMNINL6_to_2009c  -subid MNINL6

Then apply the transformation to the desired atlas to bring it into alignment with the NLin6 template. Here I used the maximum probability map of 157 regions. To move this kind of dataset, use nearest neighbor interpolation (NN). The dataset has labels in its header, so it can make use of the whereami feature and the overlay labels in AFNI. The second command below copies the labels over from the 2009c atlas to the new NLin6 atlas.

  3dNwarpApply -iwarp -interp NN  \
    -prefix JulichBrain3_MNINL6.nii.gz \
    -source ~/JulichBrain/maximum_probability_maps_MPMs_157areas/JulichBrainAtlas_3.0_areas_MPM_b_N10_nlin2ICBM152asym2009c_public_11035603b4744231e17e87fd8ebcaf1a.nii.gz \
    -nwarp  'sswMNINL6_to_2009c/anatQQ.MNINL6_WARP.nii sswMNINL6_to_2009c/anatQQ.MNINL6.aff12.1D'

  3drefit -cmap INT_CMAP \
    -copytables \
 ~/JulichBrain/maximum_probability_maps_MPMs_157areas/Julich_MNI2009c.nii.gz \
    JulichBrain3_MNINL6_labeled.nii.gz

The transformations and the labeled dataset are available here:
https://afni.nimh.nih.gov/pub/dist/atlases/JulichBrain/MNINL6.tgz


Great answer, thank you @dglen !

Very nice to learn about the whole process and to discover these commands in AFNI!

Following up on the tests with ANTs, I ended up using the same strategy as @dglen:

Calculate the composite transform from 2009cAsym to NLin6:

antsRegistration --collapse-output-transforms 1 --dimensionality 3 --float 1 \
	--initialize-transforms-per-stage 0 --interpolation LanczosWindowedSinc \
	--output [ ants_2009cAsym_to_NLin6, ants_2009cAsym_to_NLin6_Warped.nii.gz ] \
	--transform SyN[ 0.1, 3.0, 0.0 ]  --convergence [ 100x70x50x20, 1e-06, 10 ] \
	--smoothing-sigmas 3.0x2.0x1.0x0.0vox --shrink-factors 8x4x2x1 --use-histogram-matching 1 \
	--winsorize-image-intensities [ 0.005, 0.995 ]  --write-composite-transform 1 \
	--metric Mattes[ tpl-MNI152NLin6Asym_res-01_T1w.nii.gz, tpl-MNI152NLin2009cAsym_res-01_T1w.nii.gz, 1, 56, Regular, 0.25 ] \
	--convergence [ 100x100, 1e-06, 20 ] --smoothing-sigmas 2.0x1.0vox --shrink-factors 2x1 --use-histogram-matching 1 

Apply the transform to the atlas:

antsApplyTransforms --float --default-value 0 -d 3 -e 3 \
	--input hbp-d000001_jubrain-cytoatlas_pub-PM-collections-29-julichbrain-2.9-pmaps-4d/JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_2009C_NONL_ASYM.pmaps.nii.gz \
	--interpolation NearestNeighbor \
	--output hbp-d000001_jubrain-cytoatlas_pub-PM-collections-29-julichbrain-2.9-pmaps-4d/JULICH_BRAIN_CYTOARCHITECTONIC_MAPS_2_9_MNI152_NONL6_ASYM.pmaps.nii.gz \
	--reference-image tpl-MNI152NLin6Asym_res-01_T1w.nii.gz \
	-t ants_2009cAsym_to_NLin6Composite.h5

Here is the result on tpl-MNI152NLin6Asym_res-01_T1w.nii.gz:

For comparison, here is the atlas at the same position in tpl-MNI152NLin2009cAsym:

It doesn’t look perfect yet; some additional work may be needed.
The solution from @dglen in AFNI seems to be more accurate.

Thank you both, @jsein and @dglen!
I will definitely try both ANTs and AFNI as you suggested, and provide an update.
Your help has been extremely valuable, thanks very much.

Valeria

Hi @jsein, @dglen, and valeria! (sorry, new user, I can’t mention more than 2 users)

We are trying to do the same thing but with a different atlas: the MMP 1.0 MNI projections.

Indeed we have several atlases that we will need to register from 2009c to MNI152NLin6Asym.

We’ve tried running @jsein’s command using ANTs 2.5.0, but it does not run. There are a couple of duplicated arguments (smoothing-sigmas, shrink-factors, etc.).

In the end, I removed the second occurrence of each and ran this:

antsRegistration \
    -v 1 \
    --collapse-output-transforms 1 \
    --dimensionality 3 \
    --float 1 \
    --initialize-transforms-per-stage 0 \
    --interpolation LanczosWindowedSinc \
    --output '[ ants_2009cAsym_to_NLin6, ants_2009cAsym_to_NLin6_Warped.nii.gz ]' \
    --transform 'SyN[ 0.1, 3.0, 0.0 ]'  \
    --convergence [ 100x70x50x20, 1e-06, 10 ] \
    --smoothing-sigmas 3.0x2.0x1.0x0.0vox \
    --shrink-factors 8x4x2x1 \
    --use-histogram-matching 1 \
    --winsorize-image-intensities '[ 0.005, 0.995 ]' \
    --write-composite-transform 1 \
    --metric 'Mattes[ tpl-MNI152NLin6Asym_res-01_T1w.nii.gz, tpl-MNI152NLin2009cAsym_res-01_T1w.nii.gz, 1, 56, Regular, 0.25 ]'

However, we still find some issues with the atlas after applying the transform. In this image you can see, on the left, the MNI152NLin6Asym template and the warped atlas; on the right, the MNI152NLin2009cAsym template and the original atlas.

The concern comes mostly from the fact that we are “losing” voxels; in some ROIs, more than 30% of them.
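
For reference, this is roughly how we compare per-ROI voxel counts before and after warping (just a sketch; label value 1 is an example, and the file names match the antsApplyTransforms call further below):

    # Count voxels for one example label (here label 1) in the original and warped atlas
    for img in MMP_in_MNI_corr.nii.gz ants_warped_6th_glasser.nii.gz; do
        fslmaths "$img" -thr 1 -uthr 1 -bin tmp_roi
        echo "$img: $(fslstats tmp_roi -V)"
    done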

Is there any parameterisation we can adjust to improve the accuracy of the transform?

FYI: we are still running @dglen’s approach; I’ll update as soon as it’s done.

Hi @Fede_Raimondo ,

Did you apply the transformation with antsApplyTransforms with --interpolation NearestNeighbor ?

Yes, like this:

antsApplyTransforms \
    -r tpl-MNI152NLin6Asym_res-01_T1w.nii.gz \
    -i MMP_in_MNI_corr.nii.gz \
    -t MNI152NLin2009cAsym_2_MNI152NLin6.h5 \
    -n NearestNeighbor \
    -o ants_warped_6th_glasser.nii.gz