Transform FreeSurfer mesh coordinates to volume coordinates

Hello neuro friends!

I was given an anatomical template with (a) volume images and (b) surface meshes extracted from these volumes with FreeSurfer. The volumes are in NIfTI format, the meshes in GIFTI format.

Opening the volume and meshes in Freeview would lead you to think that they are in the same coordinate system, while doing the same thing in FSLeyes or other software suggests otherwise :scream_cat: (cf. screenshots 1 & 2).
I tried to modify the affine matrix in the GIFTI header, but it seems that these tools don’t take it into account (same for nibabel and, consequently, nilearn).

I am very confused with what these coordinate systems are and would love to find a clear explanation about all this! In particular:

  • what space do FreeSurfer mesh coordinates live in? (I read it’s RAS and that it isn’t centered the same way other coordinate systems are; see screenshot 3, showing point (0,0,0) in my images)
  • what space do NIfTI volume images live in?
  • where can I find the affine transform to be applied to my FreeSurfer coordinates that will take them to the same space as my volume images? It feels like I need a simple translation
  • why don’t these tools take GIFTI headers into account?

Sorry if these questions don’t make sense, I’m doing my best to untangle all this knowledge :sweat_smile:

[1] Freeview screenshot

[2] NiiVue screenshot; I get a similar result with FSLeyes

[3] FSLeyes screenshot showing point (0,0,0)

I eventually found RAS-related information in the header of the aforementioned GIFTI files and was able to derive a translation which seems to bring my data into the same space :tada: (I am still not sure what space it is though haha, so I’m still interested in an explanation)

I wrote a Python gist in case someone else needs this.
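(For context, the kind of fix such a gist implements can be sketched as below. This assumes FreeSurfer-style VolGeomC_{R,A,S} keys are present in the pointset metadata; the key names follow FreeSurfer's convention and the values here are placeholders.)

```python
import numpy as np

def translate_to_scanner(coords, meta):
    """Shift tkregister-space vertices by the C_RAS stored in GIFTI metadata.

    `meta` is the dict-like metadata of the NIFTI_INTENT_POINTSET array;
    the VolGeomC_* key names follow FreeSurfer's convention.
    """
    cras = np.array([float(meta[f'VolGeomC_{ax}']) for ax in 'RAS'])
    return coords + cras

# Toy check: a vertex at the tkregister origin lands on C_RAS itself.
meta = {'VolGeomC_R': '4.16484833',
        'VolGeomC_A': '9.03939819',
        'VolGeomC_S': '-8.29063416'}
verts = np.zeros((1, 3))
shifted = translate_to_scanner(verts, meta)
```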


@alexisthual would you mind sharing your dataset with me so we can update NiiVue to support this? A voxel plus mesh dataset sent to my institutional email would be great.


FreeSurfer-generated GIFTIs have two affines. One is the GIFTI-specified one, and the other is a GIFTI-metadata encoding of the FreeSurfer volume geometry (VolGeom{X,Y,Z,C}_{R,A,S}, from memory; this might not be exactly right). FreeSurfer knows to apply the VolGeom affine; pretty much nobody else does.

You can pass the --to-scanner option to mris_convert to have FreeSurfer update the coordinates with the VolGeom affine, but this also leaves the VolGeom metadata in place, so everybody but FreeSurfer will handle the file correctly.

The way to get a generic GIFTI from a FreeSurfer surface appears to be to use mris_convert --to-scanner and then remove the VolGeom metadata from the file yourself.

I just read your gist, @alexisthual. You’re doing the same thing that HCP Pipelines and fMRIPrep have done, which is just applying the translation encoded in C_RAS. If the image is oblique, you can get a non-identity XYZ_RAS matrix, in which case this will fail.
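In the general case the oblique pitfall can be avoided by composing the two vox2ras affines instead of adding a translation. A sketch (function and variable names are mine, and the demo affines are placeholders):

```python
import numpy as np

def tkr_to_scanner(verts_tkr, vox2ras_scanner, vox2ras_tkr):
    """General tkregister -> scanner map, routed through voxel space.

    Reduces to a pure C_RAS translation only when both affines share the
    same rotation/zoom part, i.e. the image is not oblique.
    """
    xfm = vox2ras_scanner @ np.linalg.inv(vox2ras_tkr)
    homog = np.column_stack([verts_tkr, np.ones(len(verts_tkr))])
    return (homog @ xfm.T)[:, :3]

# Non-oblique demo: the scanner affine is the tkregister affine plus a
# pure translation, so the map collapses to adding that translation.
T = np.array([[-1.0, 0.0, 0.0, 128.0],
              [0.0, 0.0, 1.0, -128.0],
              [0.0, -1.0, 0.0, 128.0],
              [0.0, 0.0, 0.0, 1.0]])
S = T.copy()
S[:3, 3] += [4.0, 9.0, -8.0]
verts = np.array([[0.0, 0.0, 0.0], [10.0, -5.0, 2.0]])
moved = tkr_to_scanner(verts, S, T)
```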

FWIW as of fMRIPrep 22.0, we now use the method I describe above: FIX: Use `mris_convert --to-scanner`, and update normalization step by effigies · Pull Request #295 · nipreps/smriprep · GitHub


In the FreeSurfer dev branch, mris_convert now works as follows:

  1. 'mris_convert --to-scanner':
    NIFTI_INTENT_POINTSET is in scanner space
    = the transform matrix goes from scanner space to FreeSurfer tkregister space
    = NIFTI_XFORM_UNKNOWN (FreeSurfer tkregister space)
  2. 'mris_convert' w/o --to-scanner:
    NIFTI_INTENT_POINTSET is in FreeSurfer tkregister space
    = NIFTI_XFORM_UNKNOWN (FreeSurfer tkregister space)
    = the transform matrix goes from FreeSurfer tkregister space to scanner space
  3. freeview loads the surface in scanner space correctly (MRISread() converts surface XYZ coords from scanner space to tkregister space if mris->useRealRAS == 1)

A little late to the party here, but FSLeyes is aware of this coordinate system - you can project any mesh overlay (VTK, GIFTI, lh.pial, etc.) into it by setting the Mesh coordinate space to Freesurfer coordinates. You do need to associate the mesh with a reference NIfTI/MGH image though, as the reference image's volume dimensions and scaling parameters must be known in order to generate the mesh affine transformation. This is documented on the FreeSurfer wiki: CoordinateSystems - Free Surfer Wiki and on Graham Wideman's website: Understanding FreeSurfer Coordinates

And FSLeyes uses this function from the fslpy library to generate the affine: — fslpy 3.14.1 documentation

It is worth noting that for GIFTI files FreeSurfer embeds the proprietary VolGeom{X,Y,Z,C}_{R,A,S} transform. Curiously, this transform also exists in the undocumented FreeSurfer mesh footer. While most tools only read the FreeSurfer mesh header (and therefore do not see a transform), nibabel can read these:

import nibabel as nb
coords, faces, volume_info = nb.freesurfer.read_geometry('lh.pial', read_metadata=True)
print(volume_info['cras'])
[ 4.16484833  9.03939819 -8.29063416]

Thanks @neurolabusc, yes - FSLeyes will automatically use the appropriate transform for FreeSurfer meshes (e.g. lh.pial), but I have learned through these posts that it should be possible to detect FreeSurfer-generated GIFTIs by checking the metadata, and to set the FreeSurfer coordinate system as the default automatically. I’ll update FSLeyes to do just that.

But from my understanding, it shouldn’t actually be necessary to read the VolGeom fields, as the mesh ↔ volume affine transformation (the voxel ↔ RAS/tkr/torig transformation) can always be generated from the volume dimensions and scales (and, within FSLeyes, a mesh always has to be associated with a reference NIfTI/MGZ in order to be positioned correctly).
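That derivation is short enough to sketch: the tkregister vox2ras depends only on the volume dimensions and voxel sizes (nibabel exposes the same matrix as MGHHeader.get_vox2ras_tkr()):

```python
import numpy as np

def vox2ras_tkr(shape, zooms):
    """FreeSurfer tkregister vox2ras, built from dims and voxel sizes only."""
    nx, ny, nz = shape
    dx, dy, dz = zooms
    return np.array([
        [-dx, 0.0, 0.0,  dx * nx / 2.0],
        [0.0, 0.0,  dz, -dz * nz / 2.0],
        [0.0, -dy, 0.0,  dy * ny / 2.0],
        [0.0, 0.0, 0.0,  1.0],
    ])

# For a 256^3, 1 mm conformed volume, the volume centre maps to the
# tkregister origin.
T = vox2ras_tkr((256, 256, 256), (1.0, 1.0, 1.0))
centre_ras = T @ np.array([128, 128, 128, 1])
```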

For what it’s worth, I recently dug through FreeSurfer docs/code and wrote up the conversions between tkras and scanner spaces: FreeSurfer coordinate spaces and translation · nipy/nibabel · Discussion #1249 · GitHub


@effigies your write-up is clear and very timely for my efforts. @yhuang would it be possible to include some of @effigies’ insights into the FreeSurfer pages to help document these features? I am always impressed by how much wisdom has been incorporated into nibabel, but it is often implicit, and it is great to see the underlying theory described explicitly.