For my analysis, I am using ?h.aparc.a2009s.stats files as input, generated by aparcstats2table. In the downstream analysis I want to plot a statistical surface map. For this, I need to map each region of the Destrieux a2009s atlas to the corresponding value derived from my statistical analysis, and I also need a mapping from the atlas ids to their labels.
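The region-to-value step can be sketched with a vectorized lookup table. The ids, values, and array shapes below are hypothetical placeholders, not the real atlas:

```python
import numpy as np

# Sketch of painting per-region statistics onto vertices. `parc` stands in
# for a vertex-wise array of atlas ids, `stats` for an id -> statistic
# mapping; both are made-up placeholders.
parc = np.array([0, 3, 3, 7, 0])
stats = {3: 1.5, 7: -0.2}

lut = np.full(max(stats) + 1, np.nan)  # lookup table indexed by atlas id
for region_id, value in stats.items():
    lut[region_id] = value

surf_map = lut[parc]  # per-vertex statistical map; unmapped ids become NaN
```

The fancy-indexing lookup avoids a Python loop over vertices, and any region without a statistic shows up as NaN, which most surface plotting functions treat as transparent/background.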
This seems (?) to be the corresponding paper, which states that there are 74 unique regions. In Table 1 there, the IDs range from 1 to 74 in ascending order and without gaps, as you would expect.
But if we load the .annot files from the FreeSurfer installation, the ID 42 is missing. It took me a while to figure this out, because at first I simply created a .csv file myself from the table in the paper (with one ID column and one label column), naively assuming that the ?h.aparc.a2009s.annot files in the FreeSurfer installation would contain values from 1 to 74, with -1 denoting the background. But since 42 is missing, the id-to-label mapping was of course wrong.
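To make the misalignment concrete, here is a small synthetic illustration; the ids are made up to mirror the situation described above (values up to 75, 42 absent, naive table covering 1 to 74):

```python
import numpy as np

# Hypothetical numbers only: suppose the .annot contains ids -1..75 with
# 42 absent, while the naive table built from the paper maps ids 1..74 to
# labels consecutively.
annot_ids = np.array([i for i in range(-1, 76) if i != 42])
naive_table = {i: f"label_{i}" for i in range(1, 75)}  # placeholder labels

# Every .annot id above 42 is then off by one relative to the naive table,
# and id 75 has no table entry at all:
missing_from_table = [i for i in annot_ids if i > 0 and i not in naive_table]
```

One way around guessing the numbering is to take the label names from the .annot file itself, e.g. nibabel's `nibabel.freesurfer.io.read_annot(path)` returns the per-vertex labels, the color table, and the region names in matching order.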
Does somebody know what's going on here? Nilearn also seems to offer the Destrieux atlas via nilearn.datasets.fetch_atlas_surf_destrieux(); maybe this is a better alternative?
import numpy as np
from nilearn.surface import load_surf_data
from nilearn.datasets import fetch_atlas_surf_destrieux

# We can load the .annot files from the FreeSurfer installation folder. Replace
# the path with your installation path if you want to reproduce this. Both arrays
# contain values from -1 to 75, where -1 stands for background (?), but id 42 is
# missing; therefore we end up at 75 and not at 74. If we used Table 1 of the
# paper to map ids to labels, everything would be messed up.
map_left = load_surf_data('/zi/apps/opt/freesurfer_7.4.1/subjects/fsaverage/label/lh.aparc.a2009s.annot')
map_right = load_surf_data('/zi/apps/opt/freesurfer_7.4.1/subjects/fsaverage/label/rh.aparc.a2009s.annot')
unique_ids_left = np.unique(map_left)
unique_ids_right = np.unique(map_right)

# Or we can load from nilearn. Does 1 now stand for background? Here the arrays
# range from 1 to 75, and 42 is not missing.
destrieux = fetch_atlas_surf_destrieux()
labels = [b.decode('utf-8') for b in destrieux['labels']]  # why is this not a simple list, btw?
map_left = destrieux['map_left']
map_right = destrieux['map_right']
unique_ids_left = np.unique(map_left)
unique_ids_right = np.unique(map_right)
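If the nilearn route works out, the id-to-label mapping would just be the list index, assuming (as the arrays above suggest) the index in destrieux['labels'] matches the id in map_left/map_right. A minimal sketch with a synthetic stand-in for the labels list (nilearn returns bytes, hence the decode):

```python
# Synthetic stand-in for destrieux['labels']; the real list is longer and
# these names are only illustrative of its format.
raw_labels = [b'Unknown', b'G_and_S_frontomargin', b'G_and_S_occipital_inf']

# The position in the list corresponds to the id in the map arrays,
# so the mapping is a plain enumerate:
id_to_label = {i: name.decode('utf-8') for i, name in enumerate(raw_labels)}
```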