Mapping Voxel number to Brain region

@emdupre Oh, so that is probably the cause of the misunderstanding - I thought that these 78 regions were probably extracted with a standard mask or package. The data are just a 2D .npy array where each row represents a TR.
Unfortunately, I don’t have the original code. I will try to see if I can get access to it or something similar.
Thank you very much for the help!

(If you have any idea about the error in fetch_atlas_aal, that would still be great :slight_smile: )

> Unfortunately, I don’t have the original code. I will try to see if I can get access to it or something similar.
> Thank you very much for the help!

Let us know what you find, @o_aug !

> (If you have any idea about the error in fetch_atlas_aal, that would still be great :slight_smile: )

I think this might be because you’re running with Python 3.9. Specifically, looking at the error message:

AttributeError: 'xml.etree.ElementTree.Element' object has no attribute 'getiterator'

getiterator() was deprecated for a long time and finally removed in Python 3.9. I think we’d need to open an issue to change these lines (and maybe add Python 3.9 testing to our build matrix?!).
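
For reference, the fix is essentially swapping getiterator() for iter(); here’s a quick sketch against a made-up XML snippet (not the real AAL label file):

import xml.etree.ElementTree as ET

# toy XML, just to illustrate the method change
root = ET.fromstring("<atlas><label index='2001'>Precentral_L</label></atlas>")

# pre-3.9 spelling, removed in Python 3.9 (raises the AttributeError above):
# for elem in root.getiterator():
#     print(elem.tag)

# replacement, available since Python 3.2:
for elem in root.iter():
    print(elem.tag, elem.text)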

WDYT @NicolasGensollen ?

Good catch @emdupre! :+1: I can indeed replicate this with Python 3.9.
I’ll open an issue and work on a fix tomorrow.
You’re also right about adding Python 3.9 to the tests, I should have added it already! :sweat:


@emdupre @NicolasGensollen I was able to get a mapping of each voxel to (x, y, z) coords, and a 4x4 matrix of doubles that, to my understanding, should map these to MNI space.
I found this function that looks relevant:
https://nilearn.github.io/modules/generated/nilearn.image.coord_transform.html
IIUC, I should convert each voxel to (x, y, z), and then use this matrix and function to get coords in MNI space.
But, can you please explain how the matrix works? I would hate to use something I don’t understand…

Re the AAL atlas error: by downgrading from Python 3.9 I was able to fetch the AAL atlas, but I have a question about using it: Let’s say I want to use a specific region label (e.g. Frontal_Mid_L) as a map and plot it on a brain image, how can it be done?
I looked over the examples and docs of nilearn and didn’t find a way to plot specific labels.
I suppose I should use plot_img or plot_roi, but I couldn’t find how to get the image of the specific region from the atlas (after loading the NIfTI file from [“maps”]).

Thank you very much!

> But, can you please explain how the matrix works? I would hate to use something I don’t understand…

If I understand correctly, you’re referring to an affine matrix? In which case, I’d strongly recommend this excellent explainer from the Nibabel docs: Neuroimaging in Python — NiBabel 3.2.0 documentation
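
To make it concrete, here’s a minimal sketch of how such a 4x4 affine maps voxel indices to MNI coordinates (the filename and voxel indices below are just placeholders, and I’m assuming the matrix you were given is this kind of voxel-to-world affine):

import numpy as np
import nibabel as nib
from nilearn.image import coord_transform

img = nib.load('my_image.nii.gz')  # placeholder path; or build your 4x4 array directly
affine = img.affine                # 4x4 voxel-to-world (here, MNI) matrix

# voxel indices (i, j, k) -> world coordinates (x, y, z) in mm
i, j, k = 10, 20, 30
x, y, z = coord_transform(i, j, k, affine)
print((x, y, z))

# under the hood this is just: [x, y, z, 1] = affine @ [i, j, k, 1]
print(affine @ np.array([i, j, k, 1]))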

> Let’s say I want to use a specific region label (e.g. Frontal_Mid_L) as a map and plot it on a brain image, how can it be done?

You should be able to access the label, which will tell you the associated value in the niimg (labels are paired with their values in increasing order). As you suggested, you could then use plot_roi to visualize it and confirm its location!

On the second point, here’s some example code:

import numpy as np
import nibabel as nib
from nilearn import datasets, plotting
from nilearn.image import math_img

aal = datasets.fetch_atlas_aal()
assert aal.labels[6] == 'Frontal_Mid_L'

# list the valid ROI values present in the atlas image:
data = nib.load(aal.maps).get_fdata()
print(np.unique(data))

# grab the ROI matching aal.labels[6] ('Frontal_Mid_L'):
# 2201 is the seventh non-zero value (0 is the background)
roi = math_img("img == 2201", img=aal.maps)
plotting.plot_roi(roi)

@emdupre Thanks, it works and I do get the ROI. Can you please explain why you chose img == 2201 for this ROI? Let’s say I now want the seventh and the 31st ROI, what should be passed instead of 2201?

If you run the code snippet you can see the full list of valid values in data; 2201 is simply the value at index 6 of that list once the initial 0 entry (the value for the background) is discarded, matching aal.labels[6]. So you’d just subset the np.unique output appropriately:

import numpy as np
import nibabel as nib
from nilearn import datasets

aal = datasets.fetch_atlas_aal()
data = nib.load(aal.maps).get_fdata()
values = np.unique(data)
values = values[1:]  # drop the initial 0, i.e. discard the background

for label, roi in zip(aal.labels, values):
    print(f'{label} has a value of {roi}')
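
So for the seventh and the 31st ROIs specifically, something like this should work (counting from 1 with the background excluded, they sit at indices 6 and 30):

import numpy as np
import nibabel as nib
from nilearn import datasets, plotting
from nilearn.image import math_img

aal = datasets.fetch_atlas_aal()
data = nib.load(aal.maps).get_fdata()
values = np.unique(data)[1:]  # drop the 0 background

# 7th and 31st ROIs (counting from 1, background excluded) -> indices 6 and 30
for idx in (6, 30):
    print(aal.labels[idx], int(values[idx]))
    roi = math_img(f"img == {int(values[idx])}", img=aal.maps)
    plotting.plot_roi(roi, title=aal.labels[idx])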

Great, it’s working, thanks!