@emdupre Oh, that is probably the cause of the misunderstanding: I assumed the 78 extracted regions came from a standard mask or package. The data are just a 2D .npy array in which each row represents a TR.
Unfortunately, I don’t have the original code. I will try to see if I can get access to it or something similar.
Thank you very much for the help!
(If you have any idea about the error in fetch_atlas_aal, that would still be great!)
Good catch @emdupre ! I can indeed replicate with Python 3.9.
I’ll open an issue and work on a fix tomorrow.
You’re also right about adding Python 3.9 to the tests; I should have added it already!
@emdupre @NicolasGensollen I was able to get a mapping of each voxel to (x, y, z) coordinates, plus a 4x4 matrix of doubles that, to my understanding, should map these to MNI space.
I found this function that looks relevant: https://nilearn.github.io/modules/generated/nilearn.image.coord_transform.html
IIUC, I should convert each voxel to (x, y, z) and then use this matrix with that function to get coordinates in MNI space.
But, can you please explain how the matrix works? I would hate to use something I don’t understand…
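In case it helps with intuition: the 4x4 affine maps homogeneous voxel indices (i, j, k, 1) to world (MNI) coordinates, where the upper-left 3x3 block holds scaling/rotation and the last column holds the translation. Here is a minimal pure-NumPy sketch of that computation (the affine values are made up for illustration; nilearn.image.coord_transform does the same thing for you):

```python
import numpy as np

# Hypothetical 4x4 affine: 3 mm isotropic voxels, translated so that
# voxel (26, 37.33, 16.67) would sit at the MNI origin.
affine = np.array([
    [3.0, 0.0, 0.0,  -78.0],
    [0.0, 3.0, 0.0, -112.0],
    [0.0, 0.0, 3.0,  -50.0],
    [0.0, 0.0, 0.0,    1.0],
])

def voxel_to_mni(i, j, k, affine):
    """Apply the affine to the homogeneous voxel index (i, j, k, 1)."""
    x, y, z, _ = affine @ np.array([i, j, k, 1.0])
    return x, y, z

print(voxel_to_mni(26, 31, 17, affine))  # -> (0.0, -19.0, 1.0)
```

With a real image you would take the matrix from the image header (e.g. `img.affine` in nibabel) and pass it straight to coord_transform.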
Re the AAL atlas error: by downgrading from Python 3.9 I was able to fetch the AAL atlas, but I have a question about using it. Let’s say I want to use a specific region label (e.g. Frontal_Mid_L) as a map and plot it on a brain image; how can that be done?
I looked over the examples and docs of nilearn and didn’t find a way to plot specific labels.
I suppose I should use plot_img or plot_roi, but I couldn’t find how to extract the image of a specific region from the atlas (after loading the NIfTI file via [“maps”]).
> Let’s say I want to use a specific region label (e.g. Frontal_Mid_L) as a map and plot it on a brain image, how can it be done?
You should be able to access the label, which will tell you the associated value in the niimg (each label is paired with its associated map, in increasing order). As you suggested, you could then use plot_roi to visualize it and confirm its location !
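To make the masking step concrete, here is a pure-NumPy sketch of selecting one region by its value (the toy array and the value 2112 are stand-ins for illustration; with the real atlas you would load the data from aal.maps):

```python
import numpy as np

# Toy stand-in for the AAL label volume; with the real atlas this would be
# nib.load(aal.maps).get_fdata()
data = np.zeros((4, 4, 4))
data[1:3, 1:3, 1:3] = 2112  # pretend these voxels belong to Frontal_Mid_L

# Binary mask keeping only the region of interest
roi_mask = (data == 2112).astype(np.int8)

print(roi_mask.sum())  # number of voxels in the region -> 8
```

You could then wrap the mask back into an image with nilearn.image.new_img_like (passing the atlas image as reference) and hand that to plot_roi.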
@emdupre Thanks, it works and I do get the ROI. Can you please explain why you chose img == 2112 for the sixth ROI? Let’s say I now want the seventh or the 31st ROI; what should be passed instead of 2112?
If you run the code snippet you can see the full list of valid values in data; 2112 simply corresponds to the 6th item in the list after discarding the initial 0 entry (the value for the background). So you’d just subset the np.unique output appropriately:
import numpy as np
import nibabel as nib
from nilearn import datasets

aal = datasets.fetch_atlas_aal()
data = nib.load(aal.maps).get_fdata()
values = np.unique(data)
values = values[1:]  # drop 0, i.e. discard the background
for label, roi in zip(aal.labels, values):
    print(f'{label} has a value of {roi}')