Checking that parcellation was successful

Hello, I have a dataset in BIDS format after fMRIPrep, and I want to extract individual parcellated time series according to the Schaefer 400 atlas.

My questions are:

  1. After I finish, what is the easiest way to verify that the process is correct (i.e., that each node's activity corresponds to the average of the voxels in that node's region)?
  2. I'm not getting any named labels, only an array of shape (400, TRs). I found some named labels on Schaefer's GitHub page, so I guess the numbers there correspond to the order of the array?
  3. How do I verify the voxel size in mm, both in my own data and in the Schaefer parcellation map that I download?

My script currently:

from nilearn.input_data import NiftiLabelsMasker
from nibabel import processing
from nilearn.masking import unmask, _apply_mask_fmri
from nilearn import datasets

import numpy as np
import matplotlib.pyplot as plt
import os
import nibabel as nib

# fetch the Schaefer 2018 atlas and build a masker that resamples the atlas to the data
dataset = datasets.fetch_atlas_schaefer_2018(data_dir=r'D:\users\alex')
atlas_filename = dataset.maps
masker = NiftiLabelsMasker(labels_img=atlas_filename, resampling_target='data', standardize=True)
files_path = r'D:\users\alex\pain\june\original_version_example.nii.gz'
destination = r'D:\users\alex\pain\june\MNI_to_TRs'

b = nib.load(r'D:\users\alex\HCP-1200\one_patient_original_version\rfMRI_REST1_LR.nii.gz')
shape = b.shape
vmap = vox_map(shape=shape, affine=b.affine)  # vox_map: helper function not shown in this snippet

file = 'sub-0005_ses-1_task-fcmri_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'
img_orig = nib.load(file)
mask = nib.load(file.replace('preproc_bold', 'brain_mask'))
# zero out everything outside the brain mask, then extract one time series per region
img_orig = _apply_mask_fmri(img_orig, mask)
img_orig = unmask(img_orig, mask)
time_series = masker.fit_transform(img_orig)
np.save(os.path.join(destination, 'parcellation.npy'), time_series)

  1. I guess you could compute it in some other way and compare the results (see the sketch after this list), or have a look at the tests for the NiftiLabelsMasker: test_nifti_labels_masker.py in the nilearn/nilearn repository on GitHub.

  2. After calling transform, the NiftiLabelsMasker has a labels_ attribute containing the list of labels in the order corresponding to the columns of the transform output.

  3. By looking at the Nifti1Image's affine attribute.
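
A minimal sketch of the manual check for point 1, using the preprocessed BOLD file from the script above. I'm assuming the file is in MNI space, and I turn standardization off so the masker output stays directly comparable to a raw voxel average (the tolerance in allclose is arbitrary, just loose enough to absorb float rounding):

import numpy as np
import nibabel as nib
from nilearn import datasets
from nilearn.image import resample_to_img
from nilearn.input_data import NiftiLabelsMasker

func_file = 'sub-0005_ses-1_task-fcmri_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'
atlas = datasets.fetch_atlas_schaefer_2018()

# masker output, without standardization so it stays comparable to raw means
masker = NiftiLabelsMasker(labels_img=atlas.maps, resampling_target='data', standardize=False)
time_series = masker.fit_transform(func_file)            # shape (n_TRs, n_regions)

# redo the averaging "by hand": resample the atlas onto the functional grid
# (nearest neighbour keeps the label values intact), then average one region's voxels
atlas_resampled = resample_to_img(atlas.maps, func_file, interpolation='nearest')
atlas_data = atlas_resampled.get_fdata()
func_data = nib.load(func_file).get_fdata()               # (x, y, z, n_TRs)

label_value = masker.labels_[0]                            # atlas value of the first output column
manual_mean = func_data[atlas_data == label_value].mean(axis=0)

print(np.allclose(manual_mean, time_series[:, 0], atol=1e-3))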

  1. Is there any tutorial that uses these tests? It seems like they could come in handy, but I'm not sure how to start.

  2. I'm using masker.fit_transform(img). After that, the labels_ attribute gives back a list of indices. How do I interpret these labels as coordinates? Say I want to manually test the time series and compare it to the raw BOLD activity from the preprocessed 4D array.

  3. So I read that the resolution is the diagonal of the affine. What does that mean when my affine is the following?
    [[ 2.97300005,  0.        ,  0.        ,  -96.        ],
     [ 0.        ,  2.97300005,  0.        , -132.        ],
     [ 0.        ,  0.        ,  3.        ,  -78.        ],
     [ 0.        ,  0.        ,  0.        ,    1.        ]]

  4. I'm assuming from this affine that the resolution of my data is 3 mm (not sure, but let's assume). In that case, fetch_atlas_schaefer_2018 returns an error:
    ValueError: Requested resolution_mm=3 not available. Valid options: [1, 2]
    Is there any way to tackle this?

  1. To run the nilearn tests, use pytest; for example, from the root of the nilearn repository run pytest ./nilearn. Note that these exist to check that nilearn code produces correct results; they are not used in tutorials or outside of nilearn.
  2. The labels_ attribute maps columns in the transform output to values (labels) in the provided atlas image (labels_img). You can use these values to index the list of label names, i.e. atlas.labels (see the first sketch after this list). This kind of inspection will become easier after the next nilearn release thanks to the addition of HTML reports for this masker. I'm not sure I understand the comment about coordinates: these indices correspond to atlas regions, not spatial locations.
  3. Your interpretation is correct; your voxels are 2.97 mm in the x and y directions and 3 mm in the z direction.
  4. You have to download one of the existing atlases (for example the 2 mm one) and resample it to match your image (see the resampling sketch after this list). But actually the NiftiLabelsMasker takes care of that for you when resampling_target='data', so you don't need to worry about resolution: it will resample the atlas to your images' resolution. Note that the images must be in MNI space.
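
For points 2 and 3, a small sketch of how you could map the output columns to the Schaefer region names and read the voxel sizes. I'm assuming the labels list returned by fetch_atlas_schaefer_2018 does not include the background, so atlas value v maps to atlas.labels[v - 1]; worth double-checking against your nilearn version:

import numpy as np
import nibabel as nib
from nilearn import datasets
from nilearn.input_data import NiftiLabelsMasker

func_file = 'sub-0005_ses-1_task-fcmri_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'
atlas = datasets.fetch_atlas_schaefer_2018(resolution_mm=2)

masker = NiftiLabelsMasker(labels_img=atlas.maps, resampling_target='data', standardize=True)
time_series = masker.fit_transform(func_file)

# column i of time_series <-> atlas value masker.labels_[i] <-> name atlas.labels[value - 1]
column_names = [atlas.labels[value - 1] for value in masker.labels_]
print(column_names[:5])

# voxel size in mm: the scaling part of the affine, also exposed by the header
func_img = nib.load(func_file)
print(np.diag(func_img.affine)[:3])              # e.g. [2.973 2.973 3.   ]
print(func_img.header.get_zooms()[:3])           # same information, always positive
print(nib.load(atlas.maps).header.get_zooms())   # atlas voxel size, e.g. (2., 2., 2.)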
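
And for point 4, if you do want the atlas resampled to your 3 mm grid on disk rather than relying on the masker's resampling_target='data', a sketch using nilearn's resample_to_img (the output filename is just an example):

from nilearn import datasets
from nilearn.image import resample_to_img

func_file = 'sub-0005_ses-1_task-fcmri_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'
atlas = datasets.fetch_atlas_schaefer_2018(resolution_mm=2)

# nearest-neighbour interpolation keeps the integer label values intact
atlas_3mm = resample_to_img(atlas.maps, func_file, interpolation='nearest')
atlas_3mm.to_filename('schaefer400_resampled_to_bold.nii.gz')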