nilearn.regions.Parcellations and masking


I’m hoping to apply nilearn’s Parcellations tool to a small mask within a high-resolution dataset with many timepoints, but I’m running into a memory error.

kmeans = Parcellations(method='kmeans', n_parcels=2,
                       mask=mask_image)

Instead of passing the entire high-resolution 4D dataset (concat_data here), is there any workaround where I only need to pass the NumPy array of data within the mask? Any help would be greatly appreciated.



Can you paste the full traceback? It will help us pinpoint where exactly the problem is.

If you want to work directly with a NumPy array, you can apply scikit-learn’s k-means algorithm to it.
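For example, on the (timepoints × voxels) array a masker returns, scikit-learn’s KMeans can cluster the voxels directly. A minimal sketch, with synthetic data standing in for the masked signals:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for masked data: 100 timepoints x 500 voxels,
# the shape NiftiMasker.fit_transform would return.
rng = np.random.RandomState(0)
roi_data = rng.randn(100, 500)

# KMeans clusters rows, so transpose to get one row per voxel.
kmeans = KMeans(n_clusters=2, random_state=0, n_init=10).fit(roi_data.T)
labels = kmeans.labels_  # one cluster label per voxel
```

Since only the in-mask voxels are loaded into the array, this sidesteps allocating anything at the full (182, 218, 182, 7500) resolution.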

Here is the full traceback:

File "", line 101, in <module>
  File "/home/tvanasse/nilearn/nilearn/decomposition/", line 411, in fit
  File "/home/tvanasse/nilearn/nilearn/decomposition/", line 175, in mask_and_reduce
    ) for img, confound in zip(imgs, confounds))
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 1003, in __call__
    if self.dispatch_one_batch(iterator):
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 834, in dispatch_one_batch
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 753, in _dispatch
    job = self._backend.apply_async(batch, callback=cb)
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 201, in apply_async
    result = ImmediateResult(func)
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 582, in __init__
    self.results = batch()
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 256, in __call__
    for func, args, kwargs in self.items]
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 256, in <listcomp>
    for func, args, kwargs in self.items]
  File "/home/tvanasse/nilearn/nilearn/decomposition/", line 205, in _mask_and_reduce_single
    this_data = masker.transform(img, confound)
  File "/home/tvanasse/nilearn/nilearn/input_data/", line 326, in transform
    return self.transform_single_imgs(imgs)
  File "/home/tvanasse/nilearn/nilearn/input_data/", line 405, in transform_single_imgs
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 355, in __call__
    return self.func(*args, **kwargs)
  File "/home/tvanasse/nilearn/nilearn/input_data/", line 59, in filter_and_mask
  File "/home/tvanasse/nilearn/nilearn/input_data/", line 94, in filter_and_extract
    imgs, parameters['smoothing_fwhm'])
  File "/home/tvanasse/miniconda/envs/nsddata/lib/python3.7/site-packages/joblib/", line 355, in __call__
    return self.func(*args, **kwargs)
  File "/home/tvanasse/nilearn/nilearn/image/", line 287, in smooth_img
    ensure_finite=True, copy=True)
  File "/home/tvanasse/nilearn/nilearn/image/", line 225, in _smooth_array
    arr[np.logical_not(np.isfinite(arr))] = 0
MemoryError: Unable to allocate array with shape (182, 218, 182, 7500) and data type bool

And using scikit-learn’s k-means algorithm directly seems like a fine idea… thanks for the suggestion.

OK, it breaks in the smoothing step. If you want smoothing (which is important for k-means), you cannot simply use scikit-learn. But maybe you smoothed beforehand.

I am smoothing beforehand with the following code:

from nilearn.input_data import NiftiMasker
from sklearn.cluster import KMeans
import numpy as np

# Smooth and extract only the in-mask voxels (timepoints x voxels)
masker = NiftiMasker(mask_img=mask_image, smoothing_fwhm=2, standardize=False)
roi_data = masker.fit_transform(all_data)
# KMeans clusters rows, so transpose to cluster voxels
kmeans = KMeans(n_clusters=2, random_state=0).fit(np.transpose(roi_data))
parcel_image = masker.inverse_transform(kmeans.labels_ + 1)  # +1 shifts labels 0/1 to 1/2, keeping background 0

Thanks for your help, and for all your work in creating these tools!