Yup, as @emdupre mentioned, the code you're referencing is from the nltools library. A lot of the design thinking in this library was specifically aimed at making it easier, more intuitive, and faster to work with great libraries like numpy. Under the hood we've composed several existing classes and objects from these libraries into single classes with added functionality, while still exposing all of the functionality of the original library classes. I'd recommend checking out our docs and the overview video of the library design for more.
In particular, the Brain_Data class is an object composed of several other objects, like:
- a numpy array (the .data attribute)
On initialization with some input, Brain_Data automatically maps 3d or 4d (x, y, z, time) brain volumes to a 2d numpy array and internally stores that mapping. This is really handy for a lot of reasons. For example, you can arbitrarily cast any 2d numpy array (of the same shape) back to voxel space in a single line of code, just by replacing the .data attribute, e.g.
```python
import numpy as np

# Replace all brain data with normally distributed noise at each voxel
brain.data = np.random.normal(size=brain.shape())

# Convert the noise back to a nibabel image
nifti_obj = brain.to_nifti()  # this is just a nibabel.Nifti1Image!
nifti_obj.get_fdata()  # use the nibabel method to get the 3d or 4d data representation
```
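To make the volume-to-matrix idea concrete, here's a minimal numpy-only sketch of the kind of mapping Brain_Data maintains. This illustrates the concept only; nltools' actual implementation differs (it works with nibabel images and a brain mask), and the shapes and variable names below are made up:

```python
import numpy as np

# A toy 4d volume: 4 x 4 x 4 voxels over 5 time points (made-up shapes)
volume_4d = np.random.normal(size=(4, 4, 4, 5))

# A boolean brain mask selecting which voxels are "in brain"
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True  # 8 in-brain voxels

# Map 4d (x, y, z, time) -> 2d (time, voxels), analogous to .data
data_2d = volume_4d[mask].T  # shape (5, 8)

# Because the mask is stored, any 2d array of the same shape
# can be cast back to voxel space
new_data = np.random.normal(size=data_2d.shape)
recovered = np.zeros_like(volume_4d)
recovered[mask] = new_data.T
```

The round trip works because boolean indexing with the stored mask always selects the same voxels in the same order.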
If you have a Nifti file of ROI masks, atlases, etc., you can always subset the data in our toolbox using the Brain_Data.extract_roi method, among others, or even by modifying .data directly, as in my example above.
We also wrap nilearn.plotting functions, including view_img, with some interactive widget features for jupyter notebooks (e.g. thresholding, slice selection, etc).
Hope that clarifies things a bit.