hey @o_aug
Yup, as @emdupre mentioned, the code you're referencing is from the nltools library. A lot of the design thinking in this library was specifically aimed at making it easier, more intuitive, and faster to work with great libraries like nilearn, nibabel, pandas, and numpy together. Under the hood we've composed several existing classes and objects from these libraries into single classes with added functionality, while still exposing all of the functionality of the original classes.
I’d recommend checking out our docs and the overview video of the library design for more.
In particular the Brain_Data
class is an object that’s composed of several other objects like:
- a
pandas.DataFrame
(Brain_Data.X
or Brain_Data.Y
)
- numpy array (
Brain_Data.data
)
- a
NiftiMasker
from nilearn
(Brain_Data.mask
or Brain_Data.to_nifti()
)
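For example, here's a minimal sketch of how those pieces are exposed on a single object (the filename is just a placeholder for your own data):

```python
from nltools.data import Brain_Data

# Load a 3d or 4d NIfTI file (placeholder filename -- substitute your own data)
brain = Brain_Data("my_epi.nii.gz")

brain.X           # pandas.DataFrame (design matrix; empty unless you pass one in)
brain.Y           # pandas.DataFrame (outcome variable)
brain.data        # 2d numpy array: observations x voxels
brain.to_nifti()  # back to a nibabel.Nifti1Image
```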
On initialization with some input, `Brain_Data` automatically builds and internally stores the mapping between 3d or 4d x, y, z, (time) brain volumes and a 2d numpy array. This is really handy for a lot of reasons. For example, you can cast any 2d numpy array (of the same shape) back to voxel space in a single line of code, just by replacing the `.data` attribute, e.g.
```python
import numpy as np

# Replace all of the brain data with normally distributed noise at each voxel
brain.data = np.random.normal(size=brain.shape())

# Plot the noise
brain.plot()

nifti_obj = brain.to_nifti()  # to_nifti() is a Brain_Data method -- this is just a nibabel.Nifti1Image!
nifti_obj.get_fdata()  # use the nibabel method to get the 3d or 4d array representation
```
If you have a NIfTI file of ROI masks, atlases, etc., you can always subset the data in our toolbox using the `Brain_Data.apply_mask()` or `Brain_Data.extract_roi()` methods, among others, or even by modifying `.data` directly, as in my example above.
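For instance, a rough sketch along those lines (the filenames are placeholders, and I'm assuming the ROI mask is passed in as another Brain_Data object):

```python
from nltools.data import Brain_Data

# Placeholder filenames -- swap in your own data and ROI mask
brain = Brain_Data("my_epi.nii.gz")
roi = Brain_Data("my_roi_mask.nii.gz")

masked = brain.apply_mask(roi)     # Brain_Data containing only the voxels inside the ROI
roi_mean = brain.extract_roi(roi)  # average signal within the ROI for each observation
```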
We also wrap nilearn.plotting functions, including view_img, with some interactive widget features for Jupyter notebooks (e.g. thresholding, slice selection, etc.) via `Brain_Data.iplot()`.
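In a notebook cell that's as simple as (assuming `brain` is a Brain_Data instance like in the snippets above):

```python
# Launches an interactive viewer with thresholding and slice-selection widgets
brain.iplot()
```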
Hope that clarifies things a bit.