Introducing neurocaps: A Python Package for Co-activation Patterns (CAPs) Analysis

Hello neurostars community!

I wanted to introduce a package that I have been working on - neurocaps, a Python package for performing Co-activation Patterns (CAPs) analysis. This package leverages many popular neuroimaging, data science, and visualization packages to streamline the process from timeseries extraction to CAPs analysis and visualization. Currently, it supports the Schaefer and AAL parcellations, as well as certain custom parcellations (lateralized atlases other than Schaefer and AAL).

Some key features include:

  • Timeseries extraction from resting-state or task fMRI data (fMRIPrep compatible) and the ability to save the data as a pickle.
  • CAPs analysis using k-means clustering, with the option to specify a single cluster size or multiple cluster sizes and select the optimal size using a cluster selection method (elbow, silhouette, Davies-Bouldin, and variance ratio).
  • Various visualization options (heatmaps, surface plots, radar plots, etc.).
  • Calculation of several CAP metrics (temporal fraction, persistence, transition frequency, counts).
  • Parallel processing support to improve performance during timeseries extraction and when using a cluster selection method.
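For anyone unfamiliar with the approach, the core idea behind CAPs can be sketched with scikit-learn. The nested dictionary layout and toy data below are hypothetical, purely to illustrate the concept (neurocaps handles all of this internally):

```python
import numpy as np
from sklearn.cluster import KMeans

# The CAPs idea: pool standardized BOLD frames across subjects/runs and
# cluster them; each cluster centroid is one co-activation pattern.
rng = np.random.default_rng(0)

# Hypothetical nested structure: subject -> run -> (frames x ROIs) array
subject_timeseries = {
    f"sub-{i:02d}": {"run-1": rng.standard_normal((150, 100))} for i in range(3)
}

# Stack all frames from all subjects and runs into one (frames x ROIs) matrix
all_frames = np.vstack(
    [run for runs in subject_timeseries.values() for run in runs.values()]
)

# Cluster the frames; the cluster centers are the CAPs
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(all_frames)
caps = kmeans.cluster_centers_  # one 100-ROI pattern per cluster
```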

Here’s a quick example of how to use neurocaps:

from neurocaps.extraction import TimeseriesExtractor
from neurocaps.analysis import CAP

# Initialize TimeseriesExtractor class
confounds = ["cosine*", "trans_x", "trans_x_derivative1",
             "trans_y", "trans_y_derivative1", "trans_z",
             "trans_z_derivative1", "rot_x", "rot_x_derivative1",
             "rot_y", "rot_y_derivative1", "rot_z", "rot_z_derivative1",
             "global_signal", "global_signal_derivative1"]

extractor = TimeseriesExtractor(parcel_approach={"Schaefer": {"n_rois": 100, "yeo_networks": 7}},
                                standardize="zscore_sample", use_confounds=True, detrend=True,
                                confound_names=confounds, low_pass=0.15)

# Extract BOLD timeseries
extractor.get_bold(bids_dir="/path/to/bids/dir", task="rest", session="002", n_cores=10)

# Perform CAPs analysis w/ a cluster selection method; the best model is saved automatically
cap_analysis = CAP(parcel_approach=extractor.parcel_approach)
cap_analysis.get_caps(subject_timeseries=extractor.subject_timeseries,
                      n_clusters=list(range(2, 21)), cluster_selection_method="silhouette",
                      n_cores=5)

# Visualize CAPs as heatmap or outer products at the region or node level
cap_analysis.caps2plot(visual_scope=["regions", "nodes"],
                       plot_options=["outer_product", "heatmap"])

# Calculate CAP metrics
metrics = cap_analysis.calculate_metrics(subject_timeseries=extractor.subject_timeseries,
                                         tr=2.0, metrics=["temporal_fraction", "persistence"])
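To make the metrics concrete, here is a small NumPy sketch of two of them, computed from a hypothetical sequence of per-frame CAP assignments. This mirrors the standard definitions, not neurocaps' internal code:

```python
import numpy as np

# Hypothetical per-frame CAP labels for one subject (0 = CAP-1, 1 = CAP-2)
labels = np.array([0, 0, 1, 1, 1, 0, 1, 1, 0, 0])
tr = 2.0  # repetition time in seconds

# Temporal fraction: proportion of frames assigned to a given CAP
temporal_fraction = np.mean(labels == 1)  # 5 of 10 frames -> 0.5

# Persistence: average length of consecutive runs of a CAP, scaled by the TR.
# Find run boundaries, split into runs, then average the lengths of CAP-2 runs.
change_points = np.flatnonzero(np.diff(labels)) + 1
runs = np.split(labels, change_points)
cap2_run_lengths = [len(r) for r in runs if r[0] == 1]
persistence = np.mean(cap2_run_lengths) * tr  # runs of length 3 and 2 -> 2.5 * 2.0
```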

# Create correlation matrix and return uncorrected p-values
cap_analysis.caps2corr(annot=True, figsize=(6,4), cmap="coolwarm", return_df=True)

# Map the CAPs onto the atlas used for dimensionality reduction and create surface plots of CAPs
cap_analysis.caps2surf(cmap="cold_hot", layout="row", size=(500, 100))

# Map the CAPs onto the atlas used for dimensionality reduction and save as a NIfTI image
cap_analysis.caps2niftis(output_dir="/path/to/output/dir")

# Create radar plots showing cosine similarity between regions/networks and CAPs
radialaxis = {"showline": True, "linewidth": 2, "linecolor": "rgba(0, 0, 0, 0.25)",
              "gridcolor": "rgba(0, 0, 0, 0.25)", "ticks": "outside",
              "tickfont": {"size": 14, "color": "black"}, "range": [0, 0.3],
              "tickvals": [0.1, 0.2, 0.3]}

cap_analysis.caps2radar(radialaxis=radialaxis, fill="toself")
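The radar plots are built on cosine similarity. As a rough illustration of the quantity being plotted (using hypothetical toy vectors, not the package's exact computation):

```python
import numpy as np

# Hypothetical CAP centroid over 6 ROIs and a binary indicator vector marking
# which ROIs belong to a given network (here, the first three ROIs)
cap = np.array([0.8, 0.5, -0.2, 0.1, -0.4, 0.3])
network_mask = np.array([1, 1, 1, 0, 0, 0])

# Cosine similarity between the CAP and the network indicator vector;
# values near 1 mean the CAP's high-amplitude ROIs align with that network
cos_sim = cap @ network_mask / (np.linalg.norm(cap) * np.linalg.norm(network_mask))
```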

If you want to try out the package, more details, installation instructions, and extended documentation can be found below:

Additionally, this is my first Python package, so it is definitely a work in progress, and I am open to contributions to improve it. If you’re interested in contributing in any way, I created a basic contributing guideline. Don’t forget to add your credentials to the Zenodo file to receive proper credit for your contributions.

Thank you for reading!
