I believe only the calcium data is available from the Colab for the Stringer dataset; however, we saw in the article that they also recorded eight-probe Neuropixels data. We are interested in this temporally finer data for decoding behaviors.
We would really appreciate clarifications as we write our proposal:
- What is the recommended way of getting access to and working with the Neuropixels data? We found https://figshare.com/articles/dataset/Eight-probe_Neuropixels_recordings_during_spontaneous_behaviors/7739750 and are wondering if there is a similar way to load the data into Colab as for the calcium data.
- How can we extract the “whisking, licking, sniffing, and other facial movements” from the behavioral mask? Have others already extracted that information for other projects, or is analysis typically done on the motion energy masks?
Using external data is considered advanced for NMA. If you already have some people in your group who are good at wrangling data files, doing data synchronization, etc., then this shouldn’t be a problem. Otherwise, we recommend sticking with the curated datasets. There are some behavioral measurements in the Steinmetz dataset, for example, if you’d rather work with very fine time resolution.
We assume you can figure this out if you are working with external data. It will involve some combination of downloading from figshare to your own computer, uploading to Google Drive, and then mounting your personal Google Drive inside Colab.
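A minimal sketch of that workflow, assuming the figshare files are `.npy` dictionaries (the file path and field names below are hypothetical placeholders, not the real dataset fields):

```python
import numpy as np

# In Colab, mount your Google Drive first (uncomment these lines):
# from google.colab import drive
# drive.mount('/content/drive')
# data_path = '/content/drive/MyDrive/spont_data.npy'  # hypothetical path

# np.load with allow_pickle=True on a saved dict returns a 0-d object
# array, so .item() recovers the dictionary. Demonstrated here with a
# small stand-in file (structure assumed, not the real fields):
dummy = {'spks': np.zeros((10, 100)), 'beh': np.zeros((100, 3))}
np.save('demo_spont.npy', dummy)

dat = np.load('demo_spont.npy', allow_pickle=True).item()
print(sorted(dat.keys()))  # → ['beh', 'spks']
```

Once the real file is on your mounted Drive, the same `np.load(...).item()` pattern should apply; check the dataset's own loading notebook for the actual keys.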
They are not currently labelled categorically in that way; it’s just the motion energy masks with associated timecourses. However, the data loading notebook shows you how to reconstruct the original videos, and from those you could train a program like DeepLabCut or similar to categorize states. Or you could make your own labels for those states and train your own neural network to predict behavioral states (very advanced).
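One way the mask-plus-timecourse reconstruction can work, assuming the motion energy is stored SVD-style as spatial components with matching timecourses (all shapes and variable names here are illustrative, generated synthetically; consult the loading notebook for the real arrays):

```python
import numpy as np

# Hypothetical dimensions: a 240x320 video, 50 SVD components, 1000 frames.
h, w, n_comp, n_t = 240, 320, 50, 1000
rng = np.random.default_rng(0)
mot_mask = rng.standard_normal((h * w, n_comp))  # spatial motion-energy masks
mot_svd = rng.standard_normal((n_t, n_comp))     # associated timecourses

# Reconstruct the motion-energy frame at one timepoint as a weighted sum
# of the spatial masks, then reshape back to image dimensions.
t = 42
frame = (mot_mask @ mot_svd[t]).reshape(h, w)
print(frame.shape)  # (240, 320)
```

Reconstructed frames like this (or the original videos from the notebook) would then be the input for pose-tracking tools or a classifier you train yourself.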