About: Bayesian inference is indispensable for hypothesis testing and uncertainty quantification in the study of complex brain (dys)function and cognition. Meanwhile, Virtual Brain Models (VBMs), implemented through neuroinformatics tools like The Virtual Brain (TVB), have gained significant popularity due to their potential for clinical translation. Bayesian inference on VBMs translates into probabilistic estimation of latent and observed states within systems driven by network input and stimuli, modeled by high-dimensional nonlinear differential equations with potentially correlated parameters. To address these challenges, advanced MCMC sampling and inference algorithms embedded in Probabilistic Programming Languages (PPLs) have shown remarkable results. In particular, gradient-based algorithms such as the No-U-Turn Sampler (NUTS), along with automatic Laplace approximation, have proven effective for reliable Bayesian inference even in the presence of multimodal parameter distributions. These methods have been successfully applied to dynamic causal modeling of event-related potentials (see here) and to inferring seizure propagation at the whole-brain scale (see here). However, significant challenges persist for inference from fMRI BOLD data at the whole-brain scale, e.g., in the resting state. Addressing them requires reparameterization techniques to improve convergence, and implementation in high-level tools like NumPyro or PyMC to streamline the inference process.
The aim of this project is to extend the automatic Bayesian estimation methods available in PPLs such as NumPyro or PyMC to enable estimation of bifurcation parameters from fMRI data at large scales. By leveraging existing Python packages and prior expertise, the expected outcomes will significantly advance Bayesian inference for VBMs, addressing current limitations and enhancing their scalability for studying brain dynamics and supporting clinical applications.
Aims:
Improve the implementation of Virtual Brain Models using JAX-based frameworks, such as NumPyro or PyMC, for greater efficiency and scalability, validated in silico
Develop reparameterization techniques to decorrelate parameters, improving gradient calculations and convergence, and visualize the outcomes
Benchmark existing algorithms within NumPyro or PyMC to systematically identify their strengths and weaknesses in handling high-dimensional, multimodal problems
Monitor algorithm convergence and provide comprehensive guidelines for ensuring reliable inference in the presence of multimodal parameter distributions
My name is Aaron Kim, and I am a third-year Applied Mathematics student at Texas A&M University. I am particularly interested in probabilistic modeling, Bayesian inference, and computational neuroscience, and I would love the opportunity to contribute to your project.
To provide some background, I have experience with Python, JAX, NumPyro, machine learning, and data analysis. At the Brain Networks Laboratory at Texas A&M, I am assisting with research on computational modeling of neural architectures, including recurrent and dual-pathway CNNs inspired by the V1 cortex. Additionally, my work at the Air Force Research Laboratory involves time-series anomaly detection and sequential data modeling, where I have gained exposure to probabilistic inference techniques and time-series forecasting.
I’m eager to learn more and prepare, whether through research papers, existing implementations, or preliminary materials. I’d appreciate your guidance on the best way to get started.
Our objective is to implement a whole-brain network of Montbrio models in NumPyro or PyMC, and estimate parameters such as the global coupling parameter G. Simply put:
reproduce Figure S8 from this paper, but using NumPyro/PyMC.
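For orientation, a minimal forward-simulation sketch of a coupled Montbrio (mean-field) network in JAX is below. This is only the forward model, not the inference; the connectivity is a random placeholder (real work would use an empirical connectome), and the parameter values are typical ones from the Montbrio–Pazo–Roxin literature, so treat them as assumptions:

```python
import jax.numpy as jnp
from jax import random

# Hypothetical small network; replace W with an empirical connectome in practice.
N = 4
W = random.uniform(random.PRNGKey(0), (N, N))
W = W.at[jnp.diag_indices(N)].set(0.0)   # no self-coupling

# Typical Montbrio-Pazo-Roxin mean-field parameters (assumed values)
tau, Delta, eta, J = 1.0, 1.0, -5.0, 15.0
G = 0.5  # global coupling; the quantity to be inferred later

def step(r, v, dt=1e-3):
    # Network input: global coupling G scales afferent firing rates.
    I = G * W @ r
    # Mean-field equations for firing rate r and membrane potential v:
    #   tau dr/dt = Delta/(pi*tau) + 2*r*v
    #   tau dv/dt = v^2 + eta + J*tau*r + I - (pi*tau*r)^2
    dr = (Delta / (jnp.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + J * tau * r + I - (jnp.pi * tau * r) ** 2) / tau
    return r + dt * dr, v + dt * dv

r, v = jnp.full(N, 0.1), jnp.full(N, -2.0)
for _ in range(1000):   # simple Euler integration for illustration
    r, v = step(r, v)
print(r)
```

Reproducing the figure would then mean wrapping a forward model like this (with a BOLD observation model) inside a NumPyro/PyMC model and sampling G with NUTS.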
Let me know if you’re interested or have any questions!
Thank you for the clarification and for sharing these resources! I’m very interested in working on this.
To start, I’ll review the DCM_ERP_PPLs repository and the Montbrio implementation example to understand the existing workflows. Are there specific parts of the codebase or documentation that you’d recommend I prioritize to align with the project’s main objectives?
Hello Dr. @mhashemi and Dr. @Daniele_Marinazzo,
My name is Anurag Mishra, and I am a third-year undergraduate student in Electronics Engineering at Jaypee Institute of Information Technology, India. I have experience with Python, JAX, signal processing, and deep learning, and I am particularly interested in JAX-based optimization and Bayesian inference.
Last summer, I worked on EEG signal classification, and I am currently involved in a research project focused on GANs. I am very eager to explore JAX-based implementations to improve computational efficiency and scalability.
Looking forward to your guidance!
Sincerely,
Anurag Mishra
My name is Mariia Glushanina, and I am a first-year master’s student in Cognitive Science at École Normale Supérieure. My main interest lies in probabilistic computational models of cognitive functions, but I am also interested in developing computational tools for neuroscience research.
Before my master’s, I completed a bachelor’s degree in Cognitive Science at Saint Petersburg State University, as well as an online degree in Data Science and Artificial Intelligence from Moscow State University. In the latter, I gained extensive training in machine learning, data analysis, and the mathematical prerequisites for these fields. During a course on Probabilistic Graphical Models I became acquainted with the PyMC framework, and I am eager to contribute to the project using this and new tools. Throughout my bachelor’s years, I was involved in a project analyzing EEG and ECoG data.
Furthermore, in my master’s I am currently following courses on neuroimaging methods and continuing to deepen my knowledge of building computational models. I hope my skills will be useful for this kind of project.