GSoC 2026 Project #11: The Virtual Brain: New graphical widget(s) for JupyterLab

Description: The TVB (https://www.thevirtualbrain.org/) ecosystem includes a newer code repository called tvb-widgets, offering neat UI widgets for JupyterLab environments. These widgets are compatible with TVB data formats and can display the data in different forms: 2D and 3D. The purpose of this project is to implement a new set of widgets that would allow users to work in JupyterLab, in 3D, with the connectivity matrices (nodes & edges), surfaces (cortical, subcortical, face), sensors, and all data involved in a TVB simulation. Necessary features for these widgets: select from a drive the list of available datasets, load/unload a dataset, display the connectivity matrix, 3D surface, connections, colours, animated timeseries, etc. Of course, these new widgets have to run in a JupyterLab notebook as well. Most of these features already exist in the current widgets, but we are looking into possibly new libraries, better performance, and better usability, so a renovation of some of the existing features. Finally, it would be great to have all the widgets linked into the tvb-ext-xircuits repository, a JupyterLab extension based on React JS. At the moment, only the PhasePlaneWidget is linked there, but the rest could be added in a similar manner.

Examples of TVB data formats can be found on Zenodo. Check out our Jupyter notebooks to play with the widgets we have available so far.

Expected results: A set of classes, with at least one demo Jupyter notebook, and unit tests.

Preferred tech keywords: Python, ipywidgets, React JS, JupyterLab, JupyterLab extensions

Skills level: junior+, mid

Mentors: Lia Domide (lead) <lia.domide@codemart.ro>, Paula Prodan <paula.popa@codemart.ro>, Teodora Misan (backup) <teodora.misan@codemart.ro>


Hello everyone,

My name is Katerina and I’m very interested in the project “New graphical widget(s) for JupyterLab” for GSoC 2026.

I have a background in software engineering and I also hold a Master’s degree in Neuroscience. During my studies I completed a six-month research internship at a neuroscience laboratory at UCL, where I worked with research workflows and scientific data.

Because of this background, I’m particularly interested in tools that make computational neuroscience and simulation workflows easier to explore and visualize. The idea of extending the tvb-widgets ecosystem with interactive 3D widgets for connectivity matrices, surfaces and simulation data in JupyterLab sounds very exciting.

From a technical perspective, I have experience with backend development and APIs, and I am comfortable working with Python-based data analysis tools and scientific environments.

I would love to start exploring the tvb-widgets repository and experiment with the existing notebooks.

A few questions:
• Are there recommended first issues or areas where new contributors can start helping?
• Are there specific visualization libraries currently being considered for the new widgets?
• Is there documentation describing the current widget architecture or integration with tvb-ext-xircuits?

Looking forward to learning more and contributing to the project.

Best regards,
Katerina Eleftheriadi

Hello Lia, Paula, and Teodora,

I have been exploring the tvb-widgets repository and ran the existing notebooks locally. While going through the source code, I can see that both head_widget.py and spacetime_widget.py use k3d as the 3D rendering backend, for mesh surfaces, connectivity points/lines, and sensor locations.

For the new 3D widgets described in the 2026 project (cortical/subcortical surfaces, connectivity matrices in 3D, sensors), I wanted to ask about the rendering backend preference before going further with a prototype.

I explored pyvista with the trame backend as a potential alternative. It renders server-side on the CPU, streams to the browser, works headless without a GPU, and does not require CLB_AUTH, which would make the new widgets usable locally without an EBRAINS account. However, it would mean introducing a second rendering dependency alongside k3d rather than extending the existing pattern.

My question: is there a preference to stay consistent with k3d for the new 3D widgets, or are you open to evaluating pyvista+trame given the headless and local-first advantages? This choice significantly affects the base class architecture I would design.

I have already started working on a prototype surface widget and would like to align on this before going further.

Thank you,
Mohit Ranjan

Hello Lia, Paula, and Teodora,

I’ve been exploring the tvb-widgets repository and built a proof-of-concept for Project #11 that extends the existing k3d + ipywidgets architecture with two new widgets:

Connectivity3DWidget - 3D brain regions sized by hub strength (row-sum of the weight matrix), viridis-mapped by connectivity, with live threshold / colormap / hemisphere / label controls. All updates mutate k3d traitlets in place; no plot is rebuilt on any interaction.
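For illustration, the hub-strength sizing can be sketched with plain NumPy. This is just a sketch of the idea, not the actual tvb-widgets code; the variable names (`weights`, `node_sizes`) and the size range are my own assumptions:

```python
import numpy as np

# Stand-in for a TVB connectivity weight matrix (76 regions is a
# common default in TVB example datasets; the data here is random).
rng = np.random.default_rng(42)
n_regions = 76
weights = rng.random((n_regions, n_regions))
np.fill_diagonal(weights, 0.0)  # no self-connections

# Hub strength: row-sum of the weight matrix, one scalar per region.
hub_strength = weights.sum(axis=1)

# Normalise into a point-size range suitable for a 3D points object.
min_size, max_size = 0.5, 3.0
norm = (hub_strength - hub_strength.min()) / np.ptp(hub_strength)
node_sizes = min_size + norm * (max_size - min_size)

print(node_sizes.shape)  # (76,)
```

In the real widget these sizes would feed the k3d points object's traitlets, so a threshold change only recomputes and assigns this array rather than rebuilding the plot.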

AnimatedSurface3DWidget - full cortical surface mesh (~16k vertices) with animated per-vertex timeseries data via mesh.attribute mutation at 12 fps. Accepts any (T, N_vertices) float32 array; synthetic data is auto-generated if none is provided.
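The synthetic fallback could look roughly like this NumPy sketch: a (T, N_vertices) float32 array, here a sine wave with a random phase per vertex. The function name and defaults are illustrative assumptions, not the widget's actual API:

```python
import numpy as np

def make_synthetic_timeseries(n_frames=120, n_vertices=16_000, seed=0):
    """Generate a smooth (T, N_vertices) float32 per-vertex signal."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 2.0 * np.pi, n_frames)[:, None]    # (T, 1)
    phase = rng.uniform(0.0, 2.0 * np.pi, n_vertices)[None, :]  # (1, N)
    # Broadcast to (T, N): each vertex oscillates with its own phase,
    # giving a wave-like pattern when mapped onto the surface mesh.
    return np.sin(t + phase).astype(np.float32)

ts = make_synthetic_timeseries()
print(ts.shape, ts.dtype)  # (120, 16000) float32
```

Each frame `ts[i]` is then a valid per-vertex scalar array, which is the shape a mesh colour/attribute update expects.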

Both widgets follow the existing tvb-widgets patterns exactly - same MRO, same add_datatype() API, same k3d + ipywidgets.Output placement. add_datatype() maps cleanly to an xircuits input port following the PhasePlaneWidget integration.

Repository with demo notebook and 11 unit tests: [tvb-widgets-poc]

Two questions before finalising my proposal:

  1. For xircuits integration - beyond the PhasePlaneWidget pattern, is there a preferred way to expose widget state as xircuits output ports (e.g. the currently selected threshold or hemisphere)?

  2. Is there a preferred rendering performance target for the surface animation - e.g. minimum fps on a standard EBRAINS JupyterHub instance?

Looking forward to your feedback.
