Hi @Samuele_DeCristofaro. It would indeed make sense for each object to handle reading in a serialization state itself, similar to how _restore_from_full_state can be overridden by individual objects (while most of the time using the standard implementation from VariableOwner). I think one feasible approach might be to use the before_run section in the templates (which is executed once at the beginning of a run), or – to make it even more in line with the way it is implemented in runtime mode – to add new store/restore functions for each created code object, together with corresponding Network::store and Network::restore functions (in brian2/devices/cpp_standalone/templates/network.cpp in the brian-team/brian2 repository on GitHub) that would loop over all objects and call their respective functions – i.e., the C++ equivalent of what Network.store/Network.restore currently does for runtime mode.
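To make the idea a bit more concrete, here is a very rough Python sketch (illustrative only, not the actual Brian implementation – names and details are made up) of the runtime-mode logic that the proposed C++ Network::store/Network::restore would mirror: the network simply loops over its objects and delegates to their own state-handling functions.

```python
# Very simplified sketch of the runtime-mode store/restore logic
# (illustrative only – names and details differ from the real Brian code).

class SketchNetwork:
    def __init__(self, objects):
        self.objects = objects       # e.g. NeuronGroups, Synapses, monitors
        self._stored_states = {}

    def store(self, name='default'):
        # every object serializes its own state
        self._stored_states[name] = {obj.name: obj._full_state()
                                     for obj in self.objects}

    def restore(self, name='default'):
        # every object reads its state back in, possibly with its own
        # overridden implementation (cf. _restore_from_full_state)
        stored = self._stored_states[name]
        for obj in self.objects:
            obj._restore_from_full_state(stored[obj.name])
```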
Hello @mstimberg, I am really excited about the Brian simulator project and its focus on model and state (de)serialization. As someone with experience in Python, C++, and data serialization techniques, I find this project particularly interesting because of its real-world applications in neural network simulations and ML workflows.
I have been exploring Brian2’s store/restore mechanism and its integration with C++ standalone mode and Brian2CUDA. I understand that the goal is to extend serialization capabilities to these modes. However, I am curious about:
- What specific challenges have been encountered in serializing a complete network architecture so far?
- Would using a format like protobuf or HDF5 for storing/restoring networks be a suitable approach, or is there a Brian-specific serialization format in consideration?
I would love to contribute to this project and further enhance my understanding of spiking neural networks and efficient simulation pipelines. Looking forward to your insights!
Best regards,
Alfiya Qureshi
Hi @Alfiya_Qureshi, happy to hear you are interested in this project! Regarding your questions:
I wouldn’t really say that we faced particular challenges – mostly we have not gotten around to doing it so far. We have the baseexport functionality in brian2tools, which serializes the model architecture (equations, structure, etc.), but it is missing the deserialization from that serialized representation (a simple text-based dictionary representation). We also have the store/restore mechanism, which stores the current state of a model (i.e. the values of all state variables) to memory or disk (via pickle). We don’t have anything that puts the two together yet. I expect things not to be very difficult technically, but there will be quite some work to make everything work smoothly and flexibly – e.g. you might want to restore a network and then add/remove something from it, you might want to annotate a network with additional information, etc. For this, it would be useful to look in detail at how, for example, pytorch handles things for save/load/checkpoint, etc.
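For reference, the pytorch pattern looks roughly like this (a rough sketch with made-up file names and a toy model, just to illustrate the separation between reconstructing the architecture in code and loading the stored state into it):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# save: a plain dict bundling the state dicts plus arbitrary metadata
torch.save({'model_state': model.state_dict(),
            'optimizer_state': optimizer.state_dict(),
            'epoch': 5},
           'checkpoint.pt')

# load: rebuild the objects first, then deserialize the state into them
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
checkpoint = torch.load('checkpoint.pt')
model.load_state_dict(checkpoint['model_state'])
optimizer.load_state_dict(checkpoint['optimizer_state'])
```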
In principle, yes – but we should run tests first and see whether we cannot simply use the built-in mechanisms from Python (like pickle) and, for example, numpy’s save features. If it only brings a marginal speed or “size-on-disk” benefit, I wouldn’t think it is worth adding a big and potentially troublesome dependency to Brian.
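As a minimal sketch of what I mean by built-in mechanisms (variable names and file names are made up for illustration):

```python
import pickle
import numpy as np

# a dictionary of state-variable arrays, as store() would collect them
state = {'neurongroup_v': np.random.rand(100),   # membrane potentials
         'synapses_w': np.random.rand(1000)}     # synaptic weights

# numpy: one compressed file holding all arrays
np.savez_compressed('state.npz', **state)
restored = dict(np.load('state.npz'))

# pickle: also handles nested Python structures and extra metadata
with open('state.pkl', 'wb') as f:
    pickle.dump(state, f)
with open('state.pkl', 'rb') as f:
    restored = pickle.load(f)
```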
Hope that answers your questions, best
Marcel
@mstimberg Thank you for your detailed response! Your explanation clarified a lot of things for me.
Looking forward to exploring this further!
Hello @d.goodman, @mstimberg, Benjamin Evans,
My name is Okeme Perfect, and I’m excited to learn more about the Brian simulator and explore the possibility of contributing to this project during GSoC 2025. While I’m new to neuroscience-related projects, I find the idea of improving model and state (de)serialization intriguing.
I have some experience with Python and C++, though I wouldn’t call myself highly proficient yet. However, I’m eager to learn, contribute, and grow through this opportunity. I’d love to understand more about how I can get started and what initial steps I should take to better grasp the project’s scope. Looking forward to engaging with the mentors and the community!
Regards,
Okeme Perfect.
Hi @Perfect, and thank you for your interest in the project. I’ve collected some general recommendations in this earlier comment: GSoC 2025 Project Idea #6 Brian Simulator :: Serialization and deserialization for Brian simulator models (175 h) - #11 by mstimberg, and the linked website post also explains the general application process in more detail. Let us know if anything is unclear or if you have other questions!
Dear @mstimberg, @d.goodman, and Benjamin Evans,
My name is Devvrat Mathur, and I am from India. I am currently pursuing a Master of Technology (M.Tech) in Data Science at NIT Jalandhar, which holds the 58th NIRF rank in India for Engineering. As part of our curriculum, we engage in extensive research work and paper publications.
After deciding to contribute to the Brian Simulator project, I thoroughly read the queries from participants and the responses provided, which greatly helped me understand the project. Additionally, I read the article “Brian 2, an Intuitive and Efficient Neural Simulator” and found its purpose highly impactful:
- Brian 2 effectively addresses the trade-off between flexibility (ease of defining novel models) and performance (efficient simulations) in neural simulators.
- It utilizes runtime code generation to convert high-level user-defined models into optimized low-level code.
I have experience with Python and C++ programming, problem-solving, and working in a GitHub environment. I am thrilled about the opportunity to contribute to this project and help enhance the Brian simulator’s (de)serialization capabilities. The challenge of improving the store/restore mechanism and refining the exporter/importer tools aligns perfectly with my problem-solving mindset and passion for optimizing neural simulation workflows.
I look forward to collaborating with the team, learning from your expertise, and making meaningful contributions to this project. I am excited to get started and make an impact!
Best regards,
Devvrat Mathur
Hi @devvratmathur. Thank you for your interest in our project. As you’ve seen, I’ve written quite a few things earlier in this thread, but please let me know if anything is unclear or if you have any specific questions.
Thank you for your reply, @mstimberg. I have submitted my proposal and excitedly await positive feedback.
Hi everyone,
My name is Elsa Felts, and I’m currently a Bachelor student at École Polytechnique in France. Although I’m joining the thread a little late, I wanted to express my strong interest in the project—neuroscience has been a field I’ve been passionate about since high school.
I’ve put together a proposal for contributing to the project, and I truly hope I’ll have the opportunity to get involved and support the team.
Thanks so much for your time, and best of luck with reviewing all the proposals!
Elsa
Dear students/open source beginners interested in the “Serialization and deserialization for Brian simulator models” project, please don’t forget that the deadline for applications on https://summerofcode.withgoogle.com is later today at 18:00 UTC, i.e. in about 5 hours. Note that it will be impossible for us to ask Google to finance an internship for a candidate that did not submit an application, and that there will be no extension of this deadline from Google’s side. Good luck everyone, hope to see a few of you staying around (with or without a GSoC internship)!
Greetings @mstimberg, I am very baffled, as on your site (GSoC 2025 | The Brian spiking neural network simulator) the deadline is given as the 18th of April. I haven’t submitted my application yet because of this. I am very confused as to what is true and whether I have lost my chance.
Thank you in advance,
Maria P
Dear @mariapoliti, my sincere apologies, I did not realize that the date on our website was wrong, and if anyone else did, they did not let me know… Unfortunately, having submitted an application before yesterday’s deadline is a strict requirement by Google, there is nothing we can do from our side. Again, I am very sorry, but it indeed means that you cannot be considered for this year’s GSoC. It will not be much of a consolation, but we received a record-breaking number of applications this year (30 applications for most likely 1 internship slot), so we will certainly have to say no to many very good candidates…