There has recently been a lot of interest in converting pre-trained deep convolutional networks of artificial neurons into spiking neural networks (SNNs) for low-power inference on neuromorphic hardware. While GeNN is unlikely to compete with neuromorphic hardware in terms of energy efficiency, it is a useful and flexible platform for exploring this research area.

The first stage of this project will be to build a Python library which converts networks trained using TensorFlow into GeNN models via GeNN's Python interface, using some of the techniques discussed by Diehl et al. [1]. Possible extensions would then include modifying GeNN to implement a more efficient convolution connector and perhaps beginning to investigate some recent attempts to train deep SNNs [2].
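To give a flavour of the conversion techniques involved, the sketch below illustrates the data-based weight normalisation from Diehl et al. [1]: each layer's weights are rescaled by the peak activation observed on a calibration set, so that ReLU activations map onto spike rates within the SNN's operating range. This is a minimal illustration using only NumPy; the function name and calling convention are hypothetical, not part of any GeNN or TensorFlow API.

```python
import numpy as np

def normalise_weights(weights, activations):
    """Data-based weight normalisation (Diehl et al. 2015, sketch).

    weights     -- list of per-layer weight matrices from the trained ANN
    activations -- list of per-layer activation arrays recorded while
                   running a calibration set through the ANN

    Each layer's weights are divided by that layer's peak activation and
    multiplied by the previous layer's peak, so the function computed by
    the network is preserved while activations stay in [0, 1].
    """
    normalised = []
    prev_factor = 1.0
    for w, a in zip(weights, activations):
        factor = np.max(a)  # peak activation seen in this layer
        normalised.append(w * prev_factor / factor)
        prev_factor = factor
    return normalised
```

The rescaled weights could then be loaded into an integrate-and-fire network built through GeNN's Python interface, with firing thresholds left at a common fixed value.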

**Skills required**: TensorFlow, Python, C++

**References**

[1] Diehl, Peter U., et al. "Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing." 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015.

[2] Zenke, Friedemann, and Surya Ganguli. "SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks." Neural Computation 30.6 (2018): 1514-1541.

**Mentors**: Jamie Knight & Thomas Nowotny, University of Sussex, UK.