GSoC Project Idea 2.2: Deep learning in Spiking Neural Networks using GeNN

After printing out the exception, it says:

/home/pradeep/genn-master/pygenn/genn_wrapper/_genn_wrapper.so: undefined symbol: deviceCount
Traceback (most recent call last):
  File "/home/pradeep/example.py", line 5, in <module>
    from pygenn import genn_wrapper
  File "/home/pradeep/genn-master/pygenn/__init__.py", line 2, in <module>
    from .genn_groups import SynapseGroup, NeuronGroup, CurrentSource
  File "/home/pradeep/genn-master/pygenn/genn_groups.py", line 12, in <module>
    from . import genn_wrapper
  File "/home/pradeep/genn-master/pygenn/genn_wrapper/__init__.py", line 1, in <module>
    from .genn_wrapper import *
  File "/home/pradeep/genn-master/pygenn/genn_wrapper/genn_wrapper.py", line 21, in <module>
    _genn_wrapper = swig_import_helper()
  File "/home/pradeep/genn-master/pygenn/genn_wrapper/genn_wrapper.py", line 18, in swig_import_helper
    return importlib.import_module('_genn_wrapper')
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named _genn_wrapper

Are you trying to install on a system with or without CUDA installed?

Finally, it all fell into place. I had a CUDA 9.1 installation that wouldn’t work because of compatibility issues. On removing it, PyGENN works like a charm. Thanks for the pointer there.

Glad you got it working! What were your issues with CUDA 9.1? That should be fully supported by GeNN.

Hi @jamie , this project is really fascinating! I’m wondering if we’re expected to give specific ideas on the implementation of the convolution connector in the project proposal. Also, as you mentioned earlier in the forum, there are several ways to generate a SNN from different deep architectures, but do I understand correctly that this project is primarily focused on building connectors based on the methods proposed by Diehl et al.?

Best,
Manvi

Hi Manvi,

Thank you for your interest in the project! Regarding the convolutional connector, I don’t think you need to go into much detail in the proposal. The basic point is that, currently, if you were to implement a convolution connector in GeNN, you would probably do it using the SynapseMatrixConnectivity::RAGGED data structure documented here (I’m afraid the documentation is currently all based around the C++ API). However, there are a number of ways this could be improved (some of which may go a little beyond the scope of this project):

  1. You could reduce the load times by generating the convolutional structure on the GPU using the sparse connectivity initialisation feature documented here

  2. This would help with the load time, but it still doesn’t take advantage of the fact that weights are shared in conv nets. Weight sharing could then be implemented by creating a weight update model which takes its weights from an “extra global parameter” containing the layer’s convolutional kernel, rather than from per-connection weights. This system is described briefly in genn-team.github.io/genn/documentation/3/html/sectNeuronModels.html#sect_own, but what that section fails to mention is that you can also use arbitrary-sized arrays as extra global parameters. This is done from Python in a few places within the PyNN frontend to GeNN (the result of one of last year’s GSoC projects!) here.

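To make the ragged-connectivity idea above concrete, here is a minimal, GeNN-independent Python sketch of how a 1D "valid" convolution could be laid out in a RAGGED-style format (a fixed-width row of postsynaptic indices per presynaptic neuron, plus a row-length array). The function name and exact layout are my own illustration, not GeNN's actual API:

```python
import numpy as np

def build_conv1d_ragged(num_pre, kernel_size):
    """Build RAGGED-style connectivity for a 1D 'valid' convolution:
    postsynaptic neuron j receives from pre neurons j..j+kernel_size-1.
    Connectivity is stored per presynaptic row, as in a ragged format."""
    num_post = num_pre - kernel_size + 1
    rows = [[] for _ in range(num_pre)]
    for j in range(num_post):          # postsynaptic index
        for k in range(kernel_size):   # kernel tap
            rows[j + k].append(j)      # pre neuron j+k connects to post j

    # Flatten into a fixed-width index matrix plus per-row lengths
    max_row = max(len(r) for r in rows)
    row_length = np.array([len(r) for r in rows], dtype=np.uint32)
    ind = np.zeros((num_pre, max_row), dtype=np.uint32)
    for i, r in enumerate(rows):
        ind[i, :len(r)] = r
    return row_length, ind

row_length, ind = build_conv1d_ragged(num_pre=5, kernel_size=3)
```

Edge presynaptic neurons have shorter rows than central ones (here, row lengths 1, 2, 3, 2, 1), which is exactly the kind of irregularity a ragged format handles by padding to the maximum row length.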
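The weight-sharing idea in point 2 can also be sketched in plain Python (again, purely illustrative; `synapse_weight` and the index arithmetic are my own stand-ins, not a GeNN weight update model). The key point is that each synapse can derive an index into a single shared kernel array, rather than storing its own weight:

```python
import numpy as np

# Shared convolutional kernel, playing the role of the "extra global parameter"
kernel = np.array([0.25, 0.5, 0.25])
num_pre, kernel_size = 5, len(kernel)
num_post = num_pre - kernel_size + 1  # 1D 'valid' convolution

def synapse_weight(pre, post):
    """Look up the shared kernel instead of a per-connection weight."""
    k = pre - post                 # kernel tap linking this pre/post pair
    assert 0 <= k < kernel_size
    return kernel[k]

# Cross-check against a dense weight matrix built the conventional way
W = np.zeros((num_pre, num_post))
for j in range(num_post):
    W[j:j + kernel_size, j] = kernel
for i in range(num_pre):
    for j in range(num_post):
        if 0 <= i - j < kernel_size:
            assert synapse_weight(i, j) == W[i, j]
```

The memory saving is the point: the dense matrix stores one weight per synapse, while the shared-kernel version stores only `kernel_size` values regardless of layer size.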
The ideal outcome of this project would be a framework that allows users to select from a variety of methods for converting trained ANNs to SNNs. The Diehl et al. method would definitely be a good starting point, but the framework should be easily extensible so that we can incorporate the latest research in this area in future.

Hope that’s helpful!

Jamie