Are there SNN examples for AI benchmark problems using a Hebbian/STDP learning method?

Is there any working example, where a SNN can learn any of the standard benchmark problems like CIFAR-10 or something by using a Hebbian/STDP learning method?

I wonder if this important milestone for SNNs has already been achieved…

Best regards,

Hey Kroll,

Have you seen this? An SNN benchmarked on the MNIST digit recognition dataset:

Simplified spiking neural network architecture and STDP learning algorithm applied to image classification | EURASIP Journal on Image and Video Processing | Full Text.

They noted that their SNN was able to identify each digit, producing a unique signal for each. But, unfortunately, I don't see any benchmarking that compares their network to its more common gradient-descent counterparts.

This is really cool because they are using the temporal signaling properties of the network rather than the spatial properties. However, this is a tiny network with no architecture/topology adaptation.
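For anyone unfamiliar with how STDP learning works in networks like this: the core idea is that a synapse is strengthened when the presynaptic spike precedes the postsynaptic one, and weakened otherwise, with the magnitude decaying exponentially in the spike-time difference. A minimal pair-based sketch (parameter values here are illustrative defaults, not taken from the paper):

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update.

    w        : current synaptic weight
    t_pre    : presynaptic spike time (ms)
    t_post   : postsynaptic spike time (ms)
    Potentiates when pre fires before post (causal pairing),
    depresses when post fires before pre (anti-causal pairing).
    """
    dt = t_post - t_pre
    if dt > 0:
        # pre before post -> long-term potentiation (LTP)
        w += a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        # post before pre -> long-term depression (LTD)
        w -= a_minus * math.exp(dt / tau_minus)
    # keep the weight in a bounded range
    return min(max(w, w_min), w_max)
```

In a full simulation this rule is applied over all pre/post spike pairings while input images are presented as spike trains (e.g. rate- or latency-coded pixels), which is what lets the temporal structure carry the learning signal.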

I’m working on exactly what you are asking about. I’m using fundamental neuroscience principles to create a NN that adapts to its tasks either in an unsupervised manner or in a minimally supervised manner (my goal is to train it the way a newborn child learns).

My first goal is to have the AI find simple objects in images. I have an “eye” and an “occipital lobe”. The Occ. Lobe is extremely adaptive and generates/degenerates itself as learning goes on.

My goal is to create these very adaptive “organs” that can be pieced together ad hoc and given new, increasingly difficult tasks as the “organism” matures, just as a newborn organism would.