Published on Mon Nov 19 2012

Storing cycles in Hopfield-type networks with pseudoinverse learning rule: admissibility and network topology

Chuan Zhang, Gerhard Dangelmayr, Iuliana Oprea


Abstract

Cyclic patterns of neuronal activity are ubiquitous in animal nervous systems, and are partially responsible for generating and controlling rhythmic movements such as locomotion, respiration, and swallowing. Clarifying the role of network connectivities in generating cyclic patterns is fundamental for understanding the generation of rhythmic movements. In this paper, the storage of binary cycles in neural networks is investigated. We call a cycle Σ admissible if a connectivity matrix satisfying the cycle's transition conditions exists, and construct it using the pseudoinverse learning rule. Our main focus is on the structural features of admissible cycles and the corresponding network topology. We show that Σ is admissible if and only if its discrete Fourier transform contains exactly rank(Σ) nonzero columns. Based on the decomposition of the rows of Σ into loops, where a loop is the set of all cyclic permutations of a row, cycles are classified as simple cycles, or as separable or inseparable composite cycles. Simple cycles contain rows from one loop only, and the network topology is a feedforward chain with feedback to one neuron if the loop-vectors in Σ are cyclic permutations of each other. Composite cycles contain rows from at least two disjoint loops, and the neurons corresponding to rows of Σ from the same loop are identified with a cluster. Networks constructed from separable composite cycles decompose into completely isolated clusters. For inseparable composite cycles, at least two clusters are connected, and the cluster connectivity is related to the intersections of the spaces spanned by the loop-vectors of the clusters. Simulations showing successfully retrieved cycles in continuous-time Hopfield-type networks and in networks of spiking neurons are presented.
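As a rough illustration of the construction described in the abstract (not the authors' code), the pseudoinverse learning rule can be sketched with NumPy: the connectivity matrix W is built so that it maps each stored pattern to its successor in the cycle. The network size, the seed pattern, and the specific admissibility check via the DFT's nonzero columns are illustrative assumptions here, chosen so the cycle is a simple cycle of cyclic shifts.

```python
import numpy as np

# Illustrative toy cycle (assumption, not from the paper): the p cyclic
# shifts of a +/-1 seed pattern v, stored in a network of N binary neurons.
N, p = 4, 4
v = np.array([1, 1, 1, -1])
X = np.stack([np.roll(v, k) for k in range(p)], axis=1)  # columns x_0..x_{p-1}
Y = np.roll(X, -1, axis=1)  # successor of each pattern: y_k = x_{(k+1) mod p}

# Pseudoinverse learning rule: W maps each stored pattern to its successor.
W = Y @ np.linalg.pinv(X)

# Transition conditions of the cycle: sign(W x_k) = x_{k+1} for every step.
assert np.array_equal(np.sign(W @ X), Y)

# Admissibility check suggested by the abstract: the DFT of the pattern
# matrix (taken along the time axis) has as many nonzero columns as the
# matrix has rank.
nonzero_cols = np.sum(np.abs(np.fft.fft(X, axis=1)).max(axis=0) > 1e-9)
assert nonzero_cols == np.linalg.matrix_rank(X)
```

Because the cycle here consists of cyclic permutations of a single row, it is a simple cycle in the paper's classification; composite cycles would stack rows from two or more disjoint loops.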

Wed Jan 23 2019
Neural Networks
Robust computation with rhythmic spike patterns
Information coding by precise timing of spikes can be faster and more energy-efficient than traditional rate coding. However, spike-timing codes are often brittle, which has limited their use in theoretical neuroscience and computing applications. Here, we propose a novel type of attractor neural network in complex state space.
Thu Feb 14 2019
Machine Learning
A Broad Class of Discrete-Time Hypercomplex-Valued Hopfield Neural Networks
In this paper, we address the stability of a broad class of discrete-time hypercomplex-valued Hopfield-type neural networks. We introduce novel hypercomplex number systems referred to as real-part associative hypercomplex number systems.
Fri Feb 10 2006
Neural Networks
Caianiello's Automata Networks Revisited (Réseaux d'Automates de Caianiello Revisité)
Mon Nov 25 2019
Neural Networks
Biologically Plausible Sequence Learning with Spiking Neural Networks
Fri Dec 27 2019
Machine Learning
Emergence of Network Motifs in Deep Neural Networks
Network science can offer fundamental insights into the structural and functional properties of complex systems. In this article we show that network science tools can also be successfully applied to the study of artificial neural networks.
Fri Mar 09 2018
Neural Networks
On the information in spike timing: neural codes derived from polychronous groups
There is growing evidence regarding the importance of spike timing in neural information processing. We employ information-theoretic techniques for a simple reservoir model which encodes input spatiotemporal patterns into a sparse neural code.