Published on Thu Mar 04 2010

Learning by random walks in the weight space of the Ising perceptron

Haiping Huang, Haijun Zhou

Several variants of a stochastic local search process for constructing the synaptic weights of an Ising perceptron are studied. In this process, binary patterns are sequentially presented and are then learned. This process is able to reach a storage capacity of 0.63 for pattern length N = 101

Abstract

Several variants of a stochastic local search process for constructing the synaptic weights of an Ising perceptron are studied. In this process, binary patterns are sequentially presented to the Ising perceptron and are then learned as the synaptic weight configuration is modified through a chain of single- or double-weight flips within the compatible weight configuration space of the earlier learned patterns. This process is able to reach a storage capacity of $\alpha \approx 0.63$ for pattern length N = 101 and $\alpha \approx 0.41$ for N = 1001. If in addition a relearning process is exploited, the learning performance is further improved, reaching a storage capacity of $\alpha \approx 0.80$ for N = 101, with a corresponding improvement for N = 1001. We find that, for a given learning task, the solutions constructed by the random-walk learning process are separated by a typical Hamming distance, which decreases with the constraint density $\alpha$ of the learning task; at a fixed value of $\alpha$, the width of the Hamming distance distribution decreases with N.
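As a rough illustration of the procedure described in the abstract, the Python sketch below implements a single-weight-flip variant of the sequential random-walk learning: each new pattern is learned through a chain of flips constrained to keep all previously learned patterns satisfied. The flip-selection heuristic, the double-weight-flip moves, and the relearning stage of the paper are not reproduced, and all names (`random_walk_learn`, `satisfied`, `max_flips`) are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def satisfied(w, xi, sigma):
    """Pattern (xi, sigma) is learned when the perceptron output sign matches sigma."""
    return sigma * np.dot(w, xi) > 0

def random_walk_learn(patterns, outputs, max_flips=100_000, seed=None):
    """Sequentially learn patterns by a random walk of single-weight flips
    restricted to the solution space of the already-learned patterns."""
    rng = np.random.default_rng(seed)
    N = patterns.shape[1]
    w = rng.choice([-1, 1], size=N)       # random initial Ising weight configuration
    learned = []                          # indices of patterns learned so far
    for mu, (xi, sigma) in enumerate(zip(patterns, outputs)):
        flips = 0
        while not satisfied(w, xi, sigma):
            if flips >= max_flips:
                return w, mu              # give up: only mu patterns were stored
            flips += 1
            i = rng.integers(N)
            w[i] *= -1                    # propose a single-weight flip
            # Accept only if every previously learned pattern stays satisfied,
            # i.e. the walk remains inside their compatible configuration space.
            if not all(satisfied(w, patterns[nu], outputs[nu]) for nu in learned):
                w[i] *= -1                # reject: undo the flip
        learned.append(mu)
    return w, len(patterns)

# Example: try to store alpha * N random patterns of length N = 101.
N, alpha = 101, 0.3
rng = np.random.default_rng(0)
P = int(alpha * N)
patterns = rng.choice([-1, 1], size=(P, N))
outputs = rng.choice([-1, 1], size=P)
w, stored = random_walk_learn(patterns, outputs, seed=1)
print(f"stored {stored} of {P} patterns")
```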

Fri Aug 08 2014
Machine Learning
Origin of the computational hardness for learning with binary synapses
Supervised learning in a binary perceptron is able to classify an extensive number of random patterns by a proper assignment of binary synaptic weights. The relation between the weight-space structure and the algorithmic hardness is not yet fully understood.
Wed Apr 10 2013
Machine Learning
Entropy landscape of solutions in the binary perceptron problem
The binary perceptron learns a random classification of random input patterns by a set of binary synaptic weights. The learning of this network is difficult, especially when the pattern (constraint) density is close to the capacity. The geometrical organization of solutions is elucidated by the entropy landscape.
Fri May 04 2012
Neural Networks
Weighted Patterns as a Tool for Improving the Hopfield Model
We generalize the standard Hopfield model to the case in which a weight is assigned to each input pattern. The weight can be interpreted as the frequency of the pattern's occurrence at the input of the network. In the case of unequal weights, our model does not lead to catastrophic destruction of the memory.
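For reference, a weighted generalization of Hebbian storage can be sketched as below. The coupling rule $J_{ij} = \frac{1}{N}\sum_\mu w_\mu \xi_i^\mu \xi_j^\mu$ used here is a plausible reading of "a weight assigned to each input pattern", not necessarily the exact construction of that paper.

```python
import numpy as np

def weighted_hebbian_couplings(patterns, weights):
    """J_ij = (1/N) * sum_mu weights[mu] * xi_i^mu * xi_j^mu, with zero diagonal."""
    N = patterns.shape[1]
    J = (patterns.T * weights) @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def recall(J, state, sweeps=20, seed=0):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    rng = np.random.default_rng(seed)
    s = state.astype(float)
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = J[i] @ s
            if h != 0:
                s[i] = np.sign(h)
    return s

# Example: store 10 patterns with unequal weights and recall a corrupted one.
N, P = 200, 10
rng = np.random.default_rng(0)
xi = rng.choice([-1, 1], size=(P, N))
w = rng.uniform(0.5, 2.0, size=P)                            # e.g. occurrence frequencies
J = weighted_hebbian_couplings(xi, w)
noisy = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])    # flip ~10% of spins
print("overlap after recall:", recall(J, noisy) @ xi[0] / N)
```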
Fri May 20 2016
Machine Learning
Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes
In artificial neural networks, learning from data is a computationally demanding task, and it is not well understood how learning occurs in these systems. We define a novel measure, the "robust ensemble" (RE), which suppresses trapping by isolated configurations.
Wed Jan 30 2019
Neural Networks
Neuroevolution with Perceptron Turing Machines
We introduce the perceptron Turing machine and show how it can be used to create a system of neuroevolution. Advantages of this approach include automatic scaling of solutions to larger problem sizes. Hand-coded solutions may be implemented in the low-level language of Turing machines.
Thu Nov 28 2019
Machine Learning
Neural networks with redundant representation: detecting the undetectable
We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P = 4. Such a system is able to perform pattern recognition far below the standard signal-to-noise threshold.
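A dense associative memory with interaction order P = 4 can be illustrated with the standard energy $E(s) = -\sum_\mu (\xi^\mu \cdot s)^4$. The sketch below is only a generic example of such a fourth-order memory and does not reproduce the dual-representation construction described in that paper.

```python
import numpy as np

def energy(patterns, s, p=4):
    """Dense associative memory energy: E(s) = -sum_mu (xi_mu . s)^p with p = 4."""
    return -np.sum((patterns @ s).astype(float) ** p)

def recall(patterns, state, sweeps=10, p=4, seed=0):
    """Zero-temperature single-spin dynamics: keep a flip only if it lowers the energy."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            e0 = energy(patterns, s, p)
            s[i] *= -1
            if energy(patterns, s, p) > e0:   # flip raised the energy: undo it
                s[i] *= -1
    return s

# Example: store more patterns than the pairwise Hopfield capacity would allow.
N, n_patterns = 100, 30
rng = np.random.default_rng(0)
xi = rng.choice([-1, 1], size=(n_patterns, N))
noisy = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])    # flip ~20% of spins
print("overlap after recall:", recall(xi, noisy) @ xi[0] / N)
```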