PROJECT TITLE:
Synaptic Weight States in a Locally Competitive Algorithm for Neuromorphic Memristive Hardware
Memristors promise a means for high-density neuromorphic nanoscale architectures that leverage in situ learning algorithms. While traditional learning algorithms commonly assume analog values for synaptic weights, physical memristors may offer only a finite set of achievable states during online learning. In this paper, we simulate a learning algorithm with limitations on both the resolution of its weights and the means of switching between them, to explore how these properties affect classification performance. For our experiments, we use the Locally Competitive Algorithm (LCA) by Rozell et al. in conjunction with the MNIST dataset and a set of natural images. We investigate the effects of both linear and non-linear distributions of weight states. Our results show that, as long as the weight states are distributed roughly linearly, the algorithm remains effective for classifying digits, while image reconstruction suffers under non-linear distributions. Further, the resolution required of a device depends on its transition function between states; for transitions akin to round-to-nearest, synaptic weights need around sixteen possible states (four-bit resolution) to achieve optimal results. We find that lowering the threshold required to switch states, or adding stochasticity to the system, can reduce that requirement to four states (two-bit resolution). The outcomes of our analysis are relevant for building effective neuromorphic hardware with state-of-the-art memristive devices.
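The two transition behaviors discussed above can be illustrated with a small sketch. The snippet below is not the paper's simulation code; it is a minimal, hypothetical model of quantizing analog synaptic weights onto a linearly spaced set of states, comparing deterministic round-to-nearest against stochastic rounding (the function name `quantize` and the weight range [0, 1] are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(weights, n_states=16, w_min=0.0, w_max=1.0, stochastic=False):
    """Map analog weights onto n_states linearly spaced levels.

    stochastic=False: round-to-nearest (deterministic).
    stochastic=True:  each weight rounds up with probability equal to its
                      fractional distance above the lower level, so the
                      quantized value is unbiased in expectation.
    """
    step = (w_max - w_min) / (n_states - 1)       # spacing between states
    w = np.clip(np.asarray(weights, dtype=float), w_min, w_max)
    pos = (w - w_min) / step                      # position in units of steps
    if stochastic:
        idx = np.floor(pos)
        idx += rng.random(w.shape) < (pos - idx)  # probabilistic round-up
    else:
        idx = np.round(pos)                       # round-to-nearest
    return w_min + idx * step

w = rng.random(5)
print(quantize(w, n_states=16))                   # 4-bit, round-to-nearest
print(quantize(w, n_states=4, stochastic=True))   # 2-bit, stochastic
```

Under this toy model, the stochastic variant preserves weight information on average even with very few states, which is consistent with the abstract's observation that stochasticity lowers the resolution a device needs.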