Shady Agwa

Research Fellow


Curriculum vitae


[email protected]


+44 (0) 7950676030


School of Engineering

The University of Edinburgh

1.24D Murchison House, King's Buildings Campus, Edinburgh, EH9 3BF, UK



PyT-NeuroPack: A Hybrid PyTorch/Memristor-Crossbar Simulation Tool for Convolutional Neural Networks


Conference paper


Cristian Sestito, Weijie Huang, Shady O. Agwa, T. Prodromakis
IEEE International New Circuits and Systems Conference, 2024

APA
Sestito, C., Huang, W., Agwa, S. O., & Prodromakis, T. (2024). PyT-NeuroPack: A Hybrid PyTorch/Memristor-Crossbar Simulation Tool for Convolutional Neural Networks. IEEE International New Circuits and Systems Conference.


Chicago/Turabian
Sestito, Cristian, Weijie Huang, Shady O. Agwa, and T. Prodromakis. “PyT-NeuroPack: A Hybrid PyTorch/Memristor-Crossbar Simulation Tool for Convolutional Neural Networks.” IEEE International New Circuits and Systems Conference (2024).


MLA
Sestito, Cristian, et al. “PyT-NeuroPack: A Hybrid PyTorch/Memristor-Crossbar Simulation Tool for Convolutional Neural Networks.” IEEE International New Circuits and Systems Conference, 2024.


BibTeX

@inproceedings{sestito2024pytneuropack,
  title = {PyT-NeuroPack: A Hybrid PyTorch/Memristor-Crossbar Simulation Tool for Convolutional Neural Networks},
  year = {2024},
  booktitle = {IEEE International New Circuits and Systems Conference},
  author = {Sestito, Cristian and Huang, Weijie and Agwa, Shady O. and Prodromakis, T.}
}

Abstract

In-Memory Computing (IMC) is gaining attention as a means of deploying large-scale architectures for AI-oriented vector-matrix multiplications. Memristor crossbars embody these features, performing computations using Ohm's law and Kirchhoff's current law. In addition, their ability to retain data as non-volatile resistances makes them powerful resources for IMC. Simulators such as NeuroPack have been proposed to quickly validate memristor-crossbar AI architectures. NeuroPack works at the algorithmic level, offering different neuron models and learning rules; however, it is limited to Fully-Connected Layers (FCLs) and thus cannot process more complex paradigms, such as Convolutional Neural Networks (CNNs). This work presents PyT-NeuroPack, which integrates the original memristor-crossbar FCL with a reconfigurable PyTorch-based interface to simulate feature extraction through convolutions. Features are converted into spikes using different coding schemes and fed to the FCL for classification. The tool is showcased on the MNIST digits dataset, with features encoded into spikes through binarization and rate coding. The highest achieved accuracy is 91.59%, 9.6% higher than the NeuroPack baseline and competitive with a state-of-the-art CNN using a spiking FCL.
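The two mechanisms the abstract leans on (crossbar vector-matrix multiplication via Ohm's and Kirchhoff's laws, and feature-to-spike conversion via binarization or rate coding) can be sketched in a few lines. This is a minimal, idealized illustration, not PyT-NeuroPack's actual implementation: device conductances are treated as noiseless, and the variable names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Idealized memristor-crossbar vector-matrix multiplication ---
# Ohm's law gives a per-device current V_i * G_ij; Kirchhoff's current law
# sums those currents along each column: I_j = sum_i V_i * G_ij.
V = rng.uniform(0.0, 1.0, size=4)           # input voltages, one per row
G = rng.uniform(1e-6, 1e-4, size=(4, 3))    # device conductances (siemens)
I = V @ G                                   # column currents = VMM result

# --- Rate coding: feature magnitude sets spike probability per timestep ---
features = rng.uniform(0.0, 1.0, size=3)    # e.g. normalized CNN activations
T = 100                                     # number of simulation timesteps
spikes = rng.random((T, features.size)) < features   # Bernoulli spike trains
rates = spikes.mean(axis=0)                 # empirical rates track features

# --- Binarization: threshold each feature into a single spike event ---
binary_spikes = (features > 0.5).astype(np.uint8)
```

Rate coding preserves feature magnitude in the firing rate at the cost of many timesteps, while binarization needs only one timestep but keeps just a single bit per feature; the abstract's accuracy comparison is between these two schemes.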

