Exploring Analog Noise Effect on Deep Learning Applications

Background information:

Recently, several memristive technologies (ReRAM, CBRAM, PCM, and STT-MRAM) have emerged as promising candidates for digital and analog in-memory computation. Deep neural networks (DNNs) are one of the main applications expected to benefit from analog in-memory computation. However, the noisy nature of analog computation may lead to performance ("accuracy") degradation.

In this project, you will use the IBM analog hardware acceleration kit (aihwkit), a toolkit developed by IBM to simulate the effects of analog computation on the performance of DNN models deployed on memristor-based analog hardware. You will implement several known DNNs and evaluate the effect of different parameters on network performance.
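
To give a feel for the workflow, here is a minimal sketch of how a pretrained PyTorch model can be converted to its analog equivalent with aihwkit. The function and class names (convert_to_analog, InferenceRPUConfig, PCMLikeNoiseModel) follow recent aihwkit releases; the choice of ResNet-18 and the g_max value are illustrative assumptions only.

```python
# Minimal sketch: convert a pretrained PyTorch model to simulated analog
# hardware with aihwkit and run one forward pass. Check the aihwkit docs for
# the exact API of the version you install.
import torch
from torchvision.models import resnet18

from aihwkit.nn.conversion import convert_to_analog
from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.inference import PCMLikeNoiseModel

# Digital baseline (any torchvision classifier works the same way).
model = resnet18(weights="IMAGENET1K_V1")
model.eval()

# Inference configuration of the analog tiles: PCM-like programming/read noise.
rpu_config = InferenceRPUConfig()
rpu_config.noise_model = PCMLikeNoiseModel(g_max=25.0)  # g_max is an assumed conductance range

# Replace Linear/Conv layers with analog tiles that simulate the crossbar.
analog_model = convert_to_analog(model, rpu_config)

# Forward pass through the simulated analog hardware.
with torch.no_grad():
    logits = analog_model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # torch.Size([1, 1000])
```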

Project Tasks:

  1. Ramp-up – introduction to
    1. Memristors
    2. In-memory analog computation and analog noise
    3. Deep neural networks (DNNs) and PyTorch
    4. IBM analog hardware acceleration kit (aihwkit)
  2. Adapt known pretrained models to aihwkit
  3. Experiments and evaluation of the effect of analog noise on DNN accuracy (see the configuration sketch after this list)
    1. Several architectures
    2. Number of bits per cell
    3. Number of bits per parameter
    4. Crossbar dimensions
    5. Device noise model
  4. Implement and examine the effect of solutions suggested in the literature (a training sketch follows the list)
    1. BatchNorm calibration
    2. Modeling the noise during training
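
The experiment axes in task 3 map fairly directly onto aihwkit's inference configuration. The sketch below is one possible mapping, not a prescribed one: the helper make_rpu_config is hypothetical, the bit widths, tile size and g_max are placeholders, and deciding exactly how "bits per cell" and "bits per parameter" translate into resolution and weight-mapping settings is part of the project.

```python
# Hedged sketch: build one aihwkit inference configuration per experiment point.
# Field names (forward.inp_res, forward.out_res, mapping.*, noise_model) follow
# recent aihwkit versions; the numeric values are placeholders.
from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.inference import PCMLikeNoiseModel

def make_rpu_config(adc_bits=8, dac_bits=8, tile_size=256, g_max=25.0):
    """Hypothetical helper collecting the knobs swept in task 3."""
    rpu_config = InferenceRPUConfig()

    # DAC/ADC quantization, expressed as a resolution (aihwkit's 1/(2**n - 2) convention).
    rpu_config.forward.inp_res = 1.0 / (2 ** dac_bits - 2)
    rpu_config.forward.out_res = 1.0 / (2 ** adc_bits - 2)

    # Crossbar (tile) dimensions: larger layers are split across several tiles.
    rpu_config.mapping.max_input_size = tile_size
    rpu_config.mapping.max_output_size = tile_size

    # Device noise model: PCM-like programming noise, read noise and drift.
    rpu_config.noise_model = PCMLikeNoiseModel(g_max=g_max)
    return rpu_config

# Example sweep over ADC resolution at a fixed tile size.
configs = {bits: make_rpu_config(adc_bits=bits) for bits in (4, 6, 8)}
```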

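For task 4, one mitigation from the literature is hardware-aware training, i.e. injecting device-like noise into the weights during training so the network learns to tolerate it. The sketch below shows one way to express this with aihwkit's weight modifier and the AnalogSGD optimizer; the modifier type and standard deviation, the toy model and the random data are assumptions for illustration, and the enum import path differs slightly between aihwkit versions.

```python
# Hedged sketch: noise-aware ("hardware-aware") training with aihwkit. Gaussian
# weight noise is applied on every training forward pass via the weight modifier.
import torch
from torch import nn

from aihwkit.nn.conversion import convert_to_analog
from aihwkit.optim import AnalogSGD
from aihwkit.simulator.configs import InferenceRPUConfig, WeightModifierType
# Note: older aihwkit releases expose the enum in aihwkit.simulator.configs.utils.

rpu_config = InferenceRPUConfig()
rpu_config.modifier.type = WeightModifierType.ADD_NORMAL  # additive Gaussian weight noise
rpu_config.modifier.std_dev = 0.1                         # noise magnitude is an experiment parameter

# Toy digital model converted to analog layers for training.
model = convert_to_analog(
    nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2)),
    rpu_config,
)

optimizer = AnalogSGD(model.parameters(), lr=0.05)
optimizer.regroup_param_groups(model)  # gives the analog tiles their own parameter groups
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random data.
inputs, targets = torch.randn(8, 16), torch.randint(0, 2, (8,))
model.train()
optimizer.zero_grad()
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.step()
print(float(loss))
```

BatchNorm calibration, the other technique listed, needs no special aihwkit support: reset each BatchNorm layer's running statistics (reset_running_stats()) and forward a set of calibration batches through the converted model in train mode so the statistics are re-estimated on the noisy analog activations.
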
Prerequisites:

  • “Machine Learning” course
  • Programming in Python
  • “Introduction to VLSI” is a plus

Mentor: Tzofnat Greenberg-Toledo (tzofnat.grin@gmail.com)