
Background information:
Recently, several memristive technologies (ReRAM, CBRAM, PCM, and STT-MRAM)
have emerged as promising candidates for digital and analog in-memory computation. Deep neural networks (DNNs) are one of the main applications expected to benefit from analog in-memory computation. However, the noisy nature of analog computation may lead to performance ("accuracy") degradation.
In this project, you will use the IBM Analog Hardware Acceleration Kit, a toolkit developed by IBM to simulate the effect of analog computation on the accuracy of DNN models deployed on memristive-based analog hardware. You will implement several well-known DNNs and evaluate the effect of different parameters on network performance.
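To give a concrete starting point, below is a minimal sketch (based on the aihwkit inference examples) of how a pretrained PyTorch model can be mapped onto simulated analog crossbar tiles. The specific model (torchvision ResNet-18), the PCM-like noise model, and the resolution/tile-size values are illustrative assumptions, and exact module paths may vary between aihwkit versions.

import torch
from torchvision.models import resnet18

from aihwkit.nn.conversion import convert_to_analog
from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.inference import PCMLikeNoiseModel

# Configure the simulated analog tiles used for inference.
rpu_config = InferenceRPUConfig()
rpu_config.noise_model = PCMLikeNoiseModel(g_max=25.0)  # example device noise model
rpu_config.forward.inp_res = 1 / 256.0   # assumed 8-bit DAC resolution at the tile input
rpu_config.forward.out_res = 1 / 256.0   # assumed 8-bit ADC resolution at the tile output
rpu_config.mapping.max_input_size = 512  # assumed crossbar (tile) dimensions
rpu_config.mapping.max_output_size = 512

# Take a standard pretrained network and map every Linear/Conv layer
# onto simulated analog tiles governed by rpu_config.
digital_model = resnet18(pretrained=True)
analog_model = convert_to_analog(digital_model, rpu_config)
analog_model.eval()

# One noisy forward pass on a dummy input; a real accuracy evaluation
# would loop over a validation set instead.
with torch.no_grad():
    out = analog_model(torch.randn(1, 3, 224, 224))
print(out.shape)

Sweeping the parameters listed in the tasks below (bit resolution, tile dimensions, noise model) then amounts to changing the corresponding rpu_config fields and re-measuring accuracy.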
Project Tasks:
- Ramp-up – introduction to:
- Memristors
- In-memory analog computation and analog noise
- Deep neural networks (DNNs) and PyTorch
- IBM analog hardware acceleration kit (aihwkit)
- Adapt known pretrained models to aihwkit
- Experiments and evaluation of the effect of analog noise on DNN accuracy
- Several architectures
- Number of bits per cell
- Number of bits per parameter
- Crossbar dimensions
- Device noise model
- Implement and examine the effect of solutions suggested in the literature
- BatchNorm calibration
- Modeling the noise during training (see the sketch after this list)
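For the last item, a common approach in the literature is hardware-aware training, i.e. injecting weight noise in the forward pass while training so the network becomes robust to device variations. The sketch below follows the pattern of aihwkit's hardware-aware training examples; the layer sizes, noise magnitude, and optimizer settings are illustrative assumptions, and module paths may differ across kit versions.

import torch
from torch import nn

from aihwkit.nn import AnalogLinear
from aihwkit.optim import AnalogSGD
from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.simulator.configs.utils import WeightModifierType

rpu_config = InferenceRPUConfig()
# Add Gaussian weight noise on every forward pass during training.
rpu_config.modifier.type = WeightModifierType.ADD_NORMAL
rpu_config.modifier.std_dev = 0.1            # assumed noise magnitude
rpu_config.modifier.rel_to_actual_wmax = True  # noise relative to the actual max weight

model = AnalogLinear(784, 10, rpu_config=rpu_config)
optimizer = AnalogSGD(model.parameters(), lr=0.1)
optimizer.regroup_param_groups(model)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random data (replace with a real data loader).
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()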
Prerequisites:
- “Machine Learning” course
- Programming in Python
- “Introduction to VLSI” is a plus
Mentor: Tzofnat Greenberg-Toledo (tzofnat.grin@gmail.com)