Deep neural networks can be greatly accelerated by using memristive devices as synaptic connections. Traditionally, however, deep neural networks rely on the error backpropagation algorithm, which poses problems when the networks are implemented in hardware based on memristive devices: i) complex peripheral circuits with expensive ADCs and DACs, plus memory banks for intermediate-layer states; ii) the lack of efficient online training methods.
We recently developed an efficient online training method for deep belief nets (DBNs), whose learning is based on the contrastive divergence (CD) of the restricted Boltzmann machine (RBM). The new online training method, designed together with the new network structure, avoids the issues above and offers additional advantages over traditional deep neural networks, for instance, immunity to the non-idealities of memristive devices. One remaining concern, however, is that the deep belief net has a large network size (many connections and neural nodes) when all layers are fully connected. Replacing the fully connected layers with convolutional layers, i.e., a convolutional DBN, can effectively reduce the size of both the memristive array and the CD accumulation array and, in principle, should yield better accuracy.
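To make the CD-based learning rule concrete, here is a minimal NumPy sketch of one CD-1 update for a binary RBM. The layer sizes, learning rate, and random input are illustrative assumptions, not details of the method developed in the lab:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 784 visible units (a 28x28 image), 64 hidden units.
n_vis, n_hid = 784, 64
W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))  # weights (the memristive array)
b_v = np.zeros(n_vis)                           # visible biases
b_h = np.zeros(n_hid)                           # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.1):
    """One CD-1 step: positive phase, one Gibbs step, negative phase."""
    # Positive phase: hidden probabilities given the data, plus a binary sample.
    h0_prob = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # One Gibbs step: reconstruct the visible units, then the hidden probabilities.
    v1_prob = sigmoid(h0 @ W.T + b_v)
    h1_prob = sigmoid(v1_prob @ W + b_h)
    # CD-1 update: data correlations minus reconstruction correlations.
    dW = np.outer(v0, h0_prob) - np.outer(v1_prob, h1_prob)
    return lr * dW, lr * (v0 - v1_prob), lr * (h0_prob - h1_prob)

v = (rng.random(n_vis) < 0.5).astype(float)  # random binary "image" stand-in
dW, db_v, db_h = cd1_update(v)
W += dW
b_v += db_v
b_h += db_h
```

In a memristive implementation, the `dW` term is what the CD accumulation array would collect before the weights are updated in place.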
• Study the mechanism and learning algorithm of deep belief nets and restricted Boltzmann machines;
• Study the principle of convolutional neural network layers and how they efficiently work on two-dimensional images;
• Learn the Matlab code of deep belief nets based on memristive devices (I will provide the code);
• Implement the convolutional layer in the deep belief nets and continue the simulations of the neural networks;
• Run the code on the GPU server in the lab;
• Tune the network structures for the best performance on various deep learning datasets;
• Study the dependence of the network performance on the non-idealities of memristive devices.
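As background for the convolutional-layer task, the sketch below shows why convolution shrinks the weight array: one small kernel is shared across the whole image. The image size, kernel size, and values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d_valid(image, kernel):
    """Naive 'valid' 2D cross-correlation, as used in CNN layers."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output unit sees only a kh x kw patch, reusing the same kernel.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = rng.random((28, 28))       # e.g. one MNIST-sized input
kernel = rng.normal(size=(5, 5))   # one shared 5x5 filter
feat = conv2d_valid(image, kernel)
print(feat.shape)  # (24, 24)

# Parameter comparison: a fully connected layer mapping 28x28 -> 24x24
# would need 784 * 576 = 451,584 weights; the shared kernel needs only 25.
```

The same weight-sharing argument applies to the memristive array and the CD accumulation array: both scale with the kernel size rather than with the product of the layer sizes.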
Prerequisites: Matlab/Python; Deep learning.
Supervisor: Wei Wang: email@example.com, Eric Herbelin : firstname.lastname@example.org – ASIC2 Lab, Mayer 961