Mixed-signal architecture for DNN
2020 Review: An Updated Survey of Efficient Hardware Architectures for Accelerating Deep Convolutional Neural Networks Maurizio Capra
2020 A Survey of Accelerator Architectures for Deep Neural Networks Yiran Chen
Although the explosion of big data applications is driving the development of ML, it also imposes severe challenges in data-processing speed and scalability on conventional computer systems.
2013 Finding a roadmap to achieve large neuromorphic hardware systems Jennifer Hasler and Bo Marr
2019 Analog Architecture Complexity Theory Empowering Ultra-Low Power Configurable Analog and Mixed Mode SoC Systems Jennifer Hasler
2020 Mixed-precision architecture based on computational memory for training deep neural networks S. R. Nandakumar
We propose a mixed-precision architecture that combines a computational memory unit storing the synaptic weights with a digital processing unit and an additional memory unit that stores the accumulated weight updates in high precision. The new architecture delivers classification accuracies comparable to those of floating-point implementations without being constrained by challenges associated with the non-ideal weight update characteristics of emerging resistive memories.
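A minimal sketch of the idea described above (not the authors' code): device weights live at the limited resolution of the resistive memory, while gradients accumulate in a separate high-precision digital buffer and are transferred to the device only in multiples of the smallest programmable conductance change. The class name, the granularity parameter `epsilon`, and the update rule details are illustrative assumptions.

```python
import numpy as np

class MixedPrecisionLayer:
    """Low-precision 'analog' weights + high-precision update accumulator."""

    def __init__(self, shape, epsilon=0.05, rng=None):
        self.epsilon = epsilon            # assumed smallest programmable weight change
        self.chi = np.zeros(shape)        # high-precision accumulated weight updates
        self.rng = rng or np.random.default_rng(0)
        # quantized "device" weights, stored at low precision
        self.w = self.quantize(self.rng.standard_normal(shape) * 0.1)

    def quantize(self, w):
        # emulate the limited resolution of the computational memory
        return np.round(w / self.epsilon) * self.epsilon

    def apply_gradient(self, grad, lr=0.1):
        # accumulate the full-precision update in the digital memory unit
        self.chi -= lr * grad
        # write to the device only in whole multiples of epsilon
        pulses = np.trunc(self.chi / self.epsilon)
        self.w = self.quantize(self.w + pulses * self.epsilon)
        self.chi -= pulses * self.epsilon  # keep the sub-epsilon residual

layer = MixedPrecisionLayer((2, 2), epsilon=0.05)
layer.apply_gradient(np.full((2, 2), -0.2))  # accumulates 0.02 < epsilon: no device write yet
```

Because sub-threshold updates are retained in `chi` rather than discarded, small gradients eventually reach the device, which is what decouples training accuracy from the coarse, non-ideal update granularity of the memory cells.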
YouTube talk: Working at the Intersection of Machine Learning, Signal Processing, Sensors, and Circuits Dina Katabi
New devices
2020 NeuroMem: Analog Graphene-Based Resistive Memory for Artificial Neural Networks Heba Abunahla
2018 Li-ion ECRAM as Scalable Synaptic Cell for High-Speed, Low-Power Neuromorphic Computing Jianshi Tang