1. Analog in-memory subthreshold deep neural network accelerator
- Authors
Dennis Sylvester, Laura Fick, Malav Parikh, David Blaauw, Skylar Skrzyniarz, and David Fick
- Subjects
Artificial neural network, Neuromorphic engineering, Subthreshold conduction, Computer science, Computation, Electronic engineering, Process (computing), Power (physics), Efficient energy use
- Abstract
Low duty-cycle mobile systems can benefit from ultra-low power deep neural network (DNN) accelerators. Analog in-memory computational units store synaptic weights in on-chip non-volatile arrays and perform current-based calculations. In-memory computation entirely eliminates off-chip weight accesses, parallelizes operation, and amortizes readout power costs by reusing currents. The proposed system achieves a measured power of 900 nW, with an estimated energy efficiency of 0.012 pJ/MAC in a 130 nm SONOS process.
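The current-based calculation described in the abstract can be pictured as a vector-matrix multiply carried out by the storage array itself: weights act as cell conductances, inputs as applied voltages, and each output line sums its cell currents. The following is a minimal behavioral sketch in NumPy under illustrative assumptions; the dimensions, conductance and voltage ranges are not taken from the paper.

```python
import numpy as np

# Behavioral sketch of a current-domain in-memory vector-matrix multiply.
# Sizes and values are illustrative assumptions, not figures from the paper;
# the actual design stores weights in a 130 nm SONOS non-volatile array and
# operates its cells in subthreshold.

rng = np.random.default_rng(seed=0)

n_in, n_out = 64, 16                       # hypothetical layer dimensions

# Synaptic weights represented as cell conductances (siemens), held in the
# array itself, so no off-chip weight access is needed.
G = rng.uniform(1e-9, 10e-9, size=(n_out, n_in))

# Input activations applied as small voltages (volts) across the cells.
V = rng.uniform(0.0, 0.3, size=n_in)

# Each output line sums its cells' currents (Kirchhoff's current law), so a
# single array read performs all multiply-accumulates in parallel:
#   I_out[i] = sum_j G[i, j] * V[j]
I_out = G @ V
print(I_out)                               # output currents, one per neuron (amperes)
```

For scale, the abstract's figures imply a sustained throughput of roughly 900 nW / 0.012 pJ per MAC ≈ 75 million MACs per second.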
- Published
2017