Information for Paper ID 1590
Paper Information:
Paper Title: Zero-Aware Regularization for Energy-Efficient Inference on Akida Neuromorphic Processor 
Student Contest: Yes 
Affiliation Type: Academia 
Keywords: Edge AI, Energy efficiency, Neuromorphic Chips, Regularization, Spiking Neural Networks 
Abstract: Spiking Neural Networks (SNNs) and their hardware accelerators have emerged as promising systems for advanced cognitive processing with low power consumption. Although the development of SNN hardware accelerators is very active, research on the intelligent use of these accelerators remains limited. This study focuses on Akida, a commercially available neuromorphic SNN processor, and presents a novel training method designed to reduce inference energy by exploiting the unique architecture of the hardware. Specifically, we apply sparsity constraints to neuron activations and synaptic connection weights, aiming to minimize the number of firing neurons while taking Akida's batch spike-processing feature into account. The proposed method was applied to a network consisting of three convolutional layers and two fully connected layers. On the MNIST image classification task, activations became 76.1% sparser and weights became 22.1% sparser, resulting in a 13.8% reduction in energy consumption per image. 
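The abstract describes sparsity constraints on activations and weights but does not give the exact form of the penalty. The sketch below is a minimal illustration, assuming an L1-style regularizer added to the task loss; the function name, coefficients, and penalty form are assumptions for illustration, not the paper's actual method.

```python
import torch
import torch.nn as nn

def zero_aware_penalty(model: nn.Module, activations, act_coeff=1e-4, w_coeff=1e-5):
    """Hypothetical zero-aware regularizer: L1-style penalties that push
    activations and weights toward zero, so fewer neurons fire on
    event-driven hardware such as Akida."""
    # Penalize nonzero activations (fewer spikes per inference).
    act_term = sum(a.abs().mean() for a in activations)
    # Penalize nonzero trainable weights (sparser synaptic connections).
    w_term = sum(p.abs().mean() for p in model.parameters() if p.requires_grad)
    return act_coeff * act_term + w_coeff * w_term

# Usage during training (activations collected e.g. via forward hooks):
# loss = task_loss + zero_aware_penalty(model, collected_activations)
```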
Track ID: 8.2 
Track Name: Spiking Neural Networks and Systems 
Final Decision: Accept as Poster 
Session Name: Neural Learning Systems: Circuits & Systems III (Poster) 
Author Questions:
TCAS: Yes
WiCAS: Yes
YP: Yes
Theme Information:
Selected Theme(s):
Circuits and Systems for Communication
AI in Circuits and Systems