Spiking Neural Network Accelerator Architecture for Differential-Time Representation using Learned Encoding

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Spiking Neural Networks (SNNs) have garnered attention in recent years due to their increased energy efficiency and lower operational complexity compared to traditional Artificial Neural Networks (ANNs). Two important questions when implementing SNNs are how to best encode existing data into spike trains and how to efficiently process these spike trains in hardware. This paper addresses both problems by incorporating the encoding into the learning process, allowing the network to learn the spike encoding alongside the weights. Furthermore, this paper proposes a hardware architecture based on a recently introduced differential-time representation for spike trains, which decouples spike time from processing time. Together, these contributions lead to a feedforward SNN using only Leaky Integrate-and-Fire (LIF) neurons that surpasses 99% accuracy on the MNIST dataset while remaining implementable on medium-sized FPGAs with inference times of less than 295 µs.
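The abstract's network is built entirely from LIF neurons. The following is a minimal sketch of a generic discrete-time LIF update; the parameter names (`beta`, `v_th`), the constant input, and the reset-to-zero rule are illustrative assumptions, not details taken from the paper's architecture.

```python
def lif_step(v, i_in, beta=0.9, v_th=1.0):
    """One time step of a discrete-time LIF neuron:
    leak, integrate input, spike and reset on threshold.
    beta and v_th are illustrative values, not from the paper."""
    v = beta * v + i_in      # leaky integration of the input current
    spike = v >= v_th        # fire when the membrane potential crosses threshold
    if spike:
        v = 0.0              # hard reset after a spike
    return v, spike

# Drive the neuron with a constant input and collect its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, s = lif_step(v, 0.3)
    spikes.append(int(s))
```

With these example parameters the neuron fires periodically once the leaky integration of the constant input crosses the threshold, producing a regular spike train.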
Original language: English
Title of host publication: 2025 IEEE International Symposium on Circuits and Systems (ISCAS)
Publisher: IEEE
Pages: 1-5
Number of pages: 5
Edition: 1
ISBN (Electronic): 979-8-3503-5683-0
ISBN (Print): 979-8-3503-5684-7
DOIs
Publication status: Published - 27 Jun 2025
Event: 2025 IEEE International Symposium on Circuits and Systems (ISCAS) - London, United Kingdom
Duration: 25 May 2025 – 28 May 2025

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
ISSN (Print): 0271-4310

Conference

Conference: 2025 IEEE International Symposium on Circuits and Systems (ISCAS)
Period: 25.05.2025 – 28.05.2025

Fields of science

  • 202034 Control engineering
  • 202017 Embedded systems
  • 202015 Electronics
  • 202030 Communication engineering
  • 202028 Microelectronics
  • 202027 Mechatronics
  • 102019 Machine learning
  • 202040 Transmission technology
  • 202 Electrical Engineering, Electronics, Information Engineering
  • 202025 Power electronics
  • 202041 Computer engineering
  • 202037 Signal processing
  • 202023 Integrated circuits
  • 202036 Sensor systems
  • 202022 Information technology

JKU Focus areas

  • Digital Transformation