Abstract
To ease the analysis of error propagation in neuromorphic computing and to better understand spiking neural networks (SNNs), we address the mathematical analysis of SNNs as endomorphisms that map spike trains to spike trains. A central question is the adequate structure for a space of spike trains and its implications for the design of error measurements of SNNs, including time delay, threshold deviations, and the design of the reinitialization mode of the leaky integrate-and-fire (LIF) neuron model. First, we identify the underlying topology by analyzing the closure of all sub-threshold signals of a LIF model. For zero leakage this approach yields the Alexiewicz topology, which we adapt to LIF neurons with arbitrary positive leakage. As a result, LIF can be understood as spike-train quantization in the corresponding norm. In this way we obtain various error bounds and inequalities, such as a quasi-isometry relation between incoming and outgoing spike trains. Another result is a Lipschitz-style global upper bound on error propagation and a related resonance-type phenomenon.
| Original language | English |
|---|---|
| Article number | 128190 |
| Pages (from-to) | 128190 |
| Number of pages | 1 |
| Journal | Neurocomputing |
| Volume | 601 |
| DOIs | |
| Publication status | Published - 07 Oct 2024 |
Fields of science
- 202017 Embedded systems
- 202036 Sensor systems
- 102019 Machine learning
- 202 Electrical Engineering, Electronics, Information Engineering
- 202015 Electronics
- 202022 Information technology
- 202037 Signal processing
JKU Focus areas
- Digital Transformation
Projects
- 1 Active
Spike-based Sampling and Learning
Moser, B. (Researcher) & Lunglmayr, M. (PI)
01.01.2023 → 31.12.2026
Project: Funded research › FFG - Austrian Research Promotion Agency