Abstract
Deep Neural Networks are known to be very demanding in
terms of computing and memory requirements. Due to the
ever-increasing use of embedded systems and mobile devices with a
limited resource budget, designing low-complexity models without
sacrificing too much of their predictive performance has gained great
importance. In this work, we investigate and compare several well-known
methods for reducing the number of parameters in neural networks. We
further put these into the context of a recent study on
the effect of the Receptive Field (RF) on a model's performance,
and empirically show that we can achieve high-performing
low-complexity models by applying specific restrictions on the RFs, in
combination with parameter-reduction methods. Additionally, we
propose a filter-damping technique for regularizing the RF of models
without altering their architecture or changing their parameter
counts. We show that incorporating this technique improves
performance in various low-complexity settings such as pruning
and decomposed convolutions. Using our proposed filter damping,
we achieved the 1st rank at the DCASE-2020 Challenge in the task
of Low-Complexity Acoustic Scene Classification.
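The filter-damping idea above can be illustrated with a short sketch. This is a minimal, assumption-laden illustration rather than the paper's exact method: the function names `damping_mask` and `damp_filters` and the linear decay schedule are hypothetical choices. The point it demonstrates is the one stated in the abstract: existing convolution weights are rescaled toward zero away from the kernel center, shrinking the effective RF while leaving the architecture and parameter count unchanged.

```python
import numpy as np

def damping_mask(kernel_size, min_weight=0.1):
    """Build a mask that decays linearly from 1.0 at the kernel
    center to `min_weight` at the outermost taps (an illustrative
    schedule; the paper's actual damping may differ)."""
    center = (kernel_size - 1) / 2
    # Distance of each tap from the center, normalized to [0, 1].
    dist = np.abs(np.arange(kernel_size) - center) / max(center, 1)
    line = 1.0 - (1.0 - min_weight) * dist
    # Outer product yields a 2-D mask for a square kernel.
    return np.outer(line, line)

def damp_filters(weights, min_weight=0.1):
    """Apply the mask to a conv weight tensor of shape
    (out_channels, in_channels, k, k). No parameters are added
    or removed: existing weights are only rescaled."""
    k = weights.shape[-1]
    return weights * damping_mask(k, min_weight)

# Example: a 3x3 mask keeps the center weight at full strength
# while attenuating the border taps.
mask = damping_mask(3)
damped = damp_filters(np.ones((4, 2, 3, 3)))
```

Because damping only rescales weights, it can be combined with any of the parameter-reduction methods the abstract mentions (e.g. applied to the factors of a decomposed convolution, or to the surviving weights after pruning).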
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the Detection and Classification of Acoustic Scenes and Events 2020 Workshop (DCASE2020) |
| Place of publication | Tokyo, Japan |
| Number of pages | 5 |
| Publication status | Published - Nov 2020 |
Fields of science
- 202002 Audiovisual media
- 102 Computer Sciences
- 102001 Artificial intelligence
- 102003 Image processing
- 102015 Information systems
JKU Focus areas
- Digital Transformation