Abstract
In this paper we present a new beat tracking algorithm that extends an existing state-of-the-art system with a multi-model approach to represent different music styles. The system uses multiple recurrent neural networks, each specialised in a certain musical style, to estimate possible beat positions. It chooses the model whose beat activation function is most appropriate for the input signal and jointly models the tempo and phase of the beats from this activation function with a dynamic Bayesian network. We test our system on three large datasets of various styles and report performance gains of up to 27% over existing state-of-the-art methods. Under certain conditions the system is able to match even human tapping performance.
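The model-selection step described above can be illustrated with a toy sketch. Note that the selection criterion used here (peak-to-mean contrast of the activation function) is a hypothetical stand-in chosen for illustration, not the criterion from the paper, and the model names are invented:

```python
import numpy as np

def select_activation(activations):
    """Pick the model whose beat activation function has the most
    pronounced peaks.

    `activations` maps a model name to a 1-D beat-activation array
    (one value per audio frame). Peak-to-mean contrast is a
    hypothetical selection criterion used only for this sketch.
    """
    def contrast(a):
        a = np.asarray(a, dtype=float)
        return a.max() - a.mean()
    return max(activations, key=lambda name: contrast(activations[name]))

# Two toy "style-specialised models": one produces sharp beat peaks,
# the other a nearly flat (uninformative) activation.
sharp = np.zeros(100)
sharp[::10] = 1.0            # clear peaks every 10 frames
flat = np.full(100, 0.3)     # no discernible beat structure

print(select_activation({"rock_model": sharp, "ballad_model": flat}))
```

The selected activation function would then be passed on to the dynamic Bayesian network, which jointly infers the tempo and the phase of the beats.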
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 15th International Society for Music Information Retrieval Conference (ISMIR 2014) |
| Number of pages | 6 |
| Publication status | Published - 2014 |
Fields of science
- 202002 Audiovisual media
- 102 Computer Sciences
- 102001 Artificial intelligence
- 102003 Image processing
- 102015 Information systems
JKU Focus areas
- Computation in Informatics and Mathematics
- Engineering and Natural Sciences (in general)