Towards Effective "Any-Time" Music Tracking

Andreas Arzt, Gerhard Widmer

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

The paper describes a new method that permits a computer to listen to, and follow, live music in real time by analysing the incoming audio stream and aligning it to a symbolic representation (e.g., a score) of the piece(s) being played. In particular, we present a multi-level music matching and tracking algorithm that, by continually updating and evaluating multiple high-level hypotheses, effectively deals with almost arbitrary deviations of the live performer from the score: omissions, forward and backward jumps, unexpected repetitions, or (re-)starts in the middle of the piece. We also show that additional knowledge about the structure of the piece (which can be computed automatically by the system) can be used to further improve the robustness of the tracking process. The resulting system is discussed in the context of an automatic page-turning device for musicians, but it will be of use in a much wider class of scenarios that require reactive and adaptive musical companions.
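To make the idea of hypothesis-based tracking concrete, the sketch below shows one possible reading of the abstract, not the authors' actual algorithm: each hypothesis is a candidate position in the score, every incoming audio frame updates all hypotheses with a local match cost, and hypotheses are periodically re-seeded at structurally plausible jump targets so that repetitions or restarts in the middle of the piece can be picked up. All names (`Hypothesis`, `MultiHypothesisTracker`), the toy feature vectors, and the cost model are hypothetical illustrations under these assumptions.

```python
# Hypothetical sketch of multi-hypothesis score following (not the authors'
# method): candidate score positions are updated with a local match cost per
# live audio frame, and jump targets allow recovery from large deviations.

import numpy as np


class Hypothesis:
    def __init__(self, position, cost=0.0):
        self.position = position   # current frame index in the score feature sequence
        self.cost = cost           # accumulated match cost (lower is better)


class MultiHypothesisTracker:
    def __init__(self, score_features, jump_targets=(), n_best=3):
        self.score = score_features            # (n_frames, n_dims) reference features
        self.jump_targets = list(jump_targets) # plausible (re-)entry points, e.g. repeat starts
        self.n_best = n_best
        # one hypothesis at the beginning plus one per jump target, so that
        # starts in the middle of the piece can also be tracked
        self.hypotheses = [Hypothesis(0)] + [Hypothesis(t) for t in self.jump_targets]

    def step(self, audio_frame):
        """Advance all hypotheses by one incoming audio frame; return best position."""
        candidates = []
        for h in self.hypotheses:
            # allow the performer to pause, advance normally, or skip ahead slightly
            for delta in (0, 1, 2):
                pos = min(h.position + delta, len(self.score) - 1)
                local = np.linalg.norm(self.score[pos] - audio_frame)
                candidates.append(Hypothesis(pos, h.cost + local))
        # re-seed hypotheses at jump targets so large jumps can be caught later
        best_cost = min(h.cost for h in candidates)
        for t in self.jump_targets:
            candidates.append(Hypothesis(t, best_cost))
        # prune to the n_best cheapest hypotheses
        candidates.sort(key=lambda h: h.cost)
        self.hypotheses = candidates[: self.n_best]
        return self.hypotheses[0].position


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    score = rng.random((200, 12))              # toy "score" feature sequence
    tracker = MultiHypothesisTracker(score, jump_targets=[100])
    # simulate a performance that jumps forward to the middle of the piece
    performance = np.concatenate([score[:50], score[100:160]]) + 0.01 * rng.random((110, 12))
    for frame in performance:
        position = tracker.step(frame)
    print("estimated score position:", position)
```

In this toy setting the tracker follows the opening, then latches onto the re-seeded hypothesis after the simulated jump; a real system would of course use proper audio features and a more refined alignment cost.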
Original language: English
Title of host publication: Proceedings of the Starting AI Researchers' Symposium (STAIRS 2010)
Number of pages: 6
Publication status: Published - 2010

Fields of science

  • 102 Computer Sciences
  • 102001 Artificial intelligence
  • 102003 Image processing
  • 102015 Information systems
  • 202002 Audiovisual media