Abstract
The paper reports on first steps towards automated computational
analysis of a unique and unprecedented corpus
of symbolic performance data. In particular, we focus on
between-hand asynchronies, an expressive device that plays
a particularly important role in Romantic music but has
not been analyzed quantitatively in any substantial way. The
historic data were derived from performances by the renowned
pianist Nikita Magaloff, who played the complete works
of Chopin live on stage, on a computer-controlled grand piano.
The sheer size of this corpus (over 320,000 performed
notes, or almost 10 hours of continuous performance) challenges
existing analysis approaches. The computational steps
include score extraction, score-performance matching, definition
and measurement of the analyzed features, and a
computational visualization tool. We then present preliminary
data to demonstrate the potential of our approach for
future computational modeling and its application in computational
musicology.
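The measurement step described above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' actual implementation or data format): it assumes score-performance matching has already produced performed notes annotated with their notated score onset and a hand label, and computes, for each score position where both hands play, the asynchrony as the earliest right-hand onset minus the earliest left-hand onset.

```python
# Hypothetical sketch of between-hand asynchrony measurement.
# Assumed input format (illustrative only): each note is a dict with
#   "score_onset" - notated onset position (e.g. in beats),
#   "hand"        - "R" or "L",
#   "perf_onset"  - measured performance onset time in seconds.
from collections import defaultdict

def between_hand_asynchronies(notes):
    """Return {score_onset: asynchrony} for every score position
    where both hands play. Positive values mean the right hand
    sounds after the left hand (melody lag)."""
    by_onset = defaultdict(lambda: {"R": [], "L": []})
    for n in notes:
        by_onset[n["score_onset"]][n["hand"]].append(n["perf_onset"])
    asynchronies = {}
    for onset, hands in by_onset.items():
        if hands["R"] and hands["L"]:
            # Compare the earliest attack in each hand at this position.
            asynchronies[onset] = min(hands["R"]) - min(hands["L"])
    return asynchronies

# Toy example: at beat 0 the melody note arrives 10 ms after the bass.
notes = [
    {"score_onset": 0.0, "hand": "R", "perf_onset": 1.010},
    {"score_onset": 0.0, "hand": "L", "perf_onset": 1.000},
]
print(between_hand_asynchronies(notes))
```

Chord tones complicate the picture (each hand may spread its own chord), so reducing each hand to its earliest attack is one of several plausible conventions; per-note pairings would be an alternative design choice.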
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 6th Sound and Music Computing Conference (SMC 2009) |
| Number of pages | 6 |
| Publication status | Published - 2009 |
Fields of science
- 102 Computer Sciences
- 102001 Artificial intelligence
- 102003 Image processing
- 102015 Information systems
- 202002 Audiovisual media