InstruData – Data-driven Musical Instrument Education

cyvy Research Project

Learning to play a musical instrument is a long and difficult endeavor. Not everyone can afford the help of a professional teacher, and even with such help, feedback is limited in terms of latency and expressiveness. To tackle these problems, we will design a collection of new data-driven techniques and tools. The main idea is to systematically record the musical practice data of students and feed it back to them through smart, visual interfaces. With a Visual Analytics web tool, we will allow students, teachers, and professional musicians to detect errors and improve their playing style in entirely new ways. By additionally recording motion data, we will also be able to convey fingering instructions or correct poses through augmented reality displays that visualize information directly attached to a physical musical instrument.

As musical data can be complex, and notes or audio signals recorded from instruments are usually noisy, AI is a useful, if not necessary, vehicle for data processing and analysis. We will follow a human-centered design process that involves musicians and music teachers of different backgrounds and skill levels in the data acquisition, development, and evaluation of our techniques and tools. Our goal is to provide ready-to-use music education tools, reusable data processing techniques, and datasets comprising notes, audio, motion capture, and other features recorded from instruments and players.