A Visual Exploration Tool for Sheet Music

Musicians often use online repositories to find sheet music. However, it can be challenging to find pieces with the desired instrumentation, mood, tempo, difficulty, and other features. In addition, users may have to visit multiple sites to gather all the information they need before purchasing a score, such as score previews and audio recordings. Users may also gravitate toward familiar composers or those highlighted in search results, reinforcing bias toward well-known names. To address these challenges, we used a visual approach instead of the more common text-dependent filtering to create a more intuitive and user-friendly experience.


We created visualizations representing high-level features of each score (tempo, mode, note density, time signature, instrumentation, and difficulty level) in a single image. Each musical feature maps to a different visual element (color, shape, icons, and pattern density). Users interact with the tool by selecting from a set of visualizations, each representing one piece. For the selected piece, users can view and download the score, listen to a MIDI version, and listen to an audio recording in an embedded Spotify player.
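The mapping described above can be sketched in code. The following is a minimal, hypothetical illustration of how score features might be encoded as visual attributes for a single glyph; the feature names, value ranges, and specific encodings are assumptions for illustration, not the project's actual implementation.

```python
# Hypothetical sketch of mapping score features to visual elements.
# All encodings below (color choices, ranges, shapes) are illustrative
# assumptions, not the tool's actual design.

def encode_score(features):
    """Map high-level score features to visual attributes for one image."""
    # Mode -> color family (e.g., warm for major, cool for minor)
    color = "orange" if features["mode"] == "major" else "blue"
    # Tempo (BPM) -> pattern density, normalized to [0, 1]
    density = min(max((features["tempo"] - 40) / 160, 0.0), 1.0)
    # Time signature (beats per bar) -> glyph shape
    shape = {2: "square", 3: "triangle", 4: "circle"}.get(
        features["beats_per_bar"], "hexagon")
    # Difficulty level (1-5) -> number of icons displayed
    icons = int(features["difficulty"])
    return {"color": color, "density": density, "shape": shape, "icons": icons}

glyph = encode_score(
    {"mode": "minor", "tempo": 120, "beats_per_bar": 3, "difficulty": 4})
print(glyph)
```

In a design like this, each feature stays visually independent, so two pieces can be compared attribute by attribute at a glance.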

The prototype serves as a proof of concept for using visualizations to explore a database of sheet music. Preliminary user feedback suggests that users found the tool easy to navigate and that the overall design was effective in conveying high-level information about musical features in a visual format. It addresses several issues identified in our user research by enabling comparison of pieces across musical features and by making score previews, feature summaries, and audio accessible within a single tool.


Future work for this project would explore solutions to the challenges of implementing the tool at a larger scale, including streamlining the data processing, addressing cross-browser and mobile compatibility, and validating our measure of difficulty. A larger and more diverse set of pieces could be used to test whether this visual approach helps reduce bias in the selection of sheet music.

Laney Light

Laney Light recently completed an M.S. in Music Technology, where she worked with Dr. Claire Arthur in the Computational and Cognitive Musicology lab. Before coming to Georgia Tech, she was a statistician and researcher in the health care field for 12 years (M.S. and B.A., Clemson University, Mathematical Sciences). Laney loves working with data and applying her analytical skills in the music industry. She applies a systematic approach to the ways in which we create, listen to, and interact with music. Her interests include computational musicology, data visualization, cognitive science, multimedia, UI/UX research and design, and creative computing. She is a flute player and has played in many community ensembles and flute choirs. A South Carolina native, Laney is relocating to Raleigh, NC this summer and is currently searching for the perfect job opportunity.

 
