This project explores several ways of visualizing sets of extracted audio features in real time. The visualizations are implemented as a toolkit for the Max/MSP/Jitter programming environment. The primary purpose is to visualize timbral change in the spirit of exploratory data analysis. The program comprises four main parts: feature extraction, visualization, similarity computation, and audio control. Features are computed using a combination of pre-existing libraries, such as the zsa.descriptors library and the CNMAT analyser object. In addition, we introduce a simple notion of timbral distance that can be used in real-time performance situations, and we evaluate its behavior on a set of different textures. Finally, the visualizations are used to inform the control of audio effects via feature trajectories.
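The abstract does not specify how the timbral distance is computed. A common baseline is a Euclidean distance between per-frame feature vectors (e.g. spectral centroid, flux, loudness) that have each been normalized to a comparable range. The sketch below is a minimal illustration under that assumption, in Python rather than a Max/MSP patch; the function name, the feature set, and the choice of metric are all hypothetical and need not match the paper's actual implementation.

```python
import math

def timbral_distance(a, b):
    """Euclidean distance between two frames of normalized audio
    features. Assumes both vectors have the same length and that
    each feature has already been scaled to a comparable range
    (e.g. [0, 1]), so no single descriptor dominates the distance.
    """
    if len(a) != len(b):
        raise ValueError("feature vectors must have equal length")
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two hypothetical frames of normalized features
# (e.g. centroid, flux, loudness):
frame_a = [0.42, 0.10, 0.75]
frame_b = [0.40, 0.12, 0.70]
print(round(timbral_distance(frame_a, frame_b), 4))  # → 0.0574
```

In a real-time setting, such a distance would be evaluated once per analysis frame against either the previous frame (to track timbral change) or a stored reference frame (to measure similarity to a target texture).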
Researcher: Kai Siedenburg
Paper: An Exploration of Real-Time Visualizations of Musical Timbre