
Sync Lost

February 19, 2010

The popularization of electronic instruments and computers, together with the broad and easily accessible information on the internet, enabled the emergence of countless rhythmic structures, giving rise to new styles and sub-styles within contemporary electronic music.

Created in Processing, SyncLost is a multi-user immersive installation on the history of electronic music. The project’s objective is to create an interface where users can view all the connections between the main styles of electronic music through visual and audible feedback.

When you click on a particular node, all of its connections are shown – which styles it comes from and which styles it has influenced – the associated music plays, and representative textual information is displayed. The visual feedback is generated in real time, according to the user’s choice. The rhythm of the music serves as a parameter for the visualization of the style’s icon, creating multiple sound-driven visualizations. You control the visualization with Wiimote controllers, while audible feedback is delivered through wireless headphones.
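The post doesn’t include source code, but the selection behaviour it describes – pick a style node, trace where it came from and what it influenced – amounts to a walk over a directed influence graph. Below is a minimal Python sketch of that idea; the style names and the `influences` edges are illustrative placeholders, not data from the installation.

```python
# Minimal sketch of the node-selection logic described above (assumed, not
# taken from SyncLost): styles form a directed graph of influences, and
# picking a node surfaces everything upstream (origins) and downstream.

# Hypothetical influence edges: parent style -> derived styles.
influences = {
    "house": ["acid house", "deep house"],
    "acid house": ["trance"],
    "deep house": [],
    "trance": [],
    "techno": ["acid house"],
}

def origins(style, graph):
    """All styles that directly or indirectly influenced `style`."""
    found = set()
    for parent, children in graph.items():
        if style in children:
            found.add(parent)
            found |= origins(parent, graph)
    return found

def descendants(style, graph):
    """All styles directly or indirectly derived from `style`."""
    found = set()
    for child in graph.get(style, []):
        found.add(child)
        found |= descendants(child, graph)
    return found

if __name__ == "__main__":
    picked = "acid house"
    print("came from:", origins(picked, influences))
    print("influenced:", descendants(picked, influences))
```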

via


MuSA.RT – Music on the Spiral Array

February 15, 2010

Music on the Spiral Array. Real-Time (MuSA.RT) explores the use of Chew’s Spiral Array model in real-time analysis and scientific visualization of tonal structures in music.

Tone-based music consists of sequential arrangements of notes that generate pitch structures over time. An expert listener is able to ascertain these structures while listening. MuSA.RT allows listeners to see tonal structures as they hear them. Real-time tracking of tonal patterns in music also has widespread applications in music analysis, information retrieval, performance analysis, and expression synthesis.

MuSA.RT shows the names of the pitches played, the triads, and the keys as the music unfolds in a performance. The structures are computed and visualized using the three-dimensional Spiral Array model. Two trackers, called Centers of Effect (CEs), one for long-term and one for short-term information, show the history of the tonal trajectories.
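As a rough numerical illustration (not code from MuSA.RT): Chew’s Spiral Array places pitch classes on a helix indexed along the line of fifths, and a Center of Effect can be taken as a weighted centroid of recently sounded pitch positions. The radius, rise, and decay values in the Python sketch below are assumed, illustrative parameters.

```python
# Numerical sketch of the Spiral Array idea: pitches sit on a helix indexed
# by the line of fifths, and a Center of Effect (CE) is a weighted centroid
# of recently sounded pitch positions. Parameters here are illustrative.
import numpy as np

R, H = 1.0, 0.4  # helix radius and per-step rise (assumed values)

# Index along the line of fifths, relative to C (C=0, G=1, D=2, ..., F=-1).
FIFTHS = {"C": 0, "G": 1, "D": 2, "A": 3, "E": 4, "B": 5,
          "F#": 6, "F": -1, "Bb": -2, "Eb": -3, "Ab": -4, "Db": -5}

def pitch_position(name):
    """3-D position of a pitch class on the spiral."""
    k = FIFTHS[name]
    return np.array([R * np.sin(k * np.pi / 2),
                     R * np.cos(k * np.pi / 2),
                     k * H])

def center_of_effect(pitches, decay=0.8):
    """Weighted centroid of recent pitches; older notes get smaller weights."""
    weights = np.array([decay ** age for age in range(len(pitches))][::-1])
    points = np.array([pitch_position(p) for p in pitches])
    return (weights[:, None] * points).sum(axis=0) / weights.sum()

# A C-major arpeggio pulls the CE toward the C/E/G region of the spiral.
print(center_of_effect(["C", "E", "G", "C"]))
```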

The three-dimensional model dances to the rhythm of the music, spinning smoothly so that the current triad forms the background for the CE trails. The real-time MIDI (Musical Instrument Digital Interface) input can be captured from an acoustic piano through a Moog piano bar.

MuSA.RT was designed using François’ Software Architecture for Immersipresence, a general formalism for the design, analysis and implementation of complex and interactive software systems.

Project site: http://www-rcf.usc.edu/~mucoaco/MuSA.RT/


Decoding the “EROICA”

December 17, 2009

The graph above plots tempi in the first movement, in terms of average beats per minute; the fastest is Hermann Scherchen, at 174.58, and the slowest is Otto Klemperer, in 1970, at 110.74.

Read more: http://www.newyorker.com/online/blogs/alexross

(Thanks for the tip, Adam)


The Landscape of music

December 6, 2009

From AT&T’s labs. A nifty geographic representation of musical artists. Zoom in and out to find artists.

Creator: AT&T

Uses: GMAP – a technique for visualizing relations and structures as maps


Using Visualizations for Music Discovery

October 22, 2009

Hot off the presses, here are the slides for the tutorial that Justin and Paul are presenting at ISMIR 2009 on October 26.

Note that the live presentation will include many demonstrations and videos of visualizations that are just not practical to include in a PDF. If you have the chance, be sure to check out the tutorial at ISMIR in Kobe on the 26th.


Last.fm tube tags

October 20, 2009


Last.fm has added a few visualizations to the VIP (subscribers only) section of their Playground. One visualization is called Tube Tags – it represents your listening history in the style of the London Tube map. Each colored line represents a genre / social tag.

It’s an attractive visualization drawing on the design of Harry Beck, the original Tube map designer.


An Exploration of Real-Time Visualizations of Musical Timbre

October 16, 2009

This project explores several different ways of visualizing sets of extracted audio features in real time. These visualizations are realized as a toolkit for the Max/MSP/Jitter programming environment. The primary purpose is to visualize timbral changes in the sense of exploratory data analysis. The program has four main parts: feature extraction, visualization, similarity, and audio control. Features are calculated using a combination of pre-existing libraries, such as the zsa.descriptors and the CNMAT analyzer~ object. Additionally, we introduce a simple notion of timbral distance, which can be used in real-time performance situations, and present its performance for a set of different textures. The visualizations are further used to inform the control of audio effects by feature trajectories.
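The toolkit itself lives in Max/MSP/Jitter, but the underlying computation – framewise spectral features plus a distance between feature vectors – can be sketched compactly. The NumPy snippet below is an illustration of that idea with assumed features and parameters, not a port of the toolkit or of its timbral-distance measure.

```python
# Illustrative sketch (not the Max/MSP toolkit): framewise spectral features
# and a simple Euclidean "timbral distance" between feature vectors.
import numpy as np

def frame_features(frame, sr):
    """Spectral centroid and log energy for one audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    centroid = (freqs * spectrum).sum() / (spectrum.sum() + 1e-12)
    energy = np.log(np.sum(frame ** 2) + 1e-12)
    return np.array([centroid, energy])

def timbral_distance(a, b):
    """Distance between two feature vectors; smaller means more similar timbre."""
    return float(np.linalg.norm(a - b))

if __name__ == "__main__":
    sr, n = 44100, 1024
    t = np.arange(n) / sr
    sine = np.sin(2 * np.pi * 440 * t)                           # pure tone
    noise = np.random.default_rng(0).standard_normal(n) * 0.1    # noisy texture
    print(timbral_distance(frame_features(sine, sr), frame_features(noise, sr)))
```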

Researcher: Kai Siedenburg
Paper: An Exploration of Real-Time Visualizations of Musical Timbre