Archive for the ‘research’ Category

Metasyn – Interactive Information Visualizer

April 5, 2010

Metasyn is an interface that allows visitors to explore the collection of contemporary art in Roskilde. The visualization includes an interactive 3D browser that is among the best I’ve seen. Items are organized in the space as follows:

The objects are lined up vertically by year, showing the distribution of objects over time. For a given object, its vertical order is a product of the ‘grade of dominance’ of the related artist. Objects made by artists whose works occur frequently in the collection are placed closer to the ground plane. This results in an organisation where the most dominant artists are represented close to ‘the core’ of the structure, while the lesser-known artists end up in the periphery. This decision was made to support the impression of exploring the unknown in the outer areas of the collection, and, additionally, to increase the chances that the museum’s choice of popular artists is promoted.
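
A rough sketch of that layout rule, as I read it (this is not Carlsen’s code; the object fields and the frequency-based notion of ‘dominance’ below are my own assumptions):

# Sketch of the layout rule described above: objects are binned by year along
# one axis, and an object's height above the ground plane grows as its
# artist's "grade of dominance" (here: share of the collection) shrinks.
from collections import Counter

def layout(objects):
    """objects: list of dicts with 'artist' and 'year' keys (assumed schema)."""
    counts = Counter(o["artist"] for o in objects)
    total = len(objects)
    positions = []
    for obj in objects:
        dominance = counts[obj["artist"]] / total   # frequent artists -> high dominance
        x = obj["year"]                             # one vertical column per year
        y = 1.0 - dominance                         # dominant artists sit near the ground plane
        positions.append((obj, x, y))
    return positions

demo = [
    {"artist": "A", "year": 1998}, {"artist": "A", "year": 2001},
    {"artist": "A", "year": 2004}, {"artist": "B", "year": 2003},
]
for obj, x, y in layout(demo):
    print(obj["artist"], x, round(y, 2))   # B ends up further from the core than A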

For the patient, be sure to check out the hi-res version of the video.

Created by: Carl Emil Carlsen
Project site: Metasyn

MuSA.RT – Music on the Spiral Array

February 15, 2010

Music on the Spiral Array. Real-Time (MuSA.RT) explores the use of Chew’s Spiral Array model in real-time analysis and scientific visualization of tonal structures in music.

Tone-based music consists of sequential arrangements of notes that generate pitch structures over time. An expert listener is able to ascertain these structures as they unfold. MuSA.RT allows listeners to see tonal structures as they hear them. Real-time tracking of tonal patterns in music also has widespread applications in music analysis, information retrieval, performance analysis, and expression synthesis.

MuSA.RT shows the names of the pitches played, the triads, and the keys, as the music unfolds in a performance. The structures are visualized and computed using the three-dimensional Spiral Array model. Two trackers, called Centers of Effect (CEs), one for long-term and one for short-term information, show the history of the tonal trajectories.
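
For the curious, the Center of Effect idea can be sketched in a few lines: pitches sit on a helix indexed along the line of fifths, and a CE is a duration-weighted centroid of the recently sounded pitches. The radius and height constants below are placeholders rather than Chew’s calibrated parameters:

# Hedged sketch of a Center of Effect on a Spiral-Array-style helix.
import math

R, H = 1.0, 0.4            # helix radius and per-step rise (placeholder values)
FIFTHS_INDEX = {"C": 0, "G": 1, "D": 2, "A": 3, "E": 4, "B": 5,
                "F#": 6, "F": -1, "Bb": -2, "Eb": -3, "Ab": -4, "Db": -5}

def pitch_position(name):
    k = FIFTHS_INDEX[name]                      # index along the line of fifths
    return (R * math.sin(k * math.pi / 2), R * math.cos(k * math.pi / 2), k * H)

def center_of_effect(notes):
    """notes: list of (pitch_name, duration). Returns the duration-weighted centroid."""
    total = sum(d for _, d in notes)
    cx = cy = cz = 0.0
    for name, dur in notes:
        x, y, z = pitch_position(name)
        w = dur / total
        cx, cy, cz = cx + w * x, cy + w * y, cz + w * z
    return (cx, cy, cz)

# A C-major triad pulls the CE toward the C/E/G region of the spiral.
print(center_of_effect([("C", 1.0), ("E", 1.0), ("G", 1.0)]))

The long-term and short-term CEs mentioned above would then just be this centroid computed over windows of different lengths (or with different decay rates).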

The three-dimensional model dances to the rhythm of the music, spinning smoothly so that the current triad forms the background for the CE trails. The real-time MIDI (Musical Instrument Digital Interface) input can be captured from an acoustic piano through a Moog piano bar.
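
The real-time input side is simple in principle; here is a hedged sketch that uses the mido library (my choice, not part of MuSA.RT, which is built on SAI) to turn incoming MIDI note events into the (pitch, duration) pairs a CE tracker could consume:

# Capture live MIDI note events and report each note's name and duration.
import time
import mido

NOTE_NAMES = ["C", "Db", "D", "Eb", "E", "F", "F#", "G", "Ab", "A", "Bb", "B"]
onsets = {}

with mido.open_input() as port:              # default MIDI input port
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            onsets[msg.note] = time.time()
        elif msg.type in ("note_off", "note_on") and msg.note in onsets:
            duration = time.time() - onsets.pop(msg.note)
            print(NOTE_NAMES[msg.note % 12], round(duration, 3))
            # ...feed (pitch_name, duration) into a CE tracker like the one sketched above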

MuSA.RT was designed using François’ Software Architecture for Immersipresence, a general formalism for the design, analysis and implementation of complex and interactive software systems.

Project site: http://www-rcf.usc.edu/~mucoaco/MuSA.RT/

Decoding the “EROICA”

December 17, 2009

The graph above plots tempi in the first movement, in terms of average beats per minute; the fastest is Hermann Scherchen, at 174.58, and the slowest is Otto Klemperer, in 1970, at 110.74.
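
For clarity, “average beats per minute” here is just the total number of beats in the movement divided by the recording’s length in minutes; a toy calculation (the numbers are made up, not taken from any Eroica recording):

# Back-of-the-envelope arithmetic behind "average beats per minute".
def average_bpm(total_beats, duration_seconds):
    return total_beats / (duration_seconds / 60.0)

# A hypothetical 2000-beat movement played in 11 minutes:
print(round(average_bpm(2000, 11 * 60), 2))   # ~181.82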

Read more: http://www.newyorker.com/online/blogs/alexross

(Thanks for the tip, Adam)

The Landscape of music

December 6, 2009

From AT&T’s lab. A nifty geographic representation of musical artists. Zoom in and out to find artists.

Creator:  AT&T

Uses: GMAP – a technique for visualizing relations and structures as maps
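
GMAP itself ships with Graphviz; the sketch below only illustrates the underlying idea, i.e. embed an artist-similarity graph in 2D, cluster it, and shade each cluster as a “country”, using networkx rather than the real gvmap pipeline (the artists and edges are invented):

# Conceptual sketch of the GMAP idea (not the real gvmap implementation).
import networkx as nx

G = nx.Graph()
edges = [("Miles Davis", "John Coltrane"), ("John Coltrane", "Bill Evans"),
         ("Radiohead", "Portishead"), ("Portishead", "Massive Attack")]
G.add_edges_from(edges)

positions = nx.spring_layout(G, seed=42)        # force-directed 2D embedding
clusters = list(nx.connected_components(G))      # stand-in for a real clustering step

for country_id, members in enumerate(clusters):
    for artist in members:
        x, y = positions[artist]
        print(f"country {country_id}: {artist} at ({x:.2f}, {y:.2f})")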

Using Visualizations for Music Discovery

October 22, 2009

Hot off the presses, here are the slides for the tutorial that Justin and Paul are presenting at ISMIR 2009 on October 26.

Note that the live presentation will include many demonstrations and videos of visualizations that are just not practical to include in a PDF. If you have the chance, be sure to check out the tutorial at ISMIR in Kobe on the 26th.

An Exploration of Real-Time Visualizations of Musical Timbre

October 16, 2009

This project explores several different ways of visualizing sets of extracted audio features in real time. These visualizations are realized as a toolkit for the Max/MSP/Jitter programming environment. The primary purpose is to visualize timbral changes in the sense of exploratory data analysis. The program has four main parts: feature extraction, visualization, similarity, and audio control. Features are calculated using a combination of pre-existing libraries, such as zsa.descriptors and the CNMAT analyzer object. Additionally, we introduce a simple notion of timbral distance, which can be used in real-time performance situations, and present its performance for a set of different textures. The visualizations are further used to inform the control of audio effects by feature trajectories.
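
The paper defines its own distance measure; as a placeholder, here is one simple way such a timbral distance could look, namely Euclidean distance between z-normalized per-frame feature vectors (spectral centroid, flux, and loudness are assumed features, and the numbers are invented):

# Placeholder notion of "timbral distance" (not necessarily Siedenburg's measure).
import math

def znorm(values):
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5 or 1.0
    return [(v - mean) / std for v in values]

def timbral_distance(frame_a, frame_b, history):
    """frame_*: lists of raw feature values; history: past frames used for
    normalization so no feature dominates just by its scale."""
    columns = [znorm(col) for col in zip(*(history + [frame_a, frame_b]))]
    a = [col[-2] for col in columns]
    b = [col[-1] for col in columns]
    return math.dist(a, b)

history = [[1500.0, 0.10, -20.0], [1600.0, 0.12, -19.0], [2200.0, 0.30, -12.0]]
print(round(timbral_distance([1550.0, 0.11, -19.5], [2300.0, 0.33, -11.0], history), 3))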

Researcher: Kai Siedenburg
Paper: An Exploration of Real-Time Visualizations of Musical Timbre

Visualizing emotion in lyrics

September 11, 2009

[Image: Synesketch visualizations of Ekman’s six emotions alongside two prototype screenshots]

Joris Klerkx has built a visualizer of the emotions in lyrics. Joris has integrated a karaoke player with Synesketch, a framework for visualizing the six basic emotions defined by Ekman (happiness, anger, fear, surprise, sadness, disgust). The player takes a song, plays it, and for each line of the lyrics, the strongest emotion of that line is visualized. In the image above, on the left-hand side, you’ll see the six emotions and their visualizations; on the right-hand side, two screenshots of demos of the prototype.
Some video of the player in action:

  • Thriller by Michael Jackson: the emotions fear, anger, sadness & disgust are clearly visible toward the end.
  • Shiny Happy People by REM: pretty happy.

Joris points out that it can be interesting to see how the visualizations contrast with how the song sounds, since often the emotion and mood of a song’s lyrics differ from how the song actually sounds.
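
Synesketch’s actual recognizer is considerably more sophisticated, but the per-line idea can be sketched with a toy keyword lexicon over Ekman’s six categories (the word lists below are invented placeholders, not Synesketch’s lexicon):

# Toy sketch: score each lyric line against a keyword lexicon for Ekman's six
# emotions and report the strongest one for visualization.
LEXICON = {
    "happiness": {"happy", "shiny", "love", "smile"},
    "anger":     {"angry", "hate", "rage"},
    "fear":      {"fear", "scared", "thriller", "terror"},
    "surprise":  {"suddenly", "wow"},
    "sadness":   {"sad", "cry", "tears"},
    "disgust":   {"disgust", "rotten"},
}

def strongest_emotion(line):
    words = set(line.lower().split())
    scores = {emo: len(words & vocab) for emo, vocab in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

for line in ["Shiny happy people holding hands", "Cause this is thriller, thriller night"]:
    print(strongest_emotion(line), "<-", line)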

Creator: Joris Klerkx