Archive for the ‘live’ Category


The Landscape of music

December 6, 2009

From AT&T’s lab. A nifty geographic representation of musical artists. Zoom in and out to find artists.

Creator:  AT&T

Uses: GMAP – a technique for visualizing relations and structures as maps


Using Visualizations for Music Discovery

October 22, 2009

Hot off the presses, here are the slides for the tutorial that Justin and Paul are presenting at ISMIR 2009 on October 26.

Note that the live presentation will include many demonstrations and videos of visualizations that are just not practical to include in a PDF. If you have the chance, be sure to check out the tutorial at ISMIR in Kobe on the 26th.


MarGrid – using self-organizing maps to organize music

September 9, 2009


MarGrid is a visualization that uses Self-Organizing Maps to organize music collections into a two-dimensional grid based on music similarity. On the MarGrid website you can find a Flash-based interface that lets you explore a 1,000-song music collection.
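To make the Self-Organizing Map idea concrete, here is a minimal sketch of how a SOM could place songs on a 2D grid by similarity. The feature vectors are random toy data; MarGrid’s actual pipeline extracts audio features with Marsyas, so everything below is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
GRID = 8                        # 8x8 grid of cells
DIM = 4                        # toy audio-feature dimensionality
songs = rng.random((50, DIM))  # 50 songs as random feature vectors

# One prototype vector per grid cell, randomly initialized.
weights = rng.random((GRID, GRID, DIM))

def bmu(x):
    """Best-matching unit: the grid cell whose prototype is closest to x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Train: pull the winning cell (and, decreasingly, its neighbors)
# toward each song, shrinking the learning rate and radius over time.
EPOCHS = 200
for epoch in range(EPOCHS):
    lr = 0.5 * (1 - epoch / EPOCHS)
    radius = max(1.0, (GRID / 2) * (1 - epoch / EPOCHS))
    for x in songs:
        bi, bj = bmu(x)
        ii, jj = np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij")
        dist2 = (ii - bi) ** 2 + (jj - bj) ** 2
        h = np.exp(-dist2 / (2 * radius ** 2))   # neighborhood function
        weights += lr * h[:, :, None] * (x - weights)

# After training, each song maps to a cell; similar songs end up nearby.
placement = [bmu(x) for x in songs]
```

The key property MarGrid relies on is the neighborhood update: because neighboring cells are dragged along with the winner, nearby cells end up with similar prototypes, so grid distance approximates music similarity.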


The MarGrid interface is incorporated into AudioScapes, a framework for prototyping and exploring how touch-based and gestural controllers can be used with state-of-the-art content- and context-aware visualizations. AudioScapes provides well-defined interfaces and conventions for a variety of audio collections, controllers, and visualization methods, so they can be easily combined to create innovative ways of interacting with large audio collections.

Here’s an AudioScapes video that shows MarGrid in an iPhone app designed to help people with disabilities navigate their personal collections. There are more videos worth watching on the AudioScapes site.

Creator:

MarGrid and AudioScapes are projects built by researchers Steven Ness and George Tzanetakis at the University of Victoria. They are built using the venerable Marsyas audio framework.

More Info:


FM4 Soundpark

September 8, 2009


The FM4 Soundpark is a web platform, run by the Austrian public radio station FM4, that visualizes an audio-similarity music space. Soundpark provides purely content-based recommendations based upon a seed track, a 2D visualization based on audio similarity, and an interactive 3D visualization based upon a combination of audio and metadata features. Features of Soundpark:

  • Music Recommendation: The core of all applications is a content-based music similarity function. The similarity is computed automatically from models of the songs’ audio content. Musical instruments and voices exhibit specific frequency patterns in the audio signal. These patterns are estimated with statistical models and used to compute the audio similarity.
  • Soundpark Player: Whenever a visitor of the Soundpark listens to a song, a recommendation of five or more similar songs is provided. These recommendations are visualized in a graph-based representation. Users can interactively explore the similarity space by clicking on songs in the recommendation graph.
  • Soundpark 3D: The entire database of songs in the Soundpark is visualized as an audio landscape of sea and islands. Songs from the same genre inhabit the same islands; within islands, songs are grouped according to their audio similarity. Users can travel through the landscape and explore their own audio path through the database.
  • Automatic generation of playlists: Visitors of the Soundpark can choose a start and an end song from the database, and a playlist of eight more songs “in-between” is automatically computed. The playlist is a smooth transition from the start to the end song. This functionality is no longer online.
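The playlist-generation idea above can be sketched under a simplifying assumption: treat each song as a single point in a feature space (the real system models frame-level spectral statistics), and pick, for each step along the straight line from the start song to the end song, the closest unused song. All data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
features = rng.random((100, 8))   # 100 songs as toy 8-dim feature vectors

def playlist(start, end, n_between=8):
    """Start song, n_between interpolated songs, end song."""
    chosen = [start]
    used = {start, end}
    for k in range(1, n_between + 1):
        t = k / (n_between + 1)
        # Target point k steps along the line from start to end.
        target = (1 - t) * features[start] + t * features[end]
        d = np.linalg.norm(features - target, axis=1)
        d[list(used)] = np.inf        # never repeat a song
        pick = int(np.argmin(d))
        chosen.append(pick)
        used.add(pick)
    return chosen + [end]

pl = playlist(0, 99)   # 10 songs: start, eight "in-between", end
```

Because each intermediate target lies between the endpoints in feature space, the resulting sequence drifts gradually from the sound of the first song toward the sound of the last, which is the “smooth transition” the feature describes.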

More Info:

Creators:


RAMA – Relational Artist Maps

September 7, 2009

Description

RAMA is a prototype web-based application for visualizing and interacting with networks of music artists. It uses data on roughly 200,000 artists and 3 million tags, collected from Last.fm’s API. The data includes artist similarities, associated tags, and popularity.
RAMA provides two simultaneous layers of information:

  1. a graph built from artist similarity data, modeled as a physical system representing nodes as negatively charged particles and edges as springs;
  2. overlaid labels containing user-defined tags.

A number of features aim at providing an enhanced browsing experience: RAMA emphasizes commonalities as well as the main differences between artists, and users can interact with the graph in several ways (changing the graph’s initial ramification R, the depth D, how the ramification decays with depth, and the population factor P). Optionally, users can edit graphs manually, removing artists and expanding an artist’s neighbors.
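The physical model described above (nodes as negatively charged particles, edges as springs) can be sketched in a few lines. The artists and similarity edges below are made up for illustration; RAMA’s real graph comes from Last.fm similarity data.

```python
import numpy as np

rng = np.random.default_rng(2)
artists = ["A", "B", "C", "D", "E"]               # placeholder names
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]  # similarity links

pos = rng.random((len(artists), 2))   # random initial 2D positions

K_REPEL = 0.01    # "charge" strength for node-node repulsion
K_SPRING = 0.1    # spring stiffness for similarity edges
REST = 0.3        # spring rest length
STEP = 0.05       # Euler integration step

for _ in range(500):
    force = np.zeros_like(pos)
    # Repulsion between every pair of nodes (inverse-square, like charges).
    for i in range(len(artists)):
        delta = pos[i] - pos                  # vectors from all nodes to i
        dist = np.maximum(np.linalg.norm(delta, axis=1), 1e-6)
        dist[i] = np.inf                      # no self-force
        force[i] += (K_REPEL * delta / dist[:, None] ** 3).sum(axis=0)
    # Attraction along similarity edges (Hooke's law springs).
    for i, j in edges:
        delta = pos[j] - pos[i]
        dist = max(np.linalg.norm(delta), 1e-6)
        f = K_SPRING * (dist - REST) * delta / dist
        force[i] += f
        force[j] -= f
    pos += STEP * force
```

At equilibrium, similar (edge-connected) artists sit roughly a rest length apart while unrelated artists are pushed away, which is what produces the readable cluster structure in layouts like RAMA’s.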

Creators:

  • Diogo Costa, Luis Sarmento, Fabien Gouyon

Links

More Info