Archive for the ‘discovery’ Category

Metasyn – Interactive Information Visualizer

April 5, 2010

Metasyn is an interface that allows visitors to explore the collection of contemporary art in Roskilde. The visualization includes an interactive 3D browser that is among the best I’ve seen. Items are organized in the space as follows:

The objects are lined up vertically by year, showing the distribution of objects over time. For a given object, its vertical order is a product of the ‘grade of dominance’ of the related artist. Objects made by artists whose works occur frequently in the collection are placed closer to the ground plane. This results in an organisation where the most dominant artists are represented close to ‘the core’ of the structure, while the lesser-known artists end up in the periphery. This decision was made to support the impression of exploring the unknown in the outer areas of the collection, and additionally to increase the chances that the museum’s choice of popular artists is promoted.

If you’re patient, be sure to check out the hi-res version of the video.

Created by: Carl Emil Carlsen
Project site: Metasyn

Sync Lost

February 19, 2010

The popularization of electronic instruments and computers, combined with broad and easily accessible information on the internet, enabled the appearance of countless rhythmic structures, giving rise to new styles and sub-styles within contemporary electronic music.

Created in Processing, SyncLost is a multi-user immersive installation on the history of electronic music. The project’s objective is to create an interface where users can view all the connections between the main styles of electronic music through visual and audible feedback.

When you click on a particular node, all of its connections are shown – where the style comes from and which styles it influenced – the music plays, and representative textual information is displayed. The visual feedback is given in real time, according to the user’s choice. The music’s rhythm serves as a visualization parameter for the style’s icon, creating multiple sound-driven visualizations. You control the visualization with Wiimote controllers, while audible feedback is given through wireless headphones.
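
The interaction boils down to a lookup in a directed influence graph. Here is a minimal sketch of that lookup in Python; the style names and edges are invented placeholders, and this is not the SyncLost (Processing) source.

```python
# Toy influence graph: each style maps to the styles it was influenced by.
# The entries are illustrative only, not SyncLost's actual data.
INFLUENCES = {
    "house": ["disco", "electro"],
    "techno": ["house", "electro"],
    "breakbeat": ["electro"],
    "drum and bass": ["breakbeat", "techno"],
}

def connections(style):
    """Return (origins, descendants) for a selected style node."""
    origins = INFLUENCES.get(style, [])
    descendants = [s for s, sources in INFLUENCES.items() if style in sources]
    return origins, descendants

origins, descendants = connections("techno")
print("influenced by:", origins)   # where the style comes from
print("influenced:", descendants)  # styles that were influenced by it
```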

via

Using Visualizations for Music Discovery

October 22, 2009

Hot off the presses, here are the slides for the tutorial that Justin and Paul are presenting at ISMIR 2009 on October 26.

Note that the live presentation will include many demonstrations and videos of visualizations that are just not practical to include in a PDF. If you have the chance, be sure to check out the tutorial at ISMIR in Kobe on the 26th.

FM4 Soundpark

September 8, 2009

The FM4 Soundpark is a web platform run by the Austrian public radio station FM4 that visualizes an audio-similarity music space. Soundpark incorporates purely content-based recommendations based on a seed track and provides a 2D visualization based on audio similarity, as well as an interactive 3D visualization based on a combination of audio and metadata features. Features of Soundpark:

  • Music Recommendation: The core of all applications is a content-based music similarity function. The similarity is automatically computed from models of the songs’ audio content. Musical instruments and voices exhibit specific frequency patterns in the audio signal. These patterns are estimated with statistical models and used to compute the audio similarity (a minimal sketch of this idea follows the list).
  • Soundpark Player: Whenever a visitor of the Soundpark listens to a song, a recommendation of five or more similar songs is provided. These recommendations are visualized in a graph-based representation. Users can interactively explore the similarity space by clicking on songs in the recommendation graph.
  • Soundpark 3D: The entire database of songs in the Soundpark is visualized as an audio landscape of sea and islands. Songs from the same genre inhabit the same islands; within islands, songs are grouped according to their audio similarity. Users can travel through the landscape and explore their own audio path through the database.
  • Automatic generation of playlists: Visitors of the Soundpark can choose a start and an end song from the database, and a playlist of eight more songs “in-between” is automatically computed. The playlist forms a smooth transition from the start song to the end song. This functionality is no longer online.
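
To make the content-based similarity concrete, here is a hedged sketch that models each track’s MFCC frames with a single Gaussian and compares tracks with a symmetrised KL divergence. This is a common textbook approach to spectral similarity, not necessarily the exact features or models Soundpark uses; it assumes numpy and librosa are available, and the file path is a placeholder.

```python
import numpy as np
import librosa

def track_model(path, n_mfcc=20):
    """Summarize one audio file as a Gaussian (mean, covariance) over MFCC frames."""
    y, sr = librosa.load(path, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, frames)
    return mfcc.mean(axis=1), np.cov(mfcc)

def kl_gauss(m1, c1, m2, c2):
    """KL divergence between multivariate Gaussians N(m1, c1) || N(m2, c2)."""
    d = len(m1)
    c2_inv = np.linalg.inv(c2)
    diff = m2 - m1
    _, logdet1 = np.linalg.slogdet(c1)
    _, logdet2 = np.linalg.slogdet(c2)
    return 0.5 * (np.trace(c2_inv @ c1) + diff @ c2_inv @ diff - d + logdet2 - logdet1)

def distance(model_a, model_b):
    """Symmetric KL: smaller values mean more similar-sounding tracks."""
    (m1, c1), (m2, c2) = model_a, model_b
    return kl_gauss(m1, c1, m2, c2) + kl_gauss(m2, c2, m1, c1)

# seed = track_model("seed_track.mp3")            # placeholder path
# Ranking the collection by distance(seed, track_model(p)) yields the
# five-or-more similar songs shown in the recommendation graph.
```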

More Info:

Creators:

Fidg’t: Visualize

September 7, 2009


The Fidg’t Visualizer allows you to play around with your network. You interface with the Visualizer through Flickr and LastFM tags, using any tag to create a Magnet. Once a Tag Magnet is created, members of the network will gravitate towards it if they have photos or music with that same Tag.

This simple mechanic lets you visualize your Network in a unique way, demonstrating its predisposition towards certain things. What is more popular amongst people in your Network – rock or electronic music? Are photos of buildings more popular than photos of sunsets? Based on how your network reacts to those Tags, you might get an answer. The Visualizer also shows how your Network compares to a random sampling of the networks of other Fidg’t users, letting you see how your network stacks up against others.
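
The magnet mechanic can be pictured as a tiny force simulation: members are pulled toward the magnets whose tags they carry. Below is a toy sketch under that assumption; the member names, tags, and positions are made up, and this is not Fidg’t code.

```python
import random

# Hypothetical tag magnets placed in a 2D space.
magnets = {"rock": (-1.0, 0.0), "electronic": (1.0, 0.0)}

# Hypothetical network members with the tags found on their photos/music.
members = {
    "ann":  {"tags": {"rock", "sunset"},     "pos": [random.uniform(-1, 1), random.uniform(-1, 1)]},
    "ben":  {"tags": {"electronic"},         "pos": [random.uniform(-1, 1), random.uniform(-1, 1)]},
    "cleo": {"tags": {"rock", "electronic"}, "pos": [random.uniform(-1, 1), random.uniform(-1, 1)]},
}

def step(pull=0.1):
    """Move each member a small step toward the average of their matching magnets."""
    for member in members.values():
        targets = [magnets[t] for t in member["tags"] if t in magnets]
        if not targets:
            continue
        tx = sum(x for x, _ in targets) / len(targets)
        ty = sum(y for _, y in targets) / len(targets)
        member["pos"][0] += pull * (tx - member["pos"][0])
        member["pos"][1] += pull * (ty - member["pos"][1])

for _ in range(50):
    step()
# After settling, members cluster around the magnets they share tags with,
# which is what makes the network's leanings visible at a glance.
```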

For good measure, you can also search through the network for certain users, and check out their recent photos and music. This visualizer is just one example of some of the cool Address Book applications you could build on top of our web services.

More Info

The World of Music

September 7, 2009

One of the most beautiful renderings of the music space is shown in The World of Music by researchers at Stanford, MIT and Yahoo!. This visualization shows 10,000 artists and how they are related to each other. The artist relation data is mined from user ratings of artists in the Yahoo! Music service. They use a technique called semidefinite programming (sometimes called semidefinite embedding) to lay out and cluster the data. Semidefinite embedding is a method for mapping high-dimensional data into a lower-dimensional Euclidean vector space.
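
To give a feel for that kind of embedding, here is a hedged sketch that maps a pairwise artist-distance matrix into 2D using classical multidimensional scaling as a simple stand-in; the authors’ actual method is semidefinite embedding, which is more sophisticated, and the tiny distance matrix below is invented.

```python
import numpy as np

def classical_mds(d, dims=2):
    """Embed a symmetric distance matrix d into `dims` Euclidean dimensions."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    b = -0.5 * j @ (d ** 2) @ j               # double-centred squared distances
    vals, vecs = np.linalg.eigh(b)
    top = np.argsort(vals)[::-1][:dims]       # keep the largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))

# Invented pairwise artist distances (0 = identical rating profiles).
dist = np.array([
    [0.0, 0.2, 0.9, 0.8],
    [0.2, 0.0, 0.8, 0.9],
    [0.9, 0.8, 0.0, 0.3],
    [0.8, 0.9, 0.3, 0.0],
])
coords = classical_mds(dist)
print(coords)   # 2D positions: related artists land near each other
```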

More Info

Creator:

David Gleich, Matt Rasmussen, Leonid Zhukov, and Kevin Lang

Databionic Music Miner

September 7, 2009

The Databionic MusicMiner is a browser for music based on data mining techniques. You can create MusicMaps to visualize the similarity of songs and artists. Explore your music and create playlists based on the paradigm of geographical maps! Features include:

  • Automatic parsing of a folder tree with music files (MP3, OGG, WMA, M4A, MP2, WAV).
  • Automatic description of digital audio files by sound.
  • Creation of MusicMaps to navigate the sound space based on the paradigm of geographical maps.
  • Visual creation of playlists.
  • Similarity search in music collection based on sound.
  • Customizable hierarchical browsing of the database by e.g. genre/artist/album or year/artist.
  • Flexible database including the separate storage of several artists per song, albums, and playlists as part of a playlist.
  • Import and export of meta information based on XML.

Creator: This system was developed by the Databionics Research Group at the University of Marburg, Germany. This group has released a number of open source tools that perform data mining tasks such as clustering, visualization and classification with Emergent Self-Organizing Maps. There’s a paper giving an overview of their toolkit here: ESOM-Maps: Tools for clustering, visualization, and classification with Emergent SOM
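
For readers unfamiliar with self-organizing maps, here is a minimal, generic SOM sketch of the kind of map-based clustering such toolkits perform; it is not the Databionics ESOM implementation, and the feature vectors are random stand-ins for real audio descriptors.

```python
import numpy as np

def train_som(data, rows=10, cols=10, epochs=20, lr=0.5, radius=3.0):
    """Train a small rectangular self-organizing map on (n_samples, dim) data."""
    rng = np.random.default_rng(0)
    dim = data.shape[1]
    weights = rng.random((rows, cols, dim))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for epoch in range(epochs):
        r = radius * (1 - epoch / epochs) + 1e-3   # shrinking neighbourhood
        a = lr * (1 - epoch / epochs)              # decaying learning rate
        for x in data:
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)   # best matching unit
            grid_dist = np.linalg.norm(grid - np.array(bmu), axis=2)
            influence = np.exp(-(grid_dist ** 2) / (2 * r ** 2))[..., None]
            weights += a * influence * (x - weights)
    return weights

features = np.random.default_rng(1).random((200, 8))   # pretend audio descriptors
som = train_som(features)
# Assigning each song to its best matching unit gives 2D map positions on
# which similar-sounding songs end up close together, i.e. a "MusicMap".
```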

More Info


MusicBox: Mapping and visualizing music

September 7, 2009

Navigating increasingly large personal music libraries is commonplace.  Yet most music browsers do not enable their users to explore their collections in a guided and manipulable fashion, often requiring them to have a specific target in mind.  MusicBox is a new music browser that provides this interactive control by mapping a music collection into a two-dimensional space, applying principal components analysis (PCA) to a combination of contextual and content-based features of each of the musical tracks. The resulting map shows similar songs close together and dissimilar songs farther apart.  MusicBox is fully interactive and highly flexible: users can add and remove features from the included feature list, with PCA recomputed on the fly to remap the data.  MusicBox is also extensible; we invite other MIR researchers to contribute features to its PCA engine.  MusicBox has been shown to help users find music in their libraries, discover new music, and challenge their assumptions about relationships between types of music.
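
The core mapping step is easy to sketch: centre the per-track feature matrix and project it onto its first two principal components. The snippet below shows that step in a generic way with a random placeholder feature matrix; it is not MusicBox’s code or its actual contextual/content features.

```python
import numpy as np

def pca_2d(features):
    """Project an (n_tracks, n_features) matrix onto its first two principal components."""
    centered = features - features.mean(axis=0)
    # The principal axes are the right singular vectors of the centred data.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T            # (n_tracks, 2) map coordinates

tracks = np.random.default_rng(42).random((100, 12))   # 100 tracks, 12 placeholder features
coords = pca_2d(tracks)
# Re-running pca_2d after adding or removing feature columns is what lets an
# interface remap the collection on the fly when the feature list changes.
```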

Creator

Anita Lillie

More Info

nepTune

September 7, 2009

Description

nepTune is an innovative user interface to music repositories. Given an arbitrary collection of digital music files, nepTune creates a virtual landscape which allows the user to freely navigate in this collection. This is accomplished by automatically extracting features from the audio signal and clustering the music pieces. The clustering is used to generate a 3D island landscape in which the user can freely navigate and hear the closest sounds with respect to his/her current position via a surround sound system. Additionally, knowledge extracted automatically from the Web is incorporated to enrich the landscape with semantic information. More precisely, nepTune displays words that describe the heard music and related images on the landscape to support the exploration.
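
A rough sketch of that pipeline, under simplifying assumptions: placeholder feature vectors, plain k-means for the clustering, and “island height” taken as the local density of tracks on a 2D grid. nepTune’s real implementation differs in its details.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.random((300, 2))    # placeholder 2D layout of audio features

def kmeans(x, k=5, iters=50):
    """Plain k-means; returns cluster labels and centres."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(x[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([x[labels == c].mean(axis=0) if np.any(labels == c) else centers[c]
                            for c in range(k)])
    return labels, centers

labels, centers = kmeans(features)   # labels group the tracks that share an island

# Height map: the more tracks nearby, the higher the terrain -> an "island".
grid = np.linspace(0, 1, 50)
gx, gy = np.meshgrid(grid, grid)
cells = np.stack([gx.ravel(), gy.ravel()], axis=1)
dists = np.linalg.norm(cells[:, None] - features[None], axis=2)
height = np.exp(-(dists ** 2) / 0.005).sum(axis=1).reshape(50, 50)
# Rendering `height` as terrain and placing each track at its feature position
# gives a sea-and-islands landscape; the web-mined terms and images would be
# anchored near the cluster centres.
```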

Developed in 2006 and 2007 by:

Knees, P., Schedl, M., Pohle, T., and Widmer, G. from the Department of Computational Perception

Johannes Kepler Universität Linz

More Info

Search Inside the Music

September 7, 2009

Engaging visualizations and animations in SITM

Exploring the music space with SITM

Description

The goal of the ‘Search Inside the Music’ project is to explore new methods of analyzing, categorizing, indexing and organizing large collections of music to allow us to build more effective tools to explore, discover and recommend music. This project extends music search to search ‘inside the music’, that is, to search not just titles, keywords and artists, but to search and recommend music by music content and context.

One goal of Search Inside the Music is to create new ways to help people explore and discover new music. In particular, SITM uses interactive 3D visualizations of a music similarity space to allow a music listener to explore their music collection, receive recommendations for new music, generate interesting and coherent playlists, and interact with the album artwork of a music collection. The resulting user interface is arguably more engaging and enjoyable to use than currently available interfaces.

Developer

Sun Microsystems Research Lab including: Paul Lamere, Stephen Green, Jeff Alexander, Douglas Eck, Thierry Bertin-Mahieux, Francois Maillet, Rebecca Fiebrink, Kris West

More Info