Archive for the ‘video’ Category

Metasyn – Interactive Information Visualizer

April 5, 2010

Metasyn is an interface that allows visitors to explore the collection of contemporary art in Roskilde. The visualization includes an interactive 3D browser that is among the best I’ve seen. Items are organized in the space as follows:

The objects are lined up vertically by year, showing the distribution of objects over time. For a given object, its vertical order is a product of the ‘grade of dominance’ of the related artist. Objects made by artists whose works occur frequently in the collection are placed closer to the ground plane. This results in an organisation where the most dominant artists are represented close to ‘the core’ of the structure, while the lesser-known artists end up in the periphery. This decision was made to support the impression of exploring the unknown in the outer areas of the collection, and additionally to increase the chances that the museum’s choice of popular artists is promoted.
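
Conceptually, that layout reduces to two coordinates per object: year on one axis and a height derived from the artist’s prevalence in the collection. Here is a minimal Python sketch of the idea; the field names and the frequency-based ‘grade of dominance’ are my own illustrative assumptions, not Metasyn’s actual implementation:

```python
from collections import Counter

def layout(objects):
    """Position artworks: x = year, height = artist dominance.

    Hypothetical sketch of the rule described above: works by artists
    with many pieces in the collection sit near the ground plane
    ('the core'); lesser-known artists drift to the periphery.
    """
    # 'Grade of dominance' approximated as how often an artist occurs.
    counts = Counter(obj["artist"] for obj in objects)
    max_count = max(counts.values())

    placed = []
    for obj in objects:
        dominance = counts[obj["artist"]] / max_count  # 1.0 = most dominant
        placed.append({
            "object": obj["title"],
            "x": obj["year"],      # horizontal axis: time
            "y": 1.0 - dominance,  # height: dominant artists near the ground
        })
    return placed

works = [
    {"title": "A", "artist": "well-known", "year": 1995},
    {"title": "B", "artist": "well-known", "year": 2001},
    {"title": "C", "artist": "obscure", "year": 2001},
]
for p in layout(works):
    print(p)
```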

For the patient, be sure to check out the hi-res version of the video.

Created by: Carl Emil Carlsen
Project site: Metasyn

Musart – music on the spiral array

February 15, 2010

Music on the Spiral Array. Real-Time (MuSA.RT) explores the use of Chew’s Spiral Array model in real-time analysis and scientific visualization of tonal structures in music.

Tone-based music consists of sequential arrangements of notes that generate pitch structures over time. An expert listener is able to ascertain these structures as they unfold. MuSA.RT allows listeners to see tonal structures as they hear them. Real-time tracking of tonal patterns in music also has widespread applications in music analysis, information retrieval, performance analysis, and expression synthesis.

MuSA.RT shows the names of the pitches played, the triads, and the keys as the music unfolds in a performance. The structures are visualized and computed using the three-dimensional Spiral Array model. Two trackers, called Centers of Effect (CEs), one for long-term and one for short-term information, show the history of the tonal trajectories.
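
For readers unfamiliar with the model: the Spiral Array places pitch classes along a helix ordered by the line of fifths, and a Center of Effect is a weighted centroid of recently sounded pitch positions. The Python sketch below illustrates the idea; the helix parameters, decay weighting, and half-life values are illustrative assumptions, not MuSA.RT’s actual parameters:

```python
import math

R, H = 1.0, math.sqrt(2.0 / 15.0)  # helix radius and rise; illustrative values

def pitch_position(k):
    """Position of a pitch class k steps along the line of fifths (C=0, G=1, ...)."""
    return (R * math.sin(k * math.pi / 2),
            R * math.cos(k * math.pi / 2),
            k * H)

def center_of_effect(events, half_life):
    """Weighted centroid of recently sounded pitches.

    events: list of (fifths_index, age_in_seconds). A short half-life
    gives a short-term CE, a long one a long-term CE.
    """
    total, cx, cy, cz = 0.0, 0.0, 0.0, 0.0
    for k, age in events:
        w = 0.5 ** (age / half_life)  # exponential decay with age
        x, y, z = pitch_position(k)
        cx, cy, cz = cx + w * x, cy + w * y, cz + w * z
        total += w
    return (cx / total, cy / total, cz / total)

# C major triad (C=0, E=4, G=1 on the line of fifths), freshly played:
notes = [(0, 0.2), (4, 0.1), (1, 0.0)]
print("short-term CE:", center_of_effect(notes, half_life=0.5))
print("long-term CE: ", center_of_effect(notes, half_life=5.0))
```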

The three-dimensional model dances to the rhythm of the music, spinning smoothly so that the current triad forms the background for the CE trails. The real-time MIDI (Musical Instrument Digital Interface) input can be captured from an acoustic piano through a Moog piano bar.

MuSA.RT was designed using François’ Software Architecture for Immersipresence, a general formalism for the design, analysis and implementation of complex and interactive software systems.

Project site: http://www-rcf.usc.edu/~mucoaco/MuSA.RT/

Using Visualizations for Music Discovery

October 22, 2009

Hot off the presses, here are the slides for the tutorial that Justin and Paul are presenting at ISMIR 2009 on October 26.

Note that the live presentation will include many demonstrations and videos of visualizations that are just not practical to include in a PDF. If you have the chance, be sure to check out the tutorial at ISMIR in Kobe on the 26th.

An Exploration of Real-Time Visualizations of Musical Timbre

October 16, 2009

This project explores several different ways of visualizing sets of extracted audio features in real time. These visualizations are realized in a toolkit for the Max/MSP/Jitter programming environment. The primary purpose is to visualize timbral changes in the sense of exploratory data analysis. The program has four main parts: feature extraction, visualization, similarity, and audio control. Features are calculated using a combination of pre-existing libraries, e.g. the zsa.descriptors and the CNMAT analyzer object. Additionally, we introduce a simple notion of timbral distance, which can be used in real-time performance situations, and present its performance for a set of different textures. The visualizations are further used to inform the control of audio effects by feature trajectories.
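
As a rough illustration of what such a “simple notion of timbral distance” could look like: the toolkit itself lives in Max/MSP/Jitter, so this Python sketch, its feature names, and its normalization ranges are my own assumptions, not the paper’s implementation:

```python
import math

# Hypothetical per-frame feature vectors (spectral centroid, spectral
# flux, loudness), as zsa.descriptors-style extractors might produce.
# The values are made up for illustration.
frame_a = {"centroid": 1800.0, "flux": 0.12, "loudness": -18.0}
frame_b = {"centroid": 2400.0, "flux": 0.30, "loudness": -12.0}

# Rough feature ranges used to normalize each dimension to [0, 1];
# in practice these would be estimated from the incoming signal.
RANGES = {"centroid": (0.0, 8000.0), "flux": (0.0, 1.0), "loudness": (-60.0, 0.0)}

def timbral_distance(a, b):
    """Euclidean distance between range-normalized feature vectors --
    one simple way to realize a 'timbral distance' between frames."""
    total = 0.0
    for name, (lo, hi) in RANGES.items():
        na = (a[name] - lo) / (hi - lo)
        nb = (b[name] - lo) / (hi - lo)
        total += (na - nb) ** 2
    return math.sqrt(total)

print(timbral_distance(frame_a, frame_b))
```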

Researcher: Kai Siedenburg
Paper: An Exploration of Real-Time Visualizations of Musical Timbre

Visualizing emotion in lyrics

September 11, 2009

[Image: emotionLyrics slide by Joris Klerkx showing the six emotion visualizations and two prototype screenshots]

Joris Klerkx has built a visualizer of the emotions in lyrics. Joris has integrated a karaoke player and Synesketch, a framework for visualizing the 6 basic emotions defined by Ekman (happiness, anger, fear, surprise, sadness, disgust). The player takes a song, plays it, and with each line of text that plays in the lyrics, the strongest emotion of that line is visualized. In the image above, on the left-hand side, you’ll see the 6 emotions and their visualizations. On the right-hand side are two screenshots of demos of the prototype.
Some video of the player in action:

  • Thriller by Michael Jackson: the emotions fear, anger, sadness & disgust are clearly visible near the end.
  • Shiny Happy People by REM: pretty happy.
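
The per-line logic is easy to picture: score each lyric line against the six emotions and visualize the winner. A toy Python sketch, with an invented keyword lexicon standing in for Synesketch’s much richer lexical analysis:

```python
# Toy sketch of the player's per-line step: pick the strongest Ekman
# emotion for each lyric line. The lexicon below is invented purely
# for illustration; Synesketch's actual analysis is far richer.
LEXICON = {
    "happiness": {"shiny", "happy", "smile", "love"},
    "anger": {"hate", "rage", "fight"},
    "fear": {"thriller", "dark", "scared", "night"},
    "surprise": {"sudden", "wow"},
    "sadness": {"cry", "tears", "alone"},
    "disgust": {"rotten", "foul"},
}

def strongest_emotion(line):
    """Return the emotion whose keywords match the line most often."""
    words = set(line.lower().split())
    scores = {emo: len(words & keys) for emo, keys in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None  # None: nothing detected

for lyric in ["Shiny happy people holding hands",
              "Cause this is thriller, thriller night"]:
    print(lyric, "->", strongest_emotion(lyric))
```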

Joris points out that it can be interesting to see how the visualizations contrast with how the song sounds, since the emotion and mood of a song’s lyrics often differ from the feel of the music itself.

Creator: Joris Klerkx

Zune – MixView

September 10, 2009


Zune offers a rather rich music browsing experience on the web, showing all sorts of artist info including songs, videos, bios, news, reviews and artist popularity data. One rather nifty tool is their MixView. When browsing an artist you can click on MixView to display a variety of information related to the seed artist in various-sized boxes. Each box is clickable, which brings the related item into focus, and in turn a new set of related boxes appears. Additionally, each box has other actions, such as “play” and “learn more” depending on the view, that allow the user to jump to different places in the Zune Marketplace. I like how MixView combines different types of information in one view: related artists, artist influences, artist albums, related albums and so on. It is a well done browser – and one of the first that I’ve seen implemented in Silverlight.
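
The interaction boils down to a small focus-and-expand loop over a relation graph. A minimal Python sketch, with a hypothetical hard-coded relation table standing in for Zune’s actual artist-relation data:

```python
# Stand-in relation data; Zune's real related-item sources (influences,
# albums, similar artists) are not public, so this table is invented.
RELATED = {
    "Artist A": ["Artist B", "Artist C", "Album X"],
    "Artist B": ["Artist A", "Artist D"],
    "Artist C": ["Artist A", "Album Y"],
}

class MixView:
    """Focus-and-expand browsing: a seed item surrounded by related boxes."""

    def __init__(self, seed):
        self.focus = seed

    def boxes(self):
        """Items currently shown around the focused item."""
        return RELATED.get(self.focus, [])

    def click(self, item):
        """Bring a related item into focus; a new set of boxes appears."""
        self.focus = item
        return self.boxes()

view = MixView("Artist A")
print(view.focus, "->", view.boxes())
print("click Artist B ->", view.click("Artist B"))
```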

This quick video shows off MixView.

Creator:

  • The Microsoft Zune Team

Submitted by Tom Butcher


MarGrid – using self-organizing maps to organize music

September 9, 2009


MarGrid is a visualization that uses Self-Organizing Maps to organize music collections into a two-dimensional grid based on music similarity. On the MarGrid website you can find a Flash-based interface that lets you explore a 1,000-song music collection.
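
For the curious, here is a bare-bones Python sketch of how a Self-Organizing Map places songs with similar feature vectors into nearby grid cells. The training schedule and square neighborhood are simplifications of my own; MarGrid’s actual implementation is built on Marsyas:

```python
import random

def dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def train_som(features, width, height, epochs=20):
    """Train a tiny Self-Organizing Map on song feature vectors.

    Similar songs end up in nearby grid cells. Real systems use
    smoother neighborhood and learning-rate decay; this is bare-bones.
    """
    dim = len(features[0])
    grid = [[[random.random() for _ in range(dim)]
             for _ in range(width)] for _ in range(height)]

    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)  # decaying learning rate
        radius = max(1, round((width // 2) * (1 - epoch / epochs)))
        for vec in features:
            # Best-matching unit: the cell whose weights are closest.
            by, bx = min(((y, x) for y in range(height) for x in range(width)),
                         key=lambda p: dist(grid[p[0]][p[1]], vec))
            # Pull the BMU and its square neighborhood toward the sample.
            for y in range(max(0, by - radius), min(height, by + radius + 1)):
                for x in range(max(0, bx - radius), min(width, bx + radius + 1)):
                    cell = grid[y][x]
                    for i in range(dim):
                        cell[i] += lr * (vec[i] - cell[i])
    return grid

def place(song_vec, grid):
    """Map a song's feature vector to its (row, col) grid cell."""
    h, w = len(grid), len(grid[0])
    return min(((y, x) for y in range(h) for x in range(w)),
               key=lambda p: dist(grid[p[0]][p[1]], song_vec))

songs = [[random.random() for _ in range(4)] for _ in range(50)]  # fake features
som = train_som(songs, width=6, height=6)
print(place(songs[0], som))
```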


The MarGrid interface is incorporated into AudioScapes, a framework for prototyping and exploring how touch-based and gestural controllers can be used with state-of-the-art content- and context-aware visualizations. AudioScapes provides well-defined interfaces and conventions for a variety of different audio collections, controllers and visualization methods so that they can be easily combined to create innovative ways of interacting with large audio collections.

Here’s an AudioScapes video that shows MarGrid in an iPhone app designed to help people with disabilities navigate through their personal collections. There are more videos worth watching on the AudioScapes site.

Creator:

MarGrid and AudioScapes are projects built by researchers Steven Ness and George Tzanetakis at the University of Victoria. They are built using the venerable Marsyas audio framework.

More Info: